Book Review of A Dangerous Master
Technologies such as 3D printing, synthetic biology, genomics, nanotechnology, and artificial intelligence can pose risks to humanity and the environment. That is the central message of A Dangerous Master: How to Keep Technology from Slipping Beyond Our Control (Basic Books, 2015). In this volume, Wendell Wallach, a scholar with Yale University's Interdisciplinary Center for Bioethics, argues that we need a more informed process for the development and control of these technologies.
That science and technology could lead to different futures, some desirable, others less so, is not groundbreaking news. However, it is in everyone's best interest to understand the full range of scenarios (possible, probable, and preferred) that could emerge from these technologies.
In Dr. Wallach's view, this requires weighing the risks of technologies, not simply the benefits. It further involves managing the pace at which technologies develop. At one extreme, we could have a Wild West of unrestrained development. At the other, citizens may elect to impose a moratorium on using a technology.
In the real world, society has typically chosen one of two paths for risk assessment. The European Union applies the precautionary principle, which places the burden on companies to prove that their technology is safe.
In contrast, the United States applies the "proactionary" principle, which places the burden on activists to prove that a technology is harmful and balances any drawbacks against the expected benefits. For example, the President's Bioethics Commission chose a course of "prudent vigilance," rather than new regulations or a moratorium, for the development of synthetic biology.
Wallach believes that society can manage risks more efficiently by recognizing inflection points where a change of course is possible.
Rather than accepting technological determinism, like those who believe in the coming "Singularity," he advocates solutions molded by societal values, adjusting our course as needed.
Wallach believes the president's commission missed an opportunity to regulate synthetic biology and that the US government has failed to provide effective oversight over the development of nanotechnology.
However, the author's argument assumes that more caution, delaying the benefits in favor of more analysis and testing, will achieve superior results. This is not necessarily true.
Witness, for example, the pharmaceutical industry. It relies on extensive clinical trials to demonstrate the safety and efficacy of new drugs, a practice so costly that it would prohibit the development of most other commercial products. In spite of this process, a study reported in the Journal of the American Medical Association reveals that roughly 106,000 people die each year in American hospitals from the side effects of medication. We would never suggest sending untested drugs into the marketplace. Yet it is clear that caution does not guarantee the safety Dr. Wallach seeks.
Wallach asks whether humanity as a whole is intelligent enough to decide how to regulate technology. Even if we do regulate it, do we make rational choices using these regulations?
As a technology analyst, I believe many people are holding the development of nanotechnology and synthetic biology to a different standard than other technologies.
Like genetic engineering, the fields of nanotechnology and synthetic biology are largely self-governed by experts in these disciplines. Fatalities in any of them are virtually unknown. Meanwhile, automobiles and pharmaceuticals are heavily regulated, but human fatalities are common and tolerated in those industries.
The automobile industry and Americans have a social contract based on cost-benefit analysis. According to the National Highway Traffic Safety Administration, over the last twenty years Americans have accepted over 40,000 traffic fatalities annually in return for a convenience that is ingrained in our lifestyle.
As Wallach speculates in his first book, Moral Machines: Teaching Robots Right From Wrong, "Would people have stopped the development of cars? Probably not. Most people believe that the advantages of cars outweigh the destructive potential."
© 2013-2017 TechCast Global Inc