Emerging Technologies, Governance of Non-Knowledge
On June 23, 2011, the American-German Institute (AGI) hosted a seminar with DAAD/AGI Fellow Dr. Sascha Dickel on “Emerging Technologies, the Governance of Non-Knowledge, and the Return of Politics: The Case of Synthetic Biology Regulation.” The seminar was generously supported by the German Academic Exchange Service (DAAD).
In this seminar, Dr. Sascha Dickel expressed the sentiment that inventions stemming from the fields of nano-, bio-, information, and neuro-technology will present sizeable challenges to society, culture, and politics. Therefore, the ethics commissions that have been formed in countries around the world over the past two decades will play a major role in advising governments of the potential benefits and dangers of new technologies. The course of action taken by these commissions in the United States and Europe with regard to advising on issues of emerging technologies was the focus of Dr. Dickel’s presentation.
The main function of these ethics commissions is to inform policymakers and decision-makers about the ethical issues that could arise from new scientific developments. While each commission serves only as an advisory board to the government it works under, it is difficult for policymakers to ignore a commission's findings. For this reason, governments tend to appoint boards whose values resemble those of the leaders in power. Such a precaution tends to spell the end of an ethics council when a new administration takes office, as was the case under Presidents George W. Bush and Barack Obama.
Following a December 16, 2010, report by President Obama's Presidential Commission for the Study of Bioethical Issues regarding the creation of the first synthetic life form, critics began to claim that the commission's approach was too laissez-faire. The report, it was argued, did not adequately address the possible dangers of emerging technologies and could simply give Washington license to approve new technologies as they came along. The report also sparked a debate over what truly constitutes responsible governance of emerging technologies.
According to Dr. Dickel, the attention paid to synthetic biology is nothing new, and in fact various discoveries across the fields of emerging technologies have gone through cycles of hopes, fears, and disappointments for some time. It is synthetic biology, however, that appears to be the new hope for emerging technologies, as advances in this field can build upon previous mistakes and function exactly as they are intended to. These new discoveries can lead to cleaner energy, better medical products, and even improved agricultural production. On the other hand, advances in synthetic biology may also be more difficult to control, as their ability to evolve leads to new concerns.
As with any new discovery, there are certain known inherent benefits and dangers that could arise, as well as those that can be expected but not necessarily specified. However, synthetic biology may pose an even greater risk: hazards that cannot even be speculated about but could emerge as research evolves. It is in this last area, also known as the “unknown unknowns,” where the real challenges lie in regulating emerging technologies, and where scientific data alone will not be sufficient.
One generally accepted approach to managing this uncertainty is to follow the "Precautionary Principle," which stems from the German Vorsorgeprinzip. According to this principle, if an action such as scientific research is expected to create a danger to society, but no scientific consensus about this danger exists, then those carrying out the action must prove that it poses no hazards. This principle is widely used in EU policy and is central to the approach of the European Group on Ethics in Science and New Technologies (EGE) to synthetic biology. While the EGE holds that the long-term effects of an emerging technology must be thoroughly examined, in line with the precautionary principle, before its commercial release, many argue that this method imposes unnecessary constraints on scientific progress. Critics claim that while unexpected dangers may be present, many life-changing inventions, like the airplane or modern antibiotics, would never have passed such scrutiny. In this sense, the precautionary principle is itself a danger to society, as it may block the creation of a useful advancement in medicine or agriculture.
In response to such criticism of the precautionary principle, advocates of emerging technologies argue for a "pro-action" approach, one in which the burden of proof lies with those seeking restrictive measures. According to the Obama administration's commission, a more evolutionary method of analyzing emerging technologies should be the focus. Under this system, scientists would be required to obtain some form of ethics education so that the scientific community could be more self-regulating, i.e., subject to less influence from policymakers on emerging technologies. An independent institution, modeled on FactCheck.org, could be created as a forum for members of the scientific community and the general public to discuss new scientific claims.
In the end, it seems that the EU and U.S. approaches to emerging technologies simply rest on different interpretations of the precautionary principle. While the EGE analyzes new advances within the framework of the precautionary principle, the U.S. rejects such an "extreme precautionary approach." However, the differences between the U.S. and EGE approaches are not as great as they might appear, as the EGE's method is not the "zero risk" stance the U.S. makes it out to be. Interestingly, Germany's national ethics council will hold a discussion on synthetic biology in November of this year, where we may see yet another interpretation of the precautionary principle. In any case, such diverging methods of examining emerging technologies show how the world is increasingly becoming a laboratory in which many still feel uneasy about the future.