from the survival-matters dept.
Chris Phoenix's essay "Can we have "some" regulation of nanotech?" has generated a lot of good discussion. Since the original post has now slipped off the front page of Nanodot, this post is made to encourage continued discussion. Click here to go to the discussion, or on Read More below for an overview of the discussion so far.
All the participants deserve congratulations for their thoughtful, courteous and constructive posts on one of the major issues in developing nanotech. Here is a brief overview of the discussion so far. (Recent comments are in red.)
Chris opened the discussion by asking what form regulation of nanotech might take. He pointed out the difficulty of successful regulation, but suggested that "temporary regulation might do more good than harm." Unregulated nanotech risks arms races and other dangers. He asked if there is a middle path between total regulation and total freedom. He then proposed several starting assumptions for discussion, concerning individuals, organizations, black markets, lead time in technology, nanoweapons, scarcity and moral dilemmas.
I suggested that regulating peaceful use of nanotech seems feasible but that finding a synthesis of strategies for "Avoiding Arms Races" and nanowar posed major challenges.
In "Development Paths" Robert Bradbury stressed evolution and optimization of moral actors in the long term perspective, suggesting the debate could be rephrased to ask "Are there principles that guide the substitution of self-directed (intelligence supporting/moral?) evolution for random (amoral?) evolution of MNT?" Chris replied that he disagreed with "nearly everything" in Robert's post, noting the importance of emergent behaviors in groups, memes, and how people do not always act as perfectly rational agents. Mark Gubrud questioned the definition of "human" being used, "sub-optimality" and trying to derive a target for evolution from "limits imposed by physics." Robert clarified his concerns about values in resource allocation and decisions about deleting information.
Louis Kammerer asked if assembler regulation was possible, since "All it would take is one person to give the world access." jbash said that "I think there's no question that … there will be some regulation," but that regulation might be incoherent, competing and not very likely to succeed.
Robert argued that regulation is impossible since "Verification requires complete disassembly!" Given miniaturization and how "information begets hardware," the solutions must be found in robust defenses and education. Butler predicted that "Regulation" is inevitable though effective regulation is unlikely: laws will be passed but enforced selectively.
Chris Weider noted that "Threats *must* be at least conceptualized before defenses can be created … and b) education will take a much longer time to eliminate threats than almost everyone thinking about this space thinks we have." Robert replied that education needs to support the paradigm shift to understand that "everyone can get rich" and the necessity of defensive systems.
Chris Phoenix detailed how short-term regulation may be better by allowing time to prepare defense, education, surveillance, social systems and "everything else that we will need to survive long-term." Robert replied at length about clarifying time frames, being specific about what and whom to regulate, open source international development of defenses, and the costs of regulation.
Mark discussed how even imperfect regulation can avoid arms races and stressed the need to distinguish between "symmetric" and "asymmetric" threats. Chris agreed on the importance of distinguishing between military and rogue threats and other points, and raised a technical point about detecting inclusions. Mark challenged the feasibility of universal surveillance.
Kadamose said it doesn't matter, "Because no matter what people do to try and 'regulate' the technology, there will always be 'leaks' and 'gaps'" and "it will only take one 'pirate' to get his/her hands on an assembler to change the entire world overnight."
Mark commented that most products are already regulated, that relying on a technological lead is risky, and that abundance offers opportunities to reduce incentives for criminality and change subjective ideas about "scarcity." Robert discussed in "Balancing regulation" why the difficulties in controlling nanotech mean that education deserves more emphasis than regulation. Chris clarified his ideas and questions on black markets, fairness, sacred sites and technological lead, noted his new comment in the Jane Jacobs discussion, and invited further comment on the specific points in his essay. Chris reiterated that regulation could "buy time" and asked how education could deal with corporations and with "non-rational problems" such as religion.
And here for your reading and comments is a link to the essay that begins the discussion: "Can we have "some" regulation of nanotech?"
Both Chris Phoenix and I plan to post future Nanodot articles on these issues. Further submissions are welcome.