Can we have "some" regulation of nanotech?

from the strategies-for-survival dept.
ChrisPhoenix writes "Human societies have felt the need to regulate, or try to regulate, many different kinds of technologies. All of these technologies have been far less powerful than a mature nanotechnology. Is regulation of nanotech a good idea? If so, what form could it take? If not, is it preventable? Is limited, effective regulation a possibility?"

Read more for the rest of Chris's essay and invitation to discussion.

Attempts to regulate technology have usually been unsuccessful in the long term. China couldn't keep out opium, Japan eventually modernized when Perry made a persuasive case, the U.S. war on drugs has probably only increased the market and has certainly strengthened the criminal element, and India and Pakistan both recently detonated nuclear weapons. On the other hand, the accelerating pace of technological change means that any specific regulation will be obsolete within a decade, so it is possible that a temporary regulation might do more good than harm.

Completely unregulated nanotech would pose several severe problems. It could easily lead to an uncontrolled and unstable arms race, as argued by Mark Gubrud here. It would increase the personal availability of currently regulated technologies, including drugs, weapons, and surveillance. It might well destabilize economies, possibly destroying the basis of international trade. And it would translate all the problems of the computer revolution, such as crackers, viruses, and IP conflicts, into the physical realm. (If your web site is hacked, you're dead. If you go outdoors without protective clothing, any script kiddie can kill you, either deliberately or at random. How many people today go on-line with Windows and no firewall?) Attempts to deal with these problems piecemeal would likely fail in destructive ways, but the attempts would certainly be made. This scenario seems undesirable.

A total regulation of nanotech would probably require the forcible destruction of the world's technology base, to prevent independent development efforts, or else pervasive and invasive surveillance of all humans and all activities, with special emphasis on AI-capable systems. Technology would be doled out from a central source, and all humans would live under an absolute dictatorship. This option also seems undesirable.

If total freedom and total regulation are both undesirable, is there any middle path that is preferable and plausible? Total safety is probably unrealistic, but is there a way to defang the run-of-the-mill script kiddies, virus writers, Mafia hitmen, terrorists, and suicidal-destructive teenagers without applying enough regulation to create a black market, a revolt, or a totalitarian nightmare?

I propose several starting assumptions.
1) Most humans will not take sensible individual precautions, due to complacency, lack of information, lack of resources, etc. So we can't count on individual acts to mitigate the danger to individuals.

2) Most organizations have only limited responsibility, often more limited than we'd like to think. Governments only care about their first-class citizens (and indirectly, their second-class citizens). Corporations only care about their stockholders (and indirectly, their future potential customers). Idealists frequently don't care about anyone at all, preferring to follow their ideal and hang the consequences.

3) Most organizations will try to grow beyond the limits of what is best for people as a whole.

4) A black market will provide a strong incentive for the development of unregulated nanotech.

5) A one-year lead in technology, post-assembler, is enough to make the laggards mostly harmless. I'm not sure of this one, since it probably depends not only on R&D but also on deployment, and deployment could be limited for several reasons including political, economic, and ideological.

6) Nanotech will allow the construction of a wide variety of weapons that an unmodified human cannot effectively defend against.

7) At least on and near this planet, some resources (land, heat pollution credits, open/untouched space, sacred sites) will continue to be scarce.

8) Nanotech will provide new choices and bring many existing choices into sharper focus, thus fueling moral dilemmas and profoundly challenging intolerant moral systems.

I have some ideas for an organization and system that can hope to deal with all of these constraints, but I want to hear other opinions first. In thinking about these problems, I have found Jane Jacobs' work (as extended by Pat Gratton and Tom McKendree here) to be very helpful. Note that regulation of nanotech will be the topic of the Nanoschmooze tomorrow.

Chris"

The Foresight Guidelines on Molecular Nanotechnology provide one reference for considering how to deal with these challenges. Engines of Creation earlier reviewed strategies for dealing with dangers from nanotech.
