MIT's Dertouzos replies to Bill Joy

from the relinquishment-regarded-as-harmful dept.
MIT computer scientist Michael Dertouzos responds to Bill Joy in Technology Review, on the call for relinquishment. Excerpts: I don't buy it…So limited is our ability to assess consequences that it's not even helped by hindsight: On balance, are cars a good or bad thing for society?…We are unable to judge whether something we invented more than 50 years ago is good or bad for us today. Yet Joy wants us to make these judgments prospectively, to determine which technologies we should forgo!…Just because chips and machines are getting faster doesn't mean they'll get smarter, let alone lead to self-replication…Should we stop computer science and AI research in the belief that intelligent machines someday will reproduce themselves and surpass us? I say no. We should wait to find out whether the potential dangers are supported by more than our imagination…We shouldn't forget that what we do as human beings is part of nature.

Ethical systems: Guardian, Commercial, Idealist

from the keeping-track-of-our-biases dept.
Senior Associate PatGratton writes "On a sociological/ethical note… Around the time of the Fall Foresight Gathering, I was reading Jane Jacobs' Systems of Survival and came up with some interesting applications of her ideas to people interested in transformational technologies….When I tried to apply Jacobs' categories to the attendees of the Foresight Gathering, I quickly reached two conclusions: 1) there are virtually no Guardians present within the Foresight membership, and 2) Jacobs missed a syndrome….I contend that the Foresight community is split between Idealists and Traders, and that this leads to a certain amount of unavoidable conflict….Because Guardians are underrepresented within Foresight, Foresight discussions are likely to be strongly biased towards Commercial and Idealist views and solutions. More importantly, we're likely to fail to address or to take seriously concerns that would come naturally to a Guardian. This in turn implies that we're likely to be underprepared when we take our ideas/solutions to the general public…" Read More for Pat's full post.

Jaron Lanier Takes On "Cybernetic Totalists"

from the una-bummer dept.
SaiyajinTrunks writes "Jaron Lanier has made available what he calls 'One Half of a Manifesto' on the online publication, Edge. It's fourteen courses of good food for thought with a dessert of reactions from big names in the field." (Additional discussion of Jaron's hemi-festo can be found on Slashdot.)

[I might uncharitably summarize Jaron's argument as "I am right and virtuous, and you are all evil and deluded. Q.E.D." Years ago Jaron told me my "extropianism" was "contaminated by compassion and humility". He still seems to believe that posthumanists must therefore hate humanity. — dk]

Critique of Josh Hall's 'Ethics for Machines'

from the major-disagreement dept.
Senior Associate Peter Voss writes "Josh Hall's Ethics for Machines suffers many of the problems endemic to moral debate: vague and shifting definitions, confusion over 'duty', rejecting the possibility of a rationally derived morality, and confusing description and prescription. Specifically, it fails to clearly define, or justify, its implied meta-ethical goal of 'group dynamism'. Other core problems are: its mischaracterization of 'ethical instinct', its condemnation of self-interest and common sense, and its failure to recognize the importance of high-level intelligence and consciousness to morality. Ethics for Transhumans addresses these points, and sketches an alternative path."

Alarms about Techno-Utopianism

from the utopian-dystopian-or-atopian? dept.
Senior Associate BryanBruns writes "Reason magazine has a story "Dystopian Fearmongers Strike Again" criticizing the "TechnoUtopian" advertisement recently run in the New York Times. (The ad is available online: technoad.pdf.) The advertisement has three paragraphs on nanotechnology, with reasonably accurate content, using nanotech as another example of technological optimism. The section on nanotech finishes by saying "[Bill] Joy has grave doubts about proceeding, citing dangers from escaping self-replicating nanomachines, and from military applications. (There are also terribly frightening surveillance and privacy concerns.) So far, Joy is one of the few major scientists to be openly critical."" Read More for details and analysis.

MIT psychologist vs. frightening predictions

from the to-tell-or-not-to-tell dept.
Prominent MIT psychologist Steven Pinker predicts in Technology Review: How far can this revolution in the human condition go? Will the world of 3000 be as unthinkable to us today as the world of 2000 would have been to our forebears a millennium ago?…The future, I suggest, will not be unrecognizably exotic because across all the dizzying changes that shaped the present and will shape the future one element remains constant: human nature…It is also far from certain that we will redesign human nature through genetic engineering. People are repulsed by genetically modified soybeans, let alone babies, and the risks and reservations surrounding germ-line engineering of the human brain may consign it to the fate of the nuclear-powered vacuum cleaner…Third-millennium futurologists should realize that their fantasies are scaring people to death. The preposterous world in which we interact only in cyberspace, choose the endings of our novels, merge with our computers and design our children from a catalogue gives people the creeps and turns them off to the genuine promise of technological progress.

Yes on brain repair & self-repair, no on AI?

from the vision:-two-yes,-one-no dept.
from the New Scientist's Next Generation Symposium site: "Welcome to the future…it's getting seriously strange out there as we head for the millennium. Below, 24 young scientists working at the cutting edge bring you their thoughts and predictions. Check them out before you take your journey into the future…" Included:
Brain Repair in the 21st Century: "How much of the brain can be replaced before you require a new passport?"
Soft-condensed matter: "we could design desirable structures without actually having to build them, and if you break them, they will 'repair' themselves…we'd really like to have systems that completely self-assemble and produce hard bits and soft bits and valves and pistons and all the necessary things we need to make nanomachines, all from exploiting the properties of soft matter."
AI is possible, but AI won't happen: "there is no obvious way of getting from here to there–to human-level intelligence from the rather useless robots and brittle software programs that we have nowadays." [Yes, okay, it's not obvious.]

"New Economy"= early sign of coming Singularity?

from the dismal-scientists-boggled dept.
From a widely published Boston Globe story: "Greenspan's comments…indicated his strengthening conviction that a stunning surge in the productivity of U.S. workers will persist…ensuring that the longest-running expansion in U.S. history has no end in sight…[his] main point was about the world economy and the profound impact of technology and globalization…Economists said such strong gains in productivity are unusual, if not unprecedented." He compared the current situation to the railroads, which "helped elevate economic growth for a considerable period of time. But the pace of growth eventually slowed when full or near-full exploitation of the newer technologies was achieved." But will such slowdowns between waves of new technology disappear as those technologies arrive with increasing frequency?

Bill Joy on powerful technologies & individual liberties

from the "Bill:-Foresight-IS-having-this-discussion" dept.
Foresight Advisor and Senior Associate RalphMerkle brings to our attention the latest from Bill Joy. An excerpt: "So these technologies are that powerful that they really threaten our individual liberties, if other people have so much power. And that's a discussion we have to have, and we have never really had that discussion in the context of power to individuals. Certainly, the nation-states have had the power to destroy civilization, but to give individuals that power or to threaten our liberties is really an unprecedented situation that we face in this century." OK, he has a point here. Not a new point, but an important one.

EoC 2000: Most important changes since 1986?

from the trying-to-figure-out-what's-going-on dept.
BryanBruns writes "In connection with the Engines of Creation 2000 project, it would be interesting to discuss what seem to be the most important changes to consider for revising Engines of Creation, and more generally for formulating scenarios and strategies to "prepare society for advanced technologies." Below is a short list, which might stimulate discussion about the most important changes to consider, and their implications:
End of the cold war: democratization, capitalist globalization, China joining WTO…
Weak and poorly deliberated policies for science and technology: OTA abolished. No science courts. Media focus on risks frames discussion of environment, nuclear, biotech and other technologies.
Silicon Valley rules: network economy, web, internet time, open source, etc.
No big breakthroughs yet in AI: IT industry investing heavily in nanoscale technologies to follow Moore's law, biotech advancing rapidly, suggesting nanotech likely before artificial intelligence".
Read More for implications.
