D'Souza: Tech progress can bring moral progress

from the both-gains-and-dangers dept.
Foresight director Jim Bennett brings to our attention this item from Red Herring by Dinesh D'Souza on whether technology can further traditional human values: "The critics focus on the moral dangers of technology. Those dangers, of technological hubris and undermining human dignity, do exist, and we should debate them. But what the critics miss is the possibility of moral gains. Used correctly, technology can generate moral progress by strengthening and affirming our highest values, as we have seen it do in the past. Technology doesn't just offer us the chance to be better off; it offers us the chance to make a better society." His examples are the ending of slavery, the emancipation of women, and the extension of the human lifespan.

International Law vs. Human Cloning

from the What-would-Jerry-Lewis-Say? dept.
ChrisPhoenix writes "It seems that France doesn't like human cloning at all. Some prominent people over there are even calling for making international law and/or establishing an international court for bioethical violations that would be able to take action even if the 'violation' were not illegal in the jurisdiction where it was done.

On one hand, those who think that nanotech needs at least a little worldwide regulation can hope that science controversies may spur the creation of regulatory bodies.

On the other hand, those who worry about luddites blocking important technologies now have more to worry about.

The story appeared in "French Advances in Science and Technology".
This link will hopefully get you to the right issue (#297).

Chris"

On ethics: engineers compared to programmers

from the is-there-a-programmers-code-of-ethics dept.
An engineer points out that the engineering professions have codes of ethics that guide them in designing for safety and in communicating with the public, and that these are not being followed by some programmers: "It seems to me that as a nanomechanical engineer (or rather a mechanical engineer with an interest in nanomechanics), I was permanently 'conditioned' to always think of the ethics behind a design by 'upping the safety factor' on my design. If you are dealing daily with deadly machines (or rather machines that can cause harm to the public), then as an engineer you are obligated to design something that has a higher safety factor than someone who designs air conditioning systems. This fact, or rather a standard rule of engineering practice, has been overlooked by Joy and his counterparts…My point is this: as a general rule, engineers create highly reliable systems that must have a certain safety factor included whenever there is the slightest chance of harm to the public. I, or rather we, as engineers (civil, mechanical, (high power) electrical), are always worried about the safety factors we have set on a design…I think I might be oversimplifying the issue, but ultimately I have a hard time understanding why so many programmers are a) claiming to have a solid understanding of safety factors when their jobs involve only a few instances where the life or death of the public is at stake, and b) if they follow the same code of ethics set forth by the state governmental systems, why they are not lumped together with the rest of the classically trained engineers. One other thing: if they are, then didn't they see the part in the ethics section about slandering, or speaking out to the public on something in which they have no educated knowledge?… I would love to hear what the software 'engineers' have to say about this particular essay, in order for me to learn more about their trade." Read More for the full post.

Engineers seen as unable to make moral decisions

from the who-else-is-even-paying-attention? dept.
From a Newsweek article on MSNBC on the coming age of cyborgs: "Who, then, can speak on moral issues? Certainly not the engineers. Ellen Ullman, a former computer programmer and the author of the 1997 book Close to the Machine: Technology and Its Discontents, says that 'the problem is not the technology, which in any event can't be stopped. The problem is that engineers are making decisions for the rest of us.' Programmers are hired guns, she says, and rarely understand in a nuanced way their clients' actual work. They are, she says, the last people 'to understand what is an acceptable risk.' " CP: In Foresight's experience, programmers and engineers are far more attentive to ethical issues in technology than members of other professions.
