Making computers more error prone could make them much faster and more powerful

Joseph Bates, a longtime Foresight member and computer scientist at Carnegie Mellon and the MIT Media Lab, was described in the “Innovator” feature of the January 31–February 6, 2011 issue of Bloomberg Businessweek (unfortunately I could not find this article on the web) as predicting that a computer that skipped some of the processing needed for precisely accurate results “would have something like 100,000 times the computing power of a traditional processor.” The appeal of the approach is that an error of about one percent would be small or even imperceptible in many applications, while the chip would be enormously faster at processing databases: it could quickly shrink huge lists of possibilities to small lists that “traditional Intel-style chips” could then check to produce final, precisely accurate results. Bates is quoted as saying that several companies are considering the technology. Bates became interested in A.I. as a boy by reading Isaac Asimov, and believes that the chip he has designed will help computers act more like the human brain by taking shortcuts to “guesstimate” answers. “‘By allowing things to be approximate, you’re a lot closer’ to achieving true artificial intelligence, says Bates.”
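The filter-then-verify pattern described above can be sketched in software. This is only a minimal illustration of the idea, not Bates’s chip design: a cheap, low-precision first pass (here, comparing coarsely rounded values, standing in for approximate hardware) shrinks a large candidate list, and an exact second pass verifies the few survivors. The function name and tolerances are hypothetical.

```python
def approx_then_exact_search(values, target, tol):
    """Find every value within tol of target using two passes:
    a fast approximate filter, then an exact verification."""
    # Approximate pass: compare values rounded to one decimal place.
    # Rounding introduces at most 0.05 of error per operand, so the
    # extra 0.1 of slack guarantees no true match is discarded.
    coarse_target = round(target, 1)
    candidates = [v for v in values
                  if abs(round(v, 1) - coarse_target) <= tol + 0.1]
    # Exact pass: precise check on the (much smaller) candidate list.
    return [v for v in candidates if abs(v - target) <= tol]

values = [0.0, 1.04, 2.5, 1.1, 3.0]
print(approx_then_exact_search(values, 1.0, 0.05))  # [1.04]
```

Note that 1.1 survives the approximate pass but is rejected by the exact pass, which is precisely the division of labor the article describes: the approximate stage may admit false positives, and the exact stage produces the final, precisely accurate result.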
