Associative memories

AI researchers in the 80s ran into a problem: the more their systems knew, the slower they ran. Yet we know that people who learn more tend to get faster (and better in other ways) at whatever it is they're doing. The solution, of course, is: duh, the brain doesn't work like a von Neumann… Continue reading Associative memories
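A toy illustration of the slowdown the excerpt describes (my own sketch, not from the post): a serial scan over stored facts gets slower as the knowledge base grows, while a content-addressed (associative) lookup, here stood in for by a hash table, stays roughly constant.

```python
# Hypothetical knowledge base: 100,000 key/value "facts".
facts = [("fact%d" % i, i) for i in range(100_000)]

def scan_lookup(key):
    # von-Neumann-style serial search: cost grows with the number of facts
    for k, v in facts:
        if k == key:
            return v
    return None

# Hash table as a stand-in for associative (content-addressed) memory.
assoc = dict(facts)

def assoc_lookup(key):
    # content-addressed: cost roughly independent of table size
    return assoc.get(key)
```

Both return the same answers; only the scaling differs, which is the point of the excerpt.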

Baytubes

Bayer (the same company that makes aspirin) is now beginning to manufacture multi-walled carbon nanotubes in industrial quantities. The pilot plant will produce 200 tons per year, and the market is expected to grow at 25% per year. The MWCNTs are for materials use, meaning mostly fiber-reinforced composites, e.g. airplanes, tennis racquets, arrows, and… Continue reading Baytubes

Learning and search

So we will take it as given, or at least observed in some cases and reasonably likely in general, that AI can, at the current state of the programming art, handle any particular well-specified task, given enough (human) programming effort aimed at that one task. We can be a bit more specific about what “well-specified”… Continue reading Learning and search

Steam balloons

The brothers Montgolfier invented the hot air balloon upon the observation that smoke rises, and thus they figured that if they could catch it in a bag, the bag would be pulled upward. Hot air ballooning is quite popular today; people think of balloons as being quaint and pretty and natural, or at least more… Continue reading Steam balloons

Gada Prize update

We’ve had a fair amount of interest in the Kartik M. Gada Humanitarian Innovation Prizes, mostly from RepRap types. They pointed out that we had a slight incompatibility in the specification of the open source requirements with those of the RepRap community itself. We’ve changed the requirements to allow either BSD or GPL. To make… Continue reading Gada Prize update

The Sigil of Scoteia

At the Foresight conference special-interest lunch on IQ tests for AI, Monica Anderson suggested a test involving separating text, from which spaces and punctuation had been removed, back into words. As a somewhat whimsical version of the test, I suggested the Sigil of Scoteia: In case you're unfamiliar with it, it's the frontispiece of the novel… Continue reading The Sigil of Scoteia
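The word-separation task in the excerpt can be done mechanically when a dictionary is available. A minimal sketch using dynamic programming, with a hypothetical word list and input of my own (the post does not specify either):

```python
# Hypothetical dictionary for illustration only.
WORDS = {"the", "sigil", "of", "scoteia", "is", "a", "puzzle"}

def segment(text, words=WORDS):
    """Split a lowercase string with no spaces back into dictionary words.

    best[i] holds a segmentation of text[:i], or None if none exists.
    Returns a list of words, or None if the text cannot be segmented.
    """
    n = len(text)
    best = [None] * (n + 1)
    best[0] = []
    for i in range(1, n + 1):
        for j in range(i):
            if best[j] is not None and text[j:i] in words:
                best[i] = best[j] + [text[j:i]]
                break
    return best[n]

print(segment("thesigilofscoteia"))  # ['the', 'sigil', 'of', 'scoteia']
```

Of course the interesting part of the proposed AI test is doing this without a clean dictionary, from statistics of the language alone; this sketch only shows the easy, well-specified version.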

AI: how close are we?

In the terminology I introduced in Beyond AI, all the AI we have right now is distinctly hypohuman: The overall question we are considering, "is AI possible?", can be summed up essentially as "is diahuman AI possible?" The range of things humans can do, done as flexibly as humans can do them, and learned the… Continue reading AI: how close are we?

A brief history of AI

40s: Cybernetics, the notion that the brain did logic in circuits, feedback
50s: the computer, stored programs, Logic Theorist
60s: LISP, semantic nets, GOFAI
70s: SHRDLU, AM
80s: AI winter, expert systems, neural nets
90s: robots, machine learning
00s: DARPA Grand Challenge level of competence

The main point of this post is to answer any objections… Continue reading A brief history of AI

Is AI really possible?

I'm about to start a series of posts on the topic of why I think AI is actually possible. I realize that most of the readers here probably don't need much convincing on that subject, but you'd be surprised how many very smart people, many of them professors of computer science, are skeptical to… Continue reading Is AI really possible?

Feynman anniversary event to be held at University of South Carolina

h/t Nanowerk. In February 1960, the Caltech magazine Engineering & Science published Feynman's "Plenty of Room," and it has been re-published ten times since then. It has become one of the best-known papers in the history of nanotechnology. The fiftieth anniversary of the initial… Continue reading Feynman anniversary event to be held at University of South Carolina