Last week at AGI-09, I chaired a one-day workshop on the future of AGI. (“AGI” means Artificial General Intelligence, which is essentially what the term “AI” meant before 1980: the attempt to build a system that would be the equivalent of a human in its thinking abilities, displaying a robust ability to think, converse, exhibit common sense, learn by observing or being told in natural language, and so forth.)

The first session was about economics. We had James Albus, noted roboticist, and Robin Hanson, whose analyses have been noted here before.

The bottom line of the session, if I may presume to sum it up, was pretty straightforward. If you have a human-level AI based on computer technology, the cost of doing what it can do will begin to decline at Moore's Law rates. Even if an AI costs a million dollars in, say, 2020, it'll cost a thousand dollars in 2030 and one dollar in 2040 (give or take a decade). Why hire a human when you can buy the equivalent for a dollar? To put it as simply as possible, you aren't going to be able to make a living by working. You're going to need to have some capital. Everybody's going to need some capital.
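For the arithmetically inclined: those numbers correspond to the cost halving roughly every year (2^10 ≈ 1024, so about a thousandfold drop per decade). Here's a minimal sketch of that decline; the one-year halving time is my assumption, chosen only to match the million/thousand/one figures above:

```python
def ai_cost(year, base_year=2020, base_cost=1_000_000, halving_years=1.0):
    """Cost of human-equivalent AI work, assuming a steady exponential
    decline (Moore's-Law style) with the given halving time."""
    return base_cost * 0.5 ** ((year - base_year) / halving_years)

# A million dollars in 2020 becomes roughly a thousand in 2030
# and about a dollar in 2040 under this assumption.
for y in (2020, 2030, 2040):
    print(y, ai_cost(y))
```

Nudge the halving time to eighteen months or two years and the dates shift by a decade or so, which is exactly the "give or take a decade" hedge above.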

Albus has worked out, in great detail, a scheme for how this might happen, starting from the current situation. You don't have to agree with him that the scheme is likely or even feasible, but it's pretty clear that some basic economic shift is going to be necessary (and I can think of a lot worse things than Albus' idea).

One way or the other, the human race is going to take an early retirement in the next few decades. I find this a much better way of thinking about what’s coming up than “singularity”. The term “singularity” was specifically created to reflect a notion that there was an event horizon associated with advancing AI. But whether or not this is true of the far future, some distinct profiles of the near future are clearly visible. And from what we can see of it, it is going to make a huge difference what we do now.

So, I think, we need a better term than “singularity” to describe what’s coming up. It should reflect the fact that there are indeed some things we can tell about what will be happening. It should, if possible, reflect the fact that this will be a major liberating event for the human race — no longer need we spend our lives in forced drudgery, since we have built machines to do the necessary work. But it should also reflect the fact that we need to be planning for it.

Chris Peterson, who also came up with the term “open source,” suggested “early retirement.” I can’t think of a better one. Can you?