A new multi-disciplinary collaboration will focus on developing computing technology that mimics the human brain's ability to solve a wide variety of problems. One of the difficulties the researchers will confront is making a nanoscale material that can mimic the synapses connecting neurons by forming connections that strengthen or weaken depending on the signals passing through them. From “IBM plans ‘brain-like’ computers”, by BBC News science and technology reporter Jason Palmer:
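The behaviour described above, connections that strengthen or weaken with the signals passing through them, resembles a Hebbian-style plasticity rule. The following is a toy sketch of that idea, not IBM's actual design: a single synaptic weight grows when the two neurons it connects are active together, and slowly decays otherwise. The function name, learning rate, and decay constant are all illustrative assumptions.

```python
# Toy Hebbian-style plasticity rule (illustrative only, not IBM's design).
# The weight strengthens when pre- and post-synaptic signals coincide,
# decays passively otherwise, and is clamped to [0, 1].

def update_weight(w, pre, post, lr=0.1, decay=0.01):
    """Strengthen w when both neurons fire together; weaken it slowly otherwise."""
    w += lr * pre * post      # Hebbian strengthening: "fire together, wire together"
    w -= decay * w            # passive decay weakens unused connections
    return min(max(w, 0.0), 1.0)

# Repeated correlated activity drives the connection toward full strength.
w = 0.5
for _ in range(10):
    w = update_weight(w, pre=1.0, post=1.0)
```

Building a nanoscale material that performs an update like this physically, rather than in software, is precisely the difficulty the article describes.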
IBM has announced it will lead a US government-funded collaboration to make electronic circuits that mimic brains.
Part of a field called “cognitive computing”, the research will bring together neurobiologists, computer and materials scientists and psychologists.
As a first step in its research the project has been granted $4.9m (£3.27m) from US defence agency Darpa.
The resulting technology could be used for large-scale data analysis, decision making or even image recognition.
“The mind has an amazing ability to integrate ambiguous information across the senses, and it can effortlessly create the categories of time, space, object, and interrelationship from the sensory data,” said Dharmendra Modha, the IBM scientist who is heading the collaboration.
“There are no computers that can even remotely approach the remarkable feats the mind performs,” he said.
“The key idea of cognitive computing is to engineer mind-like intelligent machines by reverse engineering the structure, dynamics, function and behaviour of the brain.”
IBM will join five US universities in an ambitious effort to integrate what is known from real biological systems with the results of supercomputer simulations of neurons. The team will then aim to produce for the first time an electronic system that behaves as the simulations do.
The longer-term goal is to create a system with the level of complexity of a cat’s brain.
…Free from the constraints of explicitly programmed function, computers could gather together disparate information, weigh it based on experience, form memory independently and arguably begin to solve problems in a way that has so far been the preserve of what we call “thinking”.