Google Aims to Beat the Brain

LAKE WALES, Fla. — Google’s artificial-intelligence guru, Demis Hassabis, has unveiled the company’s grand plan to solve intelligence by unraveling the algorithms, architectures, functions, and representations used in the human brain. But is that all there is to it?

No one disputes the basics of artificial neural networks, namely that brain neurons are connected by synapses whose “weights” grow stronger (learn) the more they are used and atrophy (forget) when seldom used. Europe’s Blue Brain Project, for instance, hopes to simulate on a supercomputer even the smallest details of how the brain works, both to unravel the mechanisms behind maladies such as Parkinson’s and Alzheimer’s and to build AI systems.
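That use-it-or-lose-it principle can be stated in a couple of lines of code. The toy update below is only an illustration of the idea, not anything from Blue Brain or DeepMind, and its learning and decay rates are arbitrary:

```python
# Toy Hebbian-style update: a synaptic weight strengthens when its two
# neurons are active together, and slowly atrophies when seldom reinforced.
# The learning rate and decay are arbitrary illustrative values.
def update_weight(w, pre_activity, post_activity, lr=0.1, decay=0.01):
    w = w + lr * pre_activity * post_activity  # strengthen with correlated use
    w = w * (1.0 - decay)                      # forget when rarely reinforced
    return w
```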

If all you want is AI cast in silicon (in silico, as opposed to in vivo — living organisms), however, engineers can get by with a mere understanding of the algorithms, architectures, functions, and representations used in brains, according to Hassabis, CEO of DeepMind Technologies, which Google acquired in 2014.

Demis Hassabis

“From an engineering perspective, what works is ultimately all that matters. For our purposes, then, biological plausibility is a guide, not a strict requirement,” Hassabis and his co-authors argue in “Neuroscience-Inspired Artificial Intelligence,” published in the peer-reviewed Cell Press journal Neuron. “What we are interested in is a systems neuroscience-level understanding of the brain, namely the algorithms, architectures, functions, and representations it utilizes.

“By focusing on the computational and algorithmic levels, we gain transferrable insights into general mechanisms of brain function, while leaving room to accommodate the distinctive opportunities and challenges that arise when building intelligent machines in silico.”

For example, during sleep the hippocampus replays and re-associates the particularly successful learning experiences that have happened during each day, enabling long-term memory to secure lessons learned, even if only from a single instance. Simple machine-learning algorithms, by contrast, can wash out single learning instances with a clutter of insignificant details. But Google claims machine-learning algorithms can be constructed to mimic the functions in real brains.

“Experiences stored in a memory buffer can not only be used to gradually adjust the parameters of a deep network toward an optimal policy, but can also support rapid behavioral change based on an individual experience,” Hassabis et al. state.
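In engineering terms, the mechanism in that quote is an experience-replay buffer. A minimal sketch follows; the class name, capacity, and uniform sampling are illustrative assumptions rather than DeepMind’s actual design:

```python
import random
from collections import deque

# Minimal experience-replay buffer. Transitions are stored as
# (state, action, reward, next_state) tuples; the learner samples random
# mini-batches so that a single vivid experience can be revisited many
# times instead of being washed out by the stream of new data.
class ReplayBuffer:
    def __init__(self, capacity=100_000):
        self.buffer = deque(maxlen=capacity)  # oldest experiences fall off

    def add(self, state, action, reward, next_state):
        self.buffer.append((state, action, reward, next_state))

    def sample(self, batch_size=32):
        # Uniform sampling; prioritized variants favor "surprising"
        # high-error experiences, closer to hippocampal replay.
        return random.sample(list(self.buffer), min(batch_size, len(self.buffer)))
```

Batches drawn from such a buffer adjust the network gradually, while the stored transitions themselves can also drive rapid behavioral change, the dual role the authors describe.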

Because learning algorithms tend to overwrite existing knowledge with new knowledge, getting neurocomputers to learn multiple tasks in sequence has been a tough nut for engineers to crack. The authors note that recent research has exploited synergies between neuroscience and engineering to address that conundrum. Neuroscientists’ discovery of synaptic lability (variable rates of change) in real brain synapses gave AI engineers a new tool: learning algorithms that reduce the lability of weights important to earlier tasks, so that later tasks cannot overwrite them.

“Findings from neuroscience have inspired the development of AI algorithms that address the challenge of continual learning in deep neural networks by implementing a form of ‘elastic’ weight consolidation, which acts by slowing down learning in a subset of network weights identified as important to previous tasks, thereby anchoring these parameters to previously found solutions,” the authors state. “This allows multiple tasks to be learned without an increase in network capacity, with weights shared efficiently between tasks with related structure.”
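In code, that consolidation scheme boils down to a quadratic penalty that anchors important weights to their values from the previous task. The sketch below is a generic rendering of the idea, not DeepMind’s implementation; the importance estimates (the "fisher" array) and the lambda value are illustrative assumptions:

```python
import numpy as np

# Generic sketch of an elastic-weight-consolidation-style penalty.
# "fisher" estimates how important each weight was to the previous task;
# important weights are anchored to their old values, slowing learning
# only where it would damage previously acquired knowledge.
def ewc_penalty(weights, old_weights, fisher, lam=0.4):
    return 0.5 * lam * np.sum(fisher * (weights - old_weights) ** 2)

def total_loss(task_loss, weights, old_weights, fisher, lam=0.4):
    # Ordinary loss on the new task plus the anchoring term: the network
    # shares capacity across tasks instead of growing for each new one.
    return task_loss + ewc_penalty(weights, old_weights, fisher, lam)
```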

Hassabis et al. note that “much work is still needed to bridge the gap between machine- and human-level intelligence. In working toward closing this gap, we believe ideas from neuroscience will become increasingly indispensable.” Citing engineers’ success in enabling AI to learn multiple tasks in sequence by reproducing the biological mechanism, the authors call for neuroscientists and AI engineers to join ranks to solve “what is perhaps the hardest challenge for AI research: to build an agent that can plan hierarchically, is truly creative, and can generate solutions to challenges that currently elude even the human mind.”

Not everybody agrees, however, that to crack the code of human intelligence we need merely understand the brain’s algorithms, architectures, functions, and representations. The counterargument is that the brain’s “code” is the same for all living intelligence in the universe, just as chemistry turned out to be the universal code the alchemists were searching for. On this view, cracking the brain’s code would yield a body of intertwined universal principles akin to those of chemistry and physics.

Countering Google’s argument, neuroscientist Pascal Kaufmann, founder of Starmind International, says the brain’s code is not an algorithm, because the brain is not a computer.
Source: Starmind

“We need to crack the brain’s code for a genuine understanding of intelligence, and computer software alone is not sufficient. The brain is just likened to a computer because of the age in which we live, just as it was likened to a steam engine in prior eras,” said neuroscientist and tech entrepreneur Pascal Kaufmann, founder of AI software company Starmind International AG (Küsnacht, Switzerland). “Just as physics is the code of all the physical phenomena in the universe, the brain’s code will similarly be based on principles that are universal in nature.

“The same principles are applied again and again in nature — the way a tree’s branchings are very similar to veins and arteries in the body,” Kaufmann said. “You just have to ask the right questions.”

— R. Colin Johnson, Advanced Technology Editor, EE Times

Related articles:

  • 'E-Brain' to Pilot Air Force Jets
  • Is Elon Musk's Brain Cap Viable?
  • Cray Rolls Clustered Supercomputers for AI
  • Intel Banks on Artificial Intelligence
  • Cray Sets Deep Learning Milestone
