Tuesday, September 6, 2016

Learning about learning

I recently attended the INCF (International Neuroinformatics Coordinating Facility) short courses and congress in Reading.  It was quite wide-ranging, with some people working primarily on MRI, others on modelling of synaptic plasticity and learning algorithms, and quite a few other topics besides.

One area I was not really aware of before the conference was neuromorphic computing, which is about designing and building computing hardware based on principles of how the brain does computation.  At the INCF short courses, this was presented by Giacomo Indiveri, and I subsequently looked at an introductory article by Steve Furber, who has led the SpiNNaker project:

http://digital-library.theiet.org/content/journals/10.1049/iet-cdt.2015.0171

I am quite impressed by the dedication of people working in this field.  Steve Furber says in his article that SpiNNaker has been 15 years in conception and 10 years in construction.  It is enabling fast simulation of large-scale neural models such as Spaun, which can perform simple cognitive tasks like reinforcement learning and arithmetic.  On a standard computer, Spaun requires 2.5 hours of computation per second of real time, a slowdown factor of 9,000.  SpiNNaker aims to run Spaun in real time.
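To give a flavour of the kind of computation these simulators spend their cycles on, here is a minimal sketch of a single leaky integrate-and-fire neuron, the sort of unit that spiking models are built from, integrated with simple Euler steps.  The parameter values are illustrative placeholders of my own choosing, not values from Spaun or SpiNNaker.

# Minimal leaky integrate-and-fire (LIF) neuron sketch.
# Illustrative only: parameters are arbitrary, not from Spaun or SpiNNaker.

dt = 0.001          # time step (s)
tau = 0.02          # membrane time constant (s)
v_rest = 0.0        # resting potential (arbitrary units)
v_thresh = 1.0      # spike threshold
v_reset = 0.0       # potential after a spike
i_input = 1.2       # constant input drive

v = v_rest
spike_times = []
for step in range(1000):  # 1 second of simulated time
    # Leaky integration: decay toward rest plus input drive
    v += (dt / tau) * (v_rest - v + i_input)
    if v >= v_thresh:     # threshold crossing -> emit a spike and reset
        spike_times.append(step * dt)
        v = v_reset

print(len(spike_times), "spikes in 1 s of simulated time")

Now imagine millions of these units coupled by synapses, each needing an update every time step: that is the scale of workload SpiNNaker's hardware is designed to parallelise.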

In the next few years, as part of the Human Brain Project, SpiNNaker will be used for larger models, and presumably be tested on progressively more demanding cognitive tasks.  For my part, I am interested to see how large-scale neural models of biological intelligence will compare with engineered intelligence systems such as deep neural networks.

Engineered intelligence is free from the constraint of having to be faithful to biology.  This gives it a massive advantage over simulated neural models when it comes to performing tasks.  Ideas from biology have been influential in machine learning and artificial intelligence, but they have been heavily supplemented by numerical analysis and statistical computing.

At the moment many machine learning algorithms require huge amounts of computing power, so it will be interesting to see whether any new hardware emerges that can bring this down.  It would be cool if state-of-the-art machine learning algorithms that today require a supercomputer could be run on an affordable, battery-operated device.  And it will be interesting to see whether the new neuromorphic machines that are emerging drive engineers and scientists to develop learning algorithms further.
