A revolutionary development in artificial intelligence (AI) is set to see the complexity of one of the most common AI methods increase dramatically. And if this AI breakthrough reaches its potential, it would bring vastly improved AI capabilities to industry, medicine and science.
The AI breakthrough, developed by data science professors at the University of Derby and published today in the journal Nature Communications, focuses on artificial neural networks (ANN), one of the most commonly used AI methods.
ANN are arguably responsible for many of the advances known as the AI revolution.
They are deployed in many of the products we use every day, from software with intelligent capabilities, such as Google’s Photos app, to virtual assistants such as Apple’s Siri. ANN are also used extensively by companies such as Amazon, Facebook and Netflix across many aspects of their business. They are also driving advances in healthcare.
“Artificial neural networks are at the very heart of the artificial intelligence revolution that is shaping every aspect of society and technology,” said Antonio Liotta, professor of data science and director of the Data Science Research Centre at the University of Derby.
“They have led to major breakthroughs in various domains, including speech recognition and computer vision.”
However, as revolutionary as ANN have so far proved to be, they remain limited in their capabilities.
“The networks we have been able to handle so far are nowhere near the capacity of the human brain – made up of billions of neurons,” said Liotta.
“The very latest supercomputers would struggle with a 16 million neuron network the size of a frog’s brain, while it would take more than a dozen days for a powerful desktop computer to process a mere 100 million neuron network.”
AI breakthrough: transforming the structure of artificial neural networks with sparse evolutionary training
The AI breakthrough that Liotta and his colleagues have developed takes the form of a method known as sparse evolutionary training. This involves restructuring the connections of artificial neural networks so that they more closely resemble organic neural networks.
At present, neural networks start out with layers that are fully connected to each other, and are then trained to achieve their intended purpose. With sparse evolutionary training, the layers do not need to be fully connected: the network begins with far sparser connections and is then trained using the team’s newly developed algorithm.
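The difference between a fully connected layer and a sparsely initialised one can be sketched in a few lines of NumPy. This is a minimal illustration of the idea, not the researchers’ published code; the function name and the density value are illustrative assumptions.

```python
import numpy as np

def sparse_layer(n_in, n_out, density=0.1, rng=None):
    """Initialise a layer that keeps only a random fraction of its
    connections, in the spirit of sparse evolutionary training.
    (Illustrative sketch: names and the density value are assumptions.)"""
    rng = rng or np.random.default_rng(0)
    weights = rng.normal(0.0, 0.1, size=(n_in, n_out))
    mask = rng.random((n_in, n_out)) < density  # True = connection exists
    return weights * mask, mask

# A 784-to-300 layer where roughly 90% of the possible
# connections never exist, so they cost no compute or memory.
w, mask = sparse_layer(784, 300, density=0.1)
print(round(mask.mean(), 2))  # fraction of connections kept, near 0.1
```

Because the masked weights are exactly zero and stay zero, a sparse implementation only needs to store and update the surviving connections, which is where the computational saving comes from.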
Using this sparse evolutionary training algorithm, the researchers were able to significantly increase the number of parameters involved – meaning networks with far more neurons can be created using the same computing power.
This approach also reduces the computing power required to train neural networks of current sizes, meaning developers can produce AI networks on considerably less powerful machines.
Creating artificial neural networks to rival the human brain
In time, the researchers believe this breakthrough will allow the creation of artificial neural networks made up of billions of neurons, making them as complex as, or close to, the human brain.
However, it is important to note that this would not be the same as creating an artificial human brain. Neural networks of this complexity would not function like a human brain; they would merely have similar computational power.
Nevertheless, this AI breakthrough would be hugely significant, and Liotta and his colleagues’ early work shows enormous potential. They have already demonstrated sparse evolutionary training’s potential on a host of different industry datasets, showing it will be immediately beneficial to data scientists in both commercially minded and academic environments.
“We have benchmarked our approach on 15 datasets from different problem domains including genetics, biology, natural language processing, imaging and particle physics,” said Liotta.
“This work represents a major breakthrough in fundamental artificial intelligence and has immediate practical implications in industry and academia alike, enabling the analysis of vast sets of data, beyond what is currently possible.”
Prototype software for the sparse evolutionary training method can be accessed via GitHub.