At first glance, our soft brain seems to have nothing in common with the hard silicon chips of computer processors. Yet scientists have been comparing the two for decades. The British mathematician and logician Alan Turing put it this way in 1952: it does not matter that the brain has the consistency of cold porridge. In other words, the medium is unimportant; only the computing power matters.

The most powerful artificial intelligence (AI) systems today use a type of machine learning known as deep learning. Its algorithms are inspired by the human brain and learn by processing massive amounts of data with so-called deep neural networks. These consist of several layers of interconnected nodes, or processing units, that are reminiscent of real neurons.
At least, they are modeled on what neuroscientists knew about neurons in the 1950s. It was then that an influential neural model, the Perceptron, was developed by the psychologist Frank Rosenblatt. Since then, our understanding of the computing power of individual brain cells has improved dramatically. We now know that they are far more complex than their synthetic counterparts – but how exactly?
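To make the comparison concrete, here is a minimal sketch of the classic Perceptron unit described above: it computes a weighted sum of its inputs and "fires" if that sum crosses a threshold, a deliberately simple caricature of a biological neuron. The weights and inputs below are illustrative values, not taken from the article.

```python
def perceptron(inputs, weights, bias):
    """Return 1 if the weighted sum of inputs plus bias is positive, else 0.

    This step-function output mimics a neuron either firing or staying silent.
    """
    activation = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 if activation > 0 else 0

# Example: with these hand-picked weights, the unit behaves like a logical AND.
print(perceptron([1, 1], [0.5, 0.5], -0.7))  # both inputs active: sum = 0.3 > 0, fires (1)
print(perceptron([1, 0], [0.5, 0.5], -0.7))  # one input active: sum = -0.2, silent (0)
```

A real neuron integrates thousands of inputs through an elaborate tree of dendrites with their own nonlinear dynamics; the Perceptron collapses all of that into one weighted sum and a threshold, which is precisely the gap the article goes on to discuss.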