The turf we usually choose for this contest has always favored computers, possibly because the question being asked is not really "who is smarter?"
Randall Munroe argues that questions of perception, such as describing what is happening in a picture, are better answered by mothers than by computers. One example he gives: when he sent a picture to his mother and asked what she thought was going on in it, she said, "It was the kid who knocked over the vase, and the cat was merely investigating." He, however, had some other alternatives in mind:
- It was the cat who knocked over the vase.
- The cat went berserk and chased the kid, who climbed the dresser by the rope in self-defense.
- The cat was playing peekaboo with the kid, so it jumped out of the vase while knocking it over.
- A wildcat invaded the home, and humanity was saved with vase-missiles.
- The rope holding the vase together was pulled, and the cat was preparing it for surgery.
No computer can state what his mother said with such certainty, nor can it imagine all the other possibilities. This kind of intelligence is not expected of a computer, because computers simply are not programmed that way.
The argument here is that computers rely on humans to 'teach' them how to think. To make the comparison fair, let's talk about something computers can do and humans can too: running calculations. Suppose humans attempted the benchmark calculations of a computer chip by hand, and by hand I mean pencil and paper. A mobile phone with a midrange processor can calculate about 70 times faster than the entire human population combined, and for a high-end desktop chip that factor rises to roughly 1,500.
If you are wondering when in history a single ordinary desktop PC first outperformed the collective computing power of humanity, we can tell you: it was 1994. At the time, 5.5 billion humans could collectively process about 65 million instructions per second (MIPS), while Intel's new Pentium chip was delivering 70-80 MIPS on its own. And what if we combine the processing power of every computer in the world? The contest is over.
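As a back-of-envelope check, the article's figures imply a per-human rate of roughly one instruction every 90 seconds (65 MIPS spread across 5.5 billion people). Here is a minimal sketch of that arithmetic; the present-day population of ~8 billion used for the phone and desktop multiples is my assumption, not a figure from the article:

```python
# Back-of-envelope check of the 1994 crossover, using the article's figures.
# Assumption: a human with pencil and paper executes roughly one chip-style
# instruction every 90 seconds (implied by 65 MIPS across 5.5 billion people).

HUMAN_IPS = 1 / 90                 # instructions per second, per person
POPULATION_1994 = 5.5e9

humanity_mips = POPULATION_1994 * HUMAN_IPS / 1e6
print(f"All of humanity (1994): ~{humanity_mips:.0f} MIPS")   # ~61 MIPS

pentium_mips = 75                  # midpoint of the 70-80 MIPS range
print(f"Pentium vs. humanity: {pentium_mips / humanity_mips:.1f}x")

# The phone and desktop multiples quoted above, applied to a present-day
# population of ~8 billion (an assumption, not a figure from the article):
humanity_now_mips = 8e9 * HUMAN_IPS / 1e6
print(f"Midrange phone:   ~{70 * humanity_now_mips:,.0f} MIPS")
print(f"High-end desktop: ~{1500 * humanity_now_mips:,.0f} MIPS")
```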
Of course, just as you cannot put every human to work calculating, not every microchip is designed for general-purpose computation. So, to avoid offending any of the chips, let us instead assume that the world's combined computing power is proportional to the world's total number of transistors. This assumption is fairer than it sounds, because processors from the 1980s and processors from today both work out to roughly the same ratio of about 30 transistors per instruction per second. By this measure, the combined computing power of all the world's computers overtook the combined computing power of all humans in 1977.
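That conversion is simple enough to write down. A sketch, where the 30-transistors-per-(instruction/second) ratio comes from the article, the 1977 population of ~4.2 billion is a historical estimate, and the 1977 world transistor count is a purely hypothetical placeholder chosen to illustrate the comparison:

```python
# Convert a world transistor count into an effective computing rate,
# using the article's ratio of ~30 transistors per instruction per second.

TRANSISTORS_PER_IPS = 30

def mips_from_transistors(transistor_count: float) -> float:
    """Effective MIPS implied by a total transistor count."""
    return transistor_count / TRANSISTORS_PER_IPS / 1e6

# Hypothetical illustrative figure: suppose ~2e9 transistors existed in 1977.
world_1977 = mips_from_transistors(2e9)            # ~67 MIPS

# Humanity in 1977: ~4.2 billion people at one instruction per 90 seconds.
humanity_1977 = 4.2e9 * (1 / 90) / 1e6             # ~47 MIPS

print(f"Computers (1977): ~{world_1977:.0f} MIPS")
print(f"Humans    (1977): ~{humanity_1977:.0f} MIPS")
print("Computers ahead!" if world_1977 > humanity_1977 else "Humans ahead!")
```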
So far I have been more than fair to the computers. It's time I stood with my own kind, the humans. Bring the complexity of the human brain into the light, and it exceeds any supercomputer in sophistication. Various studies have tried to work out how a supercomputer could imitate a brain at the granularity of a single synapse. One such project, run on Japan's K supercomputer in 2013, suggested that it would take about 10^15 transistors to replace the functionality of one human brain; by that measure, a single human brain equals the combined complexity of every logic circuit in the world in 1988. At that pace, the computers might only catch up to human complexity somewhere around the year 2036.
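To see why a date like 2036 is plausible, note what the crossover requires: the world's transistor count, equal to one brain's worth (10^15) in 1988, has to grow to match all of humanity's brains combined. A sketch under loudly-flagged assumptions of my own, namely Moore's-law-style doubling of the world's transistors every ~18 months and a population of about 7 billion brains:

```python
import math

# When does the world's digital complexity match all human brains combined?
# Assumptions (mine, not the article's): world transistor counts double
# every ~18 months, and there are ~7 billion human brains to match.

BASELINE_YEAR = 1988     # world logic circuits ~= one brain (from the article)
DOUBLING_YEARS = 1.5     # assumed Moore's-law-style growth
BRAINS = 7e9             # assumed population

# Brain-equivalents needed: 7e9, i.e. log2(7e9) ~= 32.7 doublings.
doublings = math.log2(BRAINS)
crossover = BASELINE_YEAR + doublings * DOUBLING_YEARS
print(f"Estimated crossover: ~{crossover:.0f}")   # ~2037, near the article's 2036
```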
So how do we decide? When the computer asks the humans to simulate its operations, they manage about 0.01 MIPS. When the humans ask the computer to simulate the neuron-by-neuron firing of a brain, it needs something like 50,000,000,000 MIPS. If we assume our computer programs are as inefficient at simulating human brain activity as human brains are at simulating computer chip activity, then maybe a fairer brain-power rating is the geometric mean of those two numbers. The combined figure suggests that human brains clock in at about 30,000 MIPS, roughly the same as the computer you are using to read this story. It also suggests that the Earth's digital complexity overtook its human neurological complexity in 2004.
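The geometric mean itself is one line of arithmetic. Using the two ratings above, the exact result comes out a little over 20,000 MIPS, the same ballpark as the rounded ~30,000 figure:

```python
import math

# Geometric mean of the two mutually unflattering benchmarks above.
human_simulating_chip = 0.01            # MIPS: humans running chip ops by hand
chip_simulating_brain = 50_000_000_000  # MIPS: a computer simulating a brain

fair_rating = math.sqrt(human_simulating_chip * chip_simulating_brain)
print(f"Fair brain rating: ~{fair_rating:,.0f} MIPS")   # ~22,361 MIPS
```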
The turf and the domain shift, and so does our perspective, whenever the baseline of comparison changes. We can argue about apples and oranges all day long. Let's face it: apples are good for you, and oranges just taste better.