As a brain exercise I decided to try to estimate the total processing power of all the computers on the entire internet, and see whether that is enough processing power to emulate the human brain.
Since this is an estimate, I will try my best to figure it out with public data.
Here was my process:
Total Number of Computers
No one knows exactly how many computers are connected to the internet, since a single IP address can be shared by any number of PCs, but the best estimate I was able to find was here: Internet World Stats
They use population statistics and penetration data to produce the estimate. It does not include other connected devices such as gaming consoles.
Total Computers: 1,733,993,741 (1.7 Billion)
Effective Processing Power per Node
Since all the computers are separated by the internet, I chose SETI@home as a benchmark for how much processing an average node can effectively contribute.
According to BOINC stats, SETI@home has 186,250 active computers (plus many more inactive ones), and a peak processing speed of 704.507 TeraFLOPS (as of 01/17/2010).
That works out to about 3.78 GigaFLOPS per node.
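The per-node figure can be checked with a quick back-of-the-envelope calculation, using the BOINC figures quoted above:

```python
# Effective processing power per SETI@home node (figures as of 01/17/2010).
peak_teraflops = 704.507   # SETI@home peak throughput, in TeraFLOPS
active_hosts = 186_250     # active computers, per BOINC stats

# 1 TeraFLOP = 1,000 GigaFLOPS
gflops_per_node = peak_teraflops * 1_000 / active_hosts
print(f"{gflops_per_node:.2f} GigaFLOPS per node")  # ≈ 3.78
```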
Also, computers running SETI@home are probably faster on average than the average computer online, especially considering poorer countries, so let's round the average down to 3 GigaFLOPS.
3 GigaFLOPS is much lower than the peak processing power of a modern computer: an Intel Core i7 965 XE can process about 70 GigaFLOPS, while some of the newest GPU cards can process upwards of 1,000 GigaFLOPS.
This leads me to believe that tele-processing is very inefficient compared to a traditional supercomputer, though it is not without advantages.
Total Processing Power
Now, assume all 1.7 billion computers were running this hypothetical peer-to-peer processing application, and that bandwidth was not a bottleneck.
1,733,993,741 × 3 GigaFLOPS = 5,201,981.223 TeraFLOPS, or about 5.2 ExaFLOPS
In case you're wondering:
1 Peta = 1,000 Tera
1 Exa = 1,000 Peta
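Putting the numbers above together, with the unit conversions spelled out:

```python
computers = 1_733_993_741   # Internet World Stats estimate
gflops_per_node = 3         # rounded-down per-node average from SETI@home

total_gflops = computers * gflops_per_node
total_teraflops = total_gflops / 1_000        # 1 Tera = 1,000 Giga
total_exaflops = total_teraflops / 1_000_000  # 1 Exa = 1,000 Peta = 1,000,000 Tera
print(f"{total_teraflops:,.0f} TeraFLOPS ≈ {total_exaflops:.1f} ExaFLOPS")
```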
Currently the fastest supercomputer is “Jaguar” with 2,331 TeraFLOPS. (Top 500 November 2009)
Emulating the Human Brain
The Blue Brain Project has successfully simulated one cortical column on the Magerit supercomputer (about 100 TeraFLOPS), and the human brain has an estimated 1,000,000 such columns.
So with 5.2 ExaFLOPS, one could emulate 52,000 cortical columns, about 1/20th of what is needed for a full human brain.
To emulate the entire human brain you would need approximately 100 ExaFLOPS.
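The column arithmetic works out as follows, taking the Blue Brain figure of roughly 100 TeraFLOPS per cortical column at face value:

```python
teraflops_per_column = 100    # ~100 TeraFLOPS to simulate one cortical column
columns_in_brain = 1_000_000  # estimated cortical columns in a human brain
available_exaflops = 5.2      # aggregate internet estimate from above

available_teraflops = available_exaflops * 1_000_000  # 1 Exa = 1,000,000 Tera
columns_emulatable = available_teraflops / teraflops_per_column
print(f"{columns_emulatable:,.0f} columns")  # 52,000

fraction = columns_emulatable / columns_in_brain
print(f"{fraction:.1%} of a brain")  # ~5%, roughly 1/20th

needed_exaflops = columns_in_brain * teraflops_per_column / 1_000_000
print(f"{needed_exaflops:.0f} ExaFLOPS needed for the full brain")  # 100
```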
That's a lot of FLOPS!
I chose the above estimation because many others rely on calculating how many operations per second the brain performs, not how many computer calculations are needed to emulate the brain. It takes many more computer computations to emulate one brain computation; just as any hardware emulator is inefficient, a brain emulator is very inefficient.
For some more notes see: http://www.smartcomputing.com/articles/2002/s1302/39s02/39s02.pdf
After reading more about the Blue Brain Project: they are not only emulating a cortical column but also analyzing and visualizing that data, which is not necessary for an independent AI. That, coupled with the communication inefficiencies of tele-processing, makes me suspect there is a lot of room for improvement.
Henry Markram, in his recent TED talk, thinks that a human brain could be emulated within 10 years.
I agree with his estimate; given Moore's law, 10 years seems like an achievable goal.
If you are interested in the subject you may also like Ray Kurzweil's research and books.
More likely, a successful AI would be an efficient emergent-type hive AI, where independent nodes contribute to a beneficial goal through independent actions. This would not look like a traditional AI to us; the patterns would probably be too numerous and vague to track accurately.
This was meant as a thought experiment; if you have any comments, please leave them. I will revise this article as I see fit.