For the last 50 years or so, the land of computers has been ruled by Moore’s Law, but that time is coming to an end.

TL;DR: Moore’s Law is dying, but instead of going smaller with transistors for computing power, we will go big with scale, AI, and quantum computers.

For those of you unfamiliar with Moore’s Law, it was an observation made in 1965 by Gordon Moore that the number of transistors in an integrated circuit would double approximately every two years. Moore, who went on to co-found Intel, was quite prescient in this declaration.

Intel released its first microprocessor, the 4004, in 1971, and it contained approximately 2,300 transistors. Fast forward to today, where a quad-core Skylake chip from Intel has approximately 1.75 billion transistors. Obviously, the last 45 years have seen amazing gains in the engineering and manufacturing processes involved in making chips for computing devices, so much so that the supercomputers of 30 years ago are the $500 smartphones we carry in our pockets.

[Chart: transistor counts doubling over time, illustrating Moore’s Law. Source: Wikipedia]
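As a quick sanity check on those numbers, we can back out the doubling period they imply. This is a minimal sketch using the approximate figures quoted above:

```python
import math

# Approximate transistor counts quoted above
t_1971 = 2_300           # Intel 4004 (1971)
t_2015 = 1_750_000_000   # quad-core Skylake (2015)

years = 2015 - 1971

# Solve t_2015 = t_1971 * 2**(years / period) for the doubling period
doublings = math.log2(t_2015 / t_1971)
period = years / doublings

print(f"{doublings:.1f} doublings over {years} years")
print(f"implied doubling period: {period:.2f} years")
```

That works out to one doubling roughly every 2.25 years, remarkably close to Moore’s two-year prediction holding for over four decades.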

This growth in transistor counts has transformed our world. Between the rise of computers, then smartphones and high-speed data networks, the world is a different place. Communication costs have plummeted. We are more connected globally than ever before. Opportunities have expanded for millions and millions of people. The economic and social benefits of this have been tremendous, and the scope of its impact on society has been breathtaking, from the inane tweets of Kardashians to the Arab Spring.

So Moore’s Law has been a huge success; however, it is more of an observation than a physical law. Newton’s second law of motion, F = ma, holds (almost always) throughout the Universe. Moore’s Law is not that kind of law, and its time is coming to an end. The amazing advances in shrinking chip features will soon be a thing of the past. Without Moore’s Law, which helped create the economic engine of the Information Age, does that mean the growth it fueled will disappear? Of course not, at least in terms of computing power. Instead of coming from the doubling of transistors on a chip, computational growth will now come from other places. First off, to the ancient game of Go!

The game of Go is a 2,500-year-old game from China, played with black and white stones on a 19x19 grid. The rules are relatively simple, but the size of the board makes it an exceedingly complex game. Chess is complex too, but its pieces have limited moves and varying strengths, so it is somewhat easier to determine the relative value of certain moves and use brute-force computation to analyze the possible outcomes on the relatively smaller 8x8 chess board. In Go, the moves are much more nuanced, and players take years and years of play to recognize the patterns that make a successful player. In computing terms, chess is a problem solvable by applying computation, as Deep Blue demonstrated back in the 90s when it (nerd-)famously beat world chess champion Garry Kasparov.

[Image: a game of Go in progress]
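To put “exceedingly complex” into numbers, here is a rough back-of-the-envelope comparison of the two game trees. The branching factors (legal moves per turn) and game lengths used are commonly cited approximations, not exact figures:

```python
import math

# Commonly cited approximations: average legal moves per turn
# (branching factor) and average game length in plies
chess_branching, chess_plies = 35, 80
go_branching, go_plies = 250, 150

# Game-tree size is roughly branching ** plies; compare the exponents
chess_exp = chess_plies * math.log10(chess_branching)
go_exp = go_plies * math.log10(go_branching)

print(f"chess game tree: ~10^{chess_exp:.0f}")  # ~10^124
print(f"go game tree:    ~10^{go_exp:.0f}")     # ~10^360
```

Roughly 10^360 possible lines of play is far beyond anything a brute-force search could ever enumerate, which is why Deep Blue’s approach doesn’t transfer.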

The approach used by Deep Blue, though, wouldn’t work with Go, so it has always been considered an artificial intelligence problem more than a CPU-intensive problem. The ability to beat the Go equivalent of a grandmaster was thought to be at least a decade away. So this March (2016), much of the computing world was amazed to see Google’s AlphaGo (developed by Google DeepMind) beat Korean champion Lee Se-dol four games to one in a five-game match. You can read about the matches at The Verge.

We learn two things from this. First, we can replace the previous decades of massive growth in transistors on a chip with massive growth in computing resources at scale. Google’s DeepMind team used a cluster of computers to bring massive computational resources to bear on a very difficult problem. As connectivity becomes more pervasive and network speeds grow, these pooled computing resources can be used to solve complex problems without needing more CPU resources on our devices.
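As a toy illustration of computing at scale, the sketch below fans independent Monte Carlo game simulations out across worker processes using Python’s multiprocessing module. AlphaGo’s actual distributed setup was far more sophisticated; the point here is only that this kind of search workload spreads naturally across many cores or machines rather than demanding a faster single chip (the rollout function is a hypothetical stand-in for a real game simulation):

```python
import random
from multiprocessing import Pool

def rollout(seed: int) -> float:
    """Simulate one game to the end and report a win (1.0) or a
    loss (0.0). A coin flip stands in for a real game engine here."""
    rng = random.Random(seed)
    return float(rng.random() < 0.5)

def estimate_win_rate(n_rollouts: int = 100_000, workers: int = 8) -> float:
    # Each rollout is independent, so the work fans out cleanly; for
    # an expensive real simulation, doubling the workers roughly
    # halves the wall-clock time.
    with Pool(workers) as pool:
        results = pool.map(rollout, range(n_rollouts))
    return sum(results) / len(results)

if __name__ == "__main__":
    print(f"estimated win rate: {estimate_win_rate():.3f}")
```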

The second thing we learn from DeepMind’s victory against a Go champion is that there are other solutions to hard problems. While raw computing power could win a game of chess, it would never win at something like Go. Instead of the brute-force approach that was so effective for Deep Blue, different solutions were needed. Many years of research came to fruition in AlphaGo, bundling together pattern recognition, deep learning, and neural networks. The program was taught to learn by “watching” old games played by Go champions, and then by constantly playing itself. The more it learned, the better it became; so much so that during the actual match, commentators were shocked by some of the moves the AI made, and one move so took the human player aback that he needed a very long break. This kind of growth in alternative approaches to problem solving can, again, provide greater computational power without shrinking transistors any further.
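AlphaGo’s internals aren’t reproducible in a few lines, but the self-play idea itself is easy to demonstrate on a toy game. The sketch below uses tabular Q-learning, a far simpler technique than AlphaGo’s deep neural networks, on the game of Nim (take 1 to 3 stones per turn; whoever takes the last stone wins). The agent plays only against itself, and the more games it plays, the closer it gets to optimal play:

```python
import random
from collections import defaultdict

# Q[(stones_left, take)]: value of taking `take` stones, from the
# perspective of the player about to move
Q = defaultdict(float)
ALPHA, EPSILON = 0.5, 0.1

def legal_moves(stones):
    return [t for t in (1, 2, 3) if t <= stones]

def choose(stones):
    if random.random() < EPSILON:                  # explore
        return random.choice(legal_moves(stones))
    return max(legal_moves(stones), key=lambda t: Q[(stones, t)])

def self_play_game(start=21):
    stones = start
    while stones > 0:
        take = choose(stones)
        remaining = stones - take
        if remaining == 0:
            target = 1.0   # we took the last stone: a win
        else:
            # The opponent moves next, so our value is the negation
            # of the opponent's best value (negamax-style update)
            target = -max(Q[(remaining, t)] for t in legal_moves(remaining))
        Q[(stones, take)] += ALPHA * (target - Q[(stones, take)])
        stones = remaining

for _ in range(20_000):
    self_play_game()

# Optimal Nim play leaves the opponent a multiple of 4 stones
for stones in (5, 6, 7):
    best = max(legal_moves(stones), key=lambda t: Q[(stones, t)])
    print(f"with {stones} stones left, take {best}")  # expect 1, 2, 3
```

No knowledge of Nim strategy goes in; the multiple-of-4 rule emerges purely from self-play. That is the same dynamic, scaled enormously, that let AlphaGo surprise professional commentators.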

Lastly, another future path to computational growth is quantum computing. Quantum computing requires a bit of a deeper dive (future blog post!), but suffice it to say, the world of qubits and the superposition principle provides orders and orders of magnitude increases in speed for some computational cases.
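As a teaser for that future post, one well-known example gives a feel for the scale: Grover’s search algorithm finds an item in an unsorted collection of N entries in roughly (π/4)·√N quantum steps, versus about N/2 lookups on average for a classical linear scan:

```python
import math

# Classical linear scan: ~N/2 expected lookups
# Grover's algorithm:    ~(pi/4) * sqrt(N) iterations
for n in (10**6, 10**9, 10**12):
    classical = n / 2
    quantum = math.pi / 4 * math.sqrt(n)
    print(f"N = {n:.0e}: classical ~{classical:.1e}, "
          f"quantum ~{quantum:.1e}, speedup ~{classical / quantum:,.0f}x")
```

At a trillion entries, that is a speedup of over six hundred thousand times, and algorithms like Shor’s factoring offer even more dramatic gains for their particular problems.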

So… Moore’s Law is going away. We are reaching the literal physical limits of how densely we can pack transistors on a chip. We are also moving to a world where lower power usage, which equates to less chip performance, is much more preferred than in previous decades. This all means different computing platforms will help a variation of Moore’s Law continue: computing power will keep increasing, but it will be distributed into massive-scale cloud infrastructure, using novel approaches like neural networks and quantum computing, coupled with speedy networks, to make sure today’s supercomputers are in our pockets in 30 years.

This blog post originally appeared at Skyline Technologies.