So I may have mentioned Moore's Law in one of my recent articles. Since I have built an old computer, I think it would be wise to consider a few things about it, and about computing in general.
It is said that silicon fabrication processes can't scale down much further, signaling an end to Moore's Law. But that's OK, because we can finally reap its benefits en masse. We have supercomputers in our pockets, and even better ones in our homes that can show very pretty things. Computers have become cheap, ubiquitous, powerful, and efficient. If processing power stagnates for a bit before the next breakthrough, I suppose we will just have to live with it. The end of Dennard scaling means that if we want more power, we will have to learn to use multi-core CPUs properly.
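What "using multi-core CPUs properly" looks like in practice is splitting a job into independent chunks and farming them out to separate workers. Here is a minimal sketch in Python (my choice of language, not the article's), using only the standard library; `parallel_sum_of_squares` is a toy function invented for illustration:

```python
# Toy example: summing squares across all available cores.
# A process pool is used because, in CPython, separate processes
# (unlike threads) can run CPU-bound work on multiple cores at once.
from concurrent.futures import ProcessPoolExecutor
import os

def partial_sum(bounds):
    """Sum of i*i over the half-open range [lo, hi)."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_of_squares(n, workers=None):
    workers = workers or os.cpu_count() or 1
    step = n // workers
    # Split [0, n) into one contiguous chunk per worker;
    # the last chunk absorbs any remainder.
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else n)
              for i in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    print(parallel_sum_of_squares(1_000_000))
```

The hard part, of course, is not the pool plumbing but finding work that actually splits into independent chunks like this; that is exactly why most desktop software still leaves extra cores idle.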
Even if we do, most of a processor might still be doing nothing at all, because it will be mostly powered off due to power and heat concerns (the so-called "dark silicon" problem). It's a bitter pill to swallow. Imagine most of a thousand-core CPU being off. If all of those are general-purpose CPU cores, something failed horrifically in the design process. If they are hundreds of different specialized cores, then it would be reasonable. Maybe heterogeneous processors will save us all? AMD and Intel already have that idea: bolting a GPU onto a CPU.
Lately, there has been much ado about the "Post PC" era. Suddenly, no one's talking about how awesome PCs are; everyone's hopped up on the Apple/Android juice, and everyone is saying "Goodbye, PC". Marketing jerks are hyping up the claim that no one is buying PCs anymore, never mind that PCs are still selling. They point to the fact that everyone is buying these smaller devices in huge numbers instead. In case you haven't realized, today's PCs are awesome, which is why you don't have to buy a new one every two or three years. That's why PC sales don't show the exponential hockey-stick curve that everyone loves these days.
Last year, there was a big scare for a few days that Intel would stop selling socketed CPUs. Thankfully, this was not accurate. But it looks like that future might happen anyway, whether we like it or not. In the latest teardown of the trash can Mac Pro, everything is custom built: soldered parts, no standard graphics cards, a funny-shaped heatsink, and minimal upgradability. While the circuitry of CPUs has gotten smaller over the decades, the interfaces and sockets used to connect them have not shrunk proportionately. It looks like we still have plenty of room left in conventional modular PC architectures, so we won't have to solder everything down just yet.
This has led me to think of another possibility. In 2033 (AKA a long time from now), when the GoogARMIPS architecture has dethroned x86 as the dominant CPU architecture, what will happen? Will everyone happily surrender 30 years of computing culture? If everyone did, it would be like English suddenly vanishing, along with everything that was made with it. Hundreds of years of culture, economics, and history would suddenly be gone. In 2034, will I be building a 2013 PC to play 2015 PC games?
Until then, enjoy the end of Moore's Law. We literally have boatloads of cool stuff, at least until none of it comes from China anymore.