I like to think of myself as a hardware guy. Meaning, I keep up with the latest hardware trends and specs, and I'm interested in the latest developments in hardware design. But I can't deny that my younger days are far behind me, and my idea of how hardware should be designed and function was formed mostly in the 90's and early 2000's. Acknowledging that, I've realized that most of my computing hardware experience has been under Intel's X86 domination. Even the AMD-based desktop or two I built were still X86 designs. For most of my life, X86 ruled the land in market share and, for the most part, performance. Even Apple had to give in to Intel's might back in the mid-2000's, when PowerPC couldn't keep up with Intel's manufacturing advancements.

But as we approach Apple's impending transition from Intel to their own homebrew ARM A series SoCs for their Macs, this might also spell the beginning of the end for Intel X86/X64 as the dominant architecture for mainstream computers. As Intel fumbles and bumbles to get 10nm back on schedule (it's already several cycles late), Apple is already cranking out millions of 7nm A12s for iPhones and iPads. Sure, you can make a case that you can't fairly compare Intel's process node with foundry nodes like TSMC's (which Apple uses) or GlobalFoundries', but nevertheless, Intel has been stuck on 14nm for at least three generations, and that is never a good sign.

The fact of the matter is, even without the manufacturing issues, the days of X86/X64 seem numbered. It's no secret that at ultra low voltage, Intel's Core SoCs struggle against the best ARM chips from Apple and Qualcomm. Intel's Core processors need lots of wattage to perform well, and while they are capable enough at around 5W, being bested by glorified phone SoCs these days means that ULV is not where Intel's Core architecture is best suited. Intel also took its own swing at ULV with the Atom architecture, but we all know where that ended up.
Intel simply can't compete with ARM in the ULV realm. And now ARM is encroaching on the high performance mobile realm as well, with the likes of Apple's A series and Qualcomm's 800 series (now morphing into the 8000 series, with separate SKUs for Windows-on-ARM laptops and 2-in-1 devices). While Intel struggles to keep 3 billion transistors down to 45W of power consumption, Apple and Qualcomm can get their 5 to 7 billion transistors working fine at less than 15W. For now Intel has a performance edge above 45W, but that's not where Apple and the rest of the industry want to be for mobile designs. They keep building thinner and lighter devices that can't handle a top Intel SoC's cooling requirements. Look at the likes of the Dell XPS, Lenovo X1 Extreme, and MacBook Pro: even in a 15.6" footprint, they can't cool the hexa-core Core i7s and i9s well enough to keep them from throttling. So you end up paying for an SoC whose full performance you can never use. It's no wonder Apple is switching their mobile Macs to A series SoCs by 2020. Apple values thin and light above all else, and Intel won't help them get there.

But as this chase for thinness and lightness intensifies, leaving Intel in a precarious situation, I'm worried for a different reason. I am a hardware guy, and as a hardware guy, I like to tinker and fix. But I'm also lazy, so I'm not going to learn to solder under a microscope just to tinker and fix. And as ODMs pursue thinner and lighter, the days of SO-DIMMs and M.2/SATA storage also seem numbered. Already, many 13.3" and smaller laptops and 2-in-1s have soldered-on RAM. Even some 15" laptops have soldered RAM (I'm looking at you, Samsung Notebook 9 Pro). And as the likes of Apple set the trend for device designs, even the SSD will be soldered to the board. I will make a very safe assumption that at the board level, the 2020 13.3" MacBook Pro will look indistinguishable from the 2020 iPad Pro.
The only distinction will be the chassis design and the physical UI (mouse and keyboard vs. touchscreen and Apple Pencil). So as the Idiocracy of the hardware world comes closer to reality, in say 15 years, only the weirdos and super geeks will be using devices that can be tinkered with and modified (i.e. desktops). You will buy your mobile devices from the factory as-is, and that's how they will stay for the rest of their lives. The laptop form factor will probably have merged with the phone by then, becoming a disposable commodity. Perhaps by then, all the real computing will be done in the megacorps' server farms in the cloud, and we will just use our devices to borrow their computing power, streaming whatever information they deem OK for us to use. Welcome to Costco, I love you.