Y'all are doing the PC wrong.
Ever since computers shrank from ginormous mainframes to literal desktops (as in, boxes you are supposed to place on your desk), all the rage in the PC market has mainly been about performance.
It does not surprise me in the slightest. People love the fastest, the biggest, and all kinds of “the best-est” items. Naturally, then, the main property of a computer that most people seem to care about is performance. Performance above all else.
In this Tour de benchmark#1 we seem to have forgotten about a less obvious factor: power efficiency. And yet that is exactly what has kept everybody responsible for new designs awake at night:
Apple has now changed their CPU vendors twice, because the former ones could not deliver on the combination of thermal package, power draw, and performance. This has led to multiple meme products, such as the G4 Cube (which I still find a beautiful broken dream to this day), the “trash can” Mac Pro of 2013, the 12" Retina MacBook of 2015, and virtually all of their laptops until 2020. They weren’t necessarily the fault of Apple’s designers, but rather of the CPU vendors, who did not deliver on their roadmaps and promises.
Qualcomm had to re-release the same SoC (MSM8974, SD800/801) more than three times publicly; add to that the multiple internal iterations mentioned in Linux driver sources, and keep in mind that every iteration brought multiple SKUs (in other words, “speed bins”), all because they could not manage the heat output of their mobile (!) chips. And those chips weren’t even comparatively fast anymore by the time Qualcomm got their act together.
Qualcomm, again, totally messed up at least five more times: with the MSM8939 (SD615, an 8-core midrange room heater); the MSM8956/76 (SD650/652, a 6/8-core midrange room heater); the MSM8994 (SD810, codename: plutonium, very fitting considering it also required many iterations and even the last ones could still keep you warm in the winter); the MSM8992 (SD808, a cut-down-by-two-cores* MSM8994); the MSM8996 (SD820, a design so bad it couldn’t perform FP operations without an erratum, and yet so hot that even Qualcomm admitted it was a problem and re-released it as the MSM8996Pro/SD821); and the now-(in)famous SM8350 (SD888), which is essentially a tablet chip, providing mobile-phone-like performance at a laptop-like TDP, because “LOOK AT US, »WE HAVE CORTEX-X1!!!!«”
Samsung, with their consistently overheating Exynos chips that I’m honestly too lazy to even list out. With some exceptions, they have performed worse than their Snapdragon counterparts while putting out equal, or even more ridiculous, amounts of heat.
Nvidia, with their power-devouring GPUs… I can’t even begin to describe how much I loathe the fact that they dare to put GPUs that consume three-digit numbers of watts into laptops. Who in their right mind would want a jet engine consuming a nuclear-plant amount of power in a so-called “portable” machine? Nobody, I bet, but the lack of power-efficient competition leaves little to no choice. And don’t even get me started on their desktop counterparts: I realize that everybody likes seeing “more fasturr”, but there’s really no need to keep selling three-PCIe-slot-wide behemoths for N generations while completely ignoring the low-power-but-decent-performance market.
AMD, who have an ugly legacy of releasing enormously power-inefficient GPUs, got a bit better with their newer hardware, but they’re still not great. People still laugh at the R9 290X/390X, which served better as egg cookers than as gaming chips. But then they decided to turn the PC market upside down with Ryzen and, more specifically, Threadripper. It’s great on paper: many cores, much performance, very many such wow. But when it comes to reliability and heat output… yeah, it’s not great, especially if you try to use it as a compilation station that you want to keep anywhere near your desk. It will surely help keep your room warm in winter, but in summer it will slowly sous-vide you. Don’t get me wrong, it’s great that you can get so much performance for (eh, relatively) so little, but 64 cores won’t do you any good if they devour more watts than your mammoth GPU.
Intel… do I have to go on? They’ve been stuck on the same process node for how many years now? They’re slowly moving forward, but they’ve milked Haswell and Skylake for so long that they themselves are probably surprised.
Why aren’t there more power-efficient chips? Why does all the R&D seem to go into performance? Do we really want to make that tradeoff? Well, as of today, we don’t have much choice… The situation is slowly improving, but it will take many years to get anywhere near where it should have been by now.
And no, I don’t want my PC to be a huge, 25 kg mess of RGB. We have made so much progress that the fact that most of us still sport full- or mid-tower ATX cases should be a shame to the entire computing industry.
* along with some other minor differences, but they didn’t really matter considering that probably ~95% of the IP was reused