This is an interesting problem from a personal perspective. My wife loves her LG Gram 17 with Windows 10, and DETESTS changes (“Where the H*** did this or that go?” “Why can’t I still do it this way?” “Why did they change the right click menu?” ad infinitum), and frankly I have to agree with her. I’ve gotten used to Win11, but to me it is NO GREAT IMPROVEMENT over Win10. To her, moving to Win11 is as bad as moving to a Mac.
One other thing. In some ways this was inevitable, for a couple of reasons. First is the abysmal approach to security and reliability that ALL tech companies have had virtually since the inception of the PC.
In other words, the “move fast and break things” mentality writ large.
The second is something all industries eventually face, though the PC industry perhaps avoided it longer than most: the “good enough” conundrum.
In other words, if you take something like a dishwasher, consumers are long past buying a new one because it does something new or better than what they have. They are buying a new one to replace a no longer functioning one.
The PC market has long relied on the idea that a PC was never fast enough or capable enough to do what was needed/wanted, and that this would drive new sales. But for many, perhaps even most, that’s no longer the case; it’s only obsessives like us who feel compelled to keep chasing the new thing…
BINGO! And I am one of the worst offenders (MORE POWER).
In fact, I dare ALL OF US to look back over the last 40 years of tech articles reviewing new processor hardware - how often do you see the phrase
“It is good enough for web browsing, email, productivity and light gaming…”
as each new generation of chips emerges. We have moved from the 80x86 to the i9/M2 Max in that time frame, yet we still use this lowball benchmark, which covers what I would bet 90% of users actually need. Sure, software has bloated tremendously over that same time, but it still delivers the same functionality.
@Desertlap is right - we may have already surpassed the good enough toaster/washer/fridge point where we just need third party help to swat away the hackers…
As a person who is very sad that his Samsung Galaxy Book 12 will no longer boot, because its drive completely filled up with Microsoft’s attempts to update it beyond Windows build 1703, the solution here should be obvious:
separate the GUI and underlying OS
The underlying OS always gets updated out of security concerns, the GUI only gets updated when the user chooses to.
Microsoft only has to test/maintain the current OS layer and as many GUI layers as they are willing to create, which should be manageable.
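The proposed policy could be sketched like this, purely as illustration; the class and field names here are hypothetical, not anything Microsoft actually ships:

```python
# Sketch of the proposed split: the OS layer always takes security
# updates, while the GUI layer stays pinned until the user opts in.
from dataclasses import dataclass


@dataclass
class Layer:
    name: str
    version: int


class System:
    def __init__(self, os_version: int, gui_version: int):
        self.os = Layer("os", os_version)
        self.gui = Layer("gui", gui_version)

    def apply_updates(self, latest_os: int, latest_gui: int,
                      user_opts_in: bool) -> None:
        # Security updates to the underlying OS are mandatory.
        self.os.version = max(self.os.version, latest_os)
        # The GUI only moves forward when the user explicitly chooses.
        if user_opts_in:
            self.gui.version = max(self.gui.version, latest_gui)


box = System(os_version=10, gui_version=10)
box.apply_updates(latest_os=11, latest_gui=11, user_opts_in=False)
print(box.os.version, box.gui.version)  # OS moves to 11, GUI stays at 10
```

The point of the toy model is just that the two layers have independent version counters and independent update rules, so Microsoft could keep patching the base while users keep the desktop they know.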
It isn’t a concept so much as a functional effect of the design. It’s not without its downsides, though: there is a performance penalty, as well as inconsistency in app support. GIMP, for instance, has look, feature, and performance variations between distros such as Mint versus Ubuntu.
@Bloodycape is also correct that Android does it as well (though it wasn’t fully decoupled until Android 9). Interestingly, they did it initially to better support different chipsets (Exynos, Snapdragon, MediaTek), but it also had the downstream effect of making security patches etc. easier to deploy.
It would be orders of magnitude harder to do on Windows, due to DLLs, shared frameworks, and device drivers, and because Windows, for better or worse, is a one-stop shop for all the resources an app might need, one that MS fully uses for its own apps and services.
E.g., it’s why uninstalling/disabling Edge breaks the online help, and in some cases the display rendering, in many apps.
PS: the actual genesis of this on Linux was that Torvalds wanted Linux to be more secure, as part of plans to sell it to various governments and large businesses that needed higher than usual levels of security (an early customer was Goldman Sachs, for instance).
This also replicates long-standing practice in mini and mainframe computing solutions from IBM and, though they are long gone, DEC, who more or less pioneered it as a competitive cudgel against IBM. I was part of multiple demos where DEC would deliberately crash multiple apps on a system and the rest would continue on.
IBM’s most popular mini system at the time, OTOH, could be brought to a grinding halt by crashing their HR or payroll systems.
Microsoft IS separating the UI and underlying OS already (they’re also rewriting parts of the kernel in Rust). But they can’t even maintain a single UI consistently; there’s no chance they’d want to do multiple UIs.
It also doesn’t serve investors’ profits to let people linger on an older version that’s maintained with security updates, so there’s no chance Nadella would let that happen.
Interesting analogy, but after considering for a bit I don’t actually think the PC will ever hit the level of transparency of ‘household appliance’.
Consumer TVs have been around for about a hundred years, and while they are very close to appliance status, you still have a sizeable market upgrading for new and shiny features; we are visual monkeys, after all.
Computers are TVs, plus typewriters, plus calculators, plus telecommunicators, plus game consoles…plus whatever new tech platform gets invented. In short, they are constantly re-invigorated in the consumer mindset.
That’s probably why there are whole groups of people who’ve devoted their digital lives to a single niche of slab devices, that only they and maybe one other kid at school used.
I think you are falling into the “me” mindset. I was speaking of the larger and broader market, and support for my supposition is found in myriad ways, such as MS’s difficulties in getting people off Windows 10, or even Windows 7.
OTOH, the key difference that makes a PC not like a dishwasher is that it is a far more multipurpose device, and data I’ve seen shows that, especially in retail, a new use/need is what drives an upgrade cycle, such as buying a new digital camera.
And of course, we here are always looking for greater/better functionality and we share that with gamers.
But IMHO, for the majority of people who need to do online banking, stream Netflix, write the occasional book report, or track household expenses, the pace of genuinely noticeable improvements has slowed significantly.
E.g., many (most) have no need/desire to spend the money on a device that opens their household budget spreadsheet 1.5 seconds faster.
There is also strong data that a need/desire for additional functionality is more often expressed by a user buying a tablet or laptop to supplement, not replace, their desktop/laptop.
TL;DR: my analogy is quite valid for many users, in that PCs are in essence an appliance for performing tasks, albeit a multipurpose one rather than a single-function one like a dishwasher.