You could argue that this is very Apple-like, and it arguably is, but at the same time some of the biggest developers we work with say it’s way overdue. We’ve heard that as much as 35% of development testing/QA on Windows apps is for video chipset compatibility, versus mid single digits for macOS and Android, and almost nil for iOS.
Checking the Steam Hardware & Software Survey for June, we find that Intel integrated graphics older than Xe Graphics (which itself sits at 1.02%) total 6.37% of gaming usage, and every one of them, without exception, is on the rise. Same for AMD iGPUs. For comparison, the single most popular discrete card, the GTX 1060, is at 6.82% of gaming usage and falling. Draw your own conclusions.
Edit: I should mention that the percentages are shares of a 100% pie, so across-the-board gains by integrated graphics necessarily mean across-the-board losses for discrete graphics. In other words, people are increasingly gaming on iGPU systems rather than dGPUs.
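To make the zero-sum point concrete, here’s a minimal sketch (the numbers are illustrative placeholders, not real survey data beyond the two figures quoted above): since the shares must always sum to 100%, any gain in iGPU share mathematically forces an equal combined loss elsewhere.

```python
# Hypothetical GPU market shares as percentages of a 100% pie.
# "other dGPU" is a made-up remainder, not a real survey figure.
shares = {"iGPU": 6.37, "GTX 1060": 6.82, "other dGPU": 86.81}
assert abs(sum(shares.values()) - 100.0) < 1e-9  # shares sum to 100%

# If iGPU share rises by 1 point, the rest must shed 1 point in total
# for the pie to stay at 100% -- gains here are losses there.
shares["iGPU"] += 1.0
shares["other dGPU"] -= 1.0
assert abs(sum(shares.values()) - 100.0) < 1e-9  # still 100%
```

The invariant in the assertions is the whole argument: the survey reports relative shares, not absolute user counts, so a rising iGPU line and a falling dGPU line are two views of the same shift.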
True, and of course the Windows user base is orders of magnitude larger than macOS. I was just pointing out some of the things that developers share with us.
I would argue, though, that backwards compatibility is at least as often an albatross as it is a benefit, and the sheer size of the PC market is the reason many put up with it.
And I think there is an analog in the iOS-versus-Android market: it’s why, for instance, there are far more successful paid games on iOS than on Android, where only the very biggest tend to make it at all.
And BTW: the freemium model that thrives on Android has been far less successful on other platforms. I honestly don’t have a real way to gauge all the factors involved there, other than developers have shared that there is a significant reluctance to pay for an app on Android that they don’t see on iOS, and that Windows falls somewhere closer to iOS than Android.
I’m not really making a judgment here, just pointing it out. I don’t think the lack of driver updates is going to affect older integrated graphics much anyway, since the people using them are also playing older games, which aren’t being updated or otherwise changing, so new drivers aren’t needed.
My point, perhaps not clearly stated, was that the amount of resources dedicated to testing specifically for graphics card compatibility is quite significant, is for all intents and purposes unique to PCs, and is a “pain point” for many.
BTW: I do think it’s an effort-in, benefit-out scenario, though. We actually tested out an Xbox Series X for a custom application with a customer, but while we were doing it, I brought it home, and my son and I compared a handful of games that exist on both Xbox and PC.
My son has an upper-end graphics card (RTX 3070), and it was striking how much better a few of the games looked. Of course, the flip side is how “plug and play” an Xbox is compared to a PC…
OTOH, I also got an appreciation for how powerful this generation of consoles is when paired with optimized apps.
Depending on the game, developers are console-first, and the PC GUI and graphics suffer for it. That seems to be not as bad as it used to be (e.g. Star Wars: Knights of the Old Republic :shudder:), and if a game targets a recent console that came out in the last couple of years, it should be close to parity with PC capabilities, unlike consoles that are five or more years old. YMMV.
Things not looking good at Intel (not on the fab side, at least).
Optane dead. Hot and power-hungry CPUs. Rumours of serious consideration of cancelling Arc entirely, and you can’t really say it has even launched yet.
I have heard this as well. However, I would place the failure of their GPU development squarely on Raja Koduri’s doorstep. I still remember an exposé on Reddit about the man misusing funds for festivities and frivolities, and, when it came to leading the day-to-day business, falling well short of the mark, providing little expertise beyond surface-level industry platitudes to put on airs.
Oops, I think I meant to post that in the other thread, but the graphics part is still relevant, lol.
@Hifihedgehog Yeah, I think we discussed this in another thread, and of course you need to take rumours with caution, but Koduri left AMD acrimoniously, and very clearly an attempt was made to make his exit as quiet as possible. I wonder if Intel will keep him on? If not, does he have any bridges left?
If I have learned anything, it’s that executives can hop from one company to another unceremoniously, keep fooling company after company, and make a killing while doing it. I have seen it in the healthcare industry time and again: totally unqualified executives misrepresent their qualifications in the hiring process, dupe the powers that be, score their big bucks for a brief stint, and finally move on to the next salary-and-compensation trough after the damage is done. And once they are bored with executive work and being mindless talking heads, they turn to book writing to lecture folks with the same substance-less fluff.