Intel News

Because what we need/want are hotter mobile processors… :rofl:

Though to be fair, they are likely to be quite fast, which is great for a gaming-oriented laptop.

1 Like

Except that games are most often bottlenecked by the GPU, not the CPU.

1 Like

Generally true, but Civilization 6 and Stellaris, both of which my son and I play, pound the CPU as well.

1 Like

Yep, that’s why I put “most” in there. Strategy games are exceptions. Then again, does anyone really need Intel desktop 12th gen CPUs to play Stellaris? I somehow do fine with an 11th gen mobile H35 on my SLS. But maybe I’m not a hardcore Stellaris player. I don’t mind waiting a little bit longer between turns. :smile_cat:

I’m fine with my 11th gen Core i5 with Stellaris, but my son with his 11th gen Core i7 complains…

The other thing that should help a bit is improved PCIe and revamped caching in the 12th gen, which can speed up dedicated GPUs as well due to improved transfer speeds.

Of course, we are talking about a handful of FPS, which I can’t see, but to which my son certainly seems to be sensitive.

Regardless, we are long past the days when a next-generation chip provided a major boost over the previous one. And I think that Apple is going to experience that as well with the M2 and beyond.

PS: One of the reasons we play Civ is that his much younger, faster reflexes are of no benefit to him, unlike, say, COD.

1 Like

From what I have read, nobody who knows anything is expecting more than an efficiency bump for the M2, with an Intel-like generational increase in performance.

And I totally understand about the reflexes! My nephew destroyed me back when I dared to play Nintendo with him many years ago and I haven’t been so foolish since.

3 Likes

The 12th gen “U” SoC is starting to show up in retail devices. Early indications are not impressive. As NotebookCheck remarks, on the bright side it means you can get an excellent deal on an 11th gen device without missing much.

1 Like

How does power efficiency look compared to the 11th gen?

It looks like they’re still in the middle of their review on that particular device. No power results yet, at least nothing they want to publish with that initial article.

Yeah, I’m interested as well, as we haven’t formally tested any 12th gen systems yet. In theory, 12th gen is supposed to be more efficient than 11th gen at any given TFLOP performance level. What we’ve heard is that the advantage goes away as you hit the upper end of the curve, and the heat that has been talked about comes along with that as well.

1 Like

So any analyst that didn’t see this coming probably shouldn’t be an analyst, IMHO. The pandemic and all its additional impacts created a “bubble” for technology across the board as both companies and individuals adapted to remote work.

OTOH, this is likely yet another reason that MS decided to start releasing “new” versions of Windows, as that’s always been a sure-fire way for the OEMs to goose sales.

Intel Takes a Hit As Consumers Stop Buying PCs Amid Downturn | PCMag

1 Like

I missed this as well, until I saw this:

Intel Kills Optane Memory Business Entirely, Pays $559 Million to Exit | Tom’s Hardware (tomshardware.com)

This is a genuine loss IMHO, as the tech is/was really innovative. Of course, Intel seems to have done their level best to sabotage themselves in the market.

Maybe Samsung or Micron will pick up the pieces and do it justice.

2 Likes

So we contacted Intel about a couple of specific items regarding Optane, and apparently even many Intel folks didn’t know that it was axed :frowning:

The biggest disappointment around this for me was the possible loss of one of the original design goals of Optane, which was a single pool of memory/storage that could be dynamically configured and managed by the OS depending on the needs at the time.

In other words, imagine a 1 TB total device where 8, 16, 32, 64, or 128 GB, etc. could be used as RAM at any one time.

Additionally, Optane held the promise of REAL instant sleep/hibernate functionality.

We’ve seen tech demos of this (running Linux, alas) but have never seen an actual production system.
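
For the curious, the byte-addressable half of that idea is easy to sketch in software today. Here’s a minimal example, assuming a Linux box with a persistent-memory region exposed through a DAX-mounted filesystem (the /mnt/pmem/pool path and file name are hypothetical); a plain mmap() gives the application direct load/store access to the device, with no block layer in between:

```c
/*
 * A minimal sketch, not Intel's implementation: map a persistent-memory
 * backed file into the address space so ordinary CPU loads/stores hit
 * the storage device directly -- the byte-addressable building block
 * behind the unified memory/storage idea. Assumes Linux and a
 * DAX-mounted filesystem; the /mnt/pmem/pool path is hypothetical.
 *
 * Build (Linux): cc pmem_sketch.c -o pmem_sketch
 */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void) {
    const char *path = "/mnt/pmem/pool";   /* hypothetical DAX-backed file */
    size_t len = 64UL << 20;               /* treat 64 MB of "storage" as memory */

    int fd = open(path, O_RDWR | O_CREAT, 0600);
    if (fd < 0) { perror("open"); return 1; }
    if (ftruncate(fd, (off_t)len) != 0) {  /* make sure the file is big enough */
        perror("ftruncate"); close(fd); return 1;
    }

    void *base = mmap(NULL, len, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    if (base == MAP_FAILED) { perror("mmap"); close(fd); return 1; }

    /* Ordinary stores now land on the persistent device directly. */
    strcpy((char *)base, "hello from byte-addressable storage");

    msync(base, len, MS_SYNC);             /* flush writes to media */
    munmap(base, len);
    close(fd);
    return 0;
}
```

As far as I know, the dynamic repartitioning described above (shifting capacity between RAM and storage on the fly) is the part that never shipped broadly; what Optane persistent memory actually offered was this kind of direct mapping (App Direct) plus a Memory Mode where the platform treats the whole module as plain RAM.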

3 Likes

Maybe Apple will buy the IP and release it as Magic RAM? :wink:

3 Likes

That’s not as much of an “out there” idea as it would appear at first glance. It would make a ton of sense for a future iPad Pro revision, where it would put the whole RAM issue to bed and might actually encourage more powerful iPad “pro” apps.

Not to mention that Apple already purchased a big chunk of Intel’s modem business previously, so it’s not unprecedented. OTOH, apparently that hasn’t gone all that well, as they have yet to release a 5G modem of their own.

2 Likes

That thought was in my head as I wrote, and though it was in jest, I thought it would be nice if it actually came true, unlikely as it seemed.

With today marking yet another of the multiple times so far this year that AMD has surpassed Intel in market capitalization, here is an excellent summary from Reddit of the current Intel situation that provides some valuable perspective on the internal and external factors all playing into it:

It all makes sense. Intel’s Sapphire Rapids Xeon was meant to ship in Q4’21; instead it will ship in Q2’23, assuming they aren’t forced to do another respin of the silicon (which adds another 3 months). It’s using the same P-core as in Alder Lake, so clearly it’s not the core itself that’s the problem. Intel built itself on Xeons; it’s a core product. So if it’s going to be 1.5 years late and the rumor that it’s on its 12th stepping is true, then that is a very, very bad sign.

Intel also recently announced it was officially giving up on Optane, and while people saw that coming for years, it’s still a major write-down. Before that, Intel had already announced it sold off its NAND fab + SSD business to SK Hynix, which completes in 2025. Apple, formerly a customer of Intel’s highest-margin parts, is also still busy divesting itself of Intel chips. QNAP has been playing around with AMD chips for a very long time, but after the defective Atom bug bit Synology hard, they are now adopting AMD chips in products too.

Then there’s the string of defective silicon issues. Intel’s I225-V chips were the first 2.5 Gb/s consumer NICs, used in everything from NUCs to even AMD motherboards. The first two revisions were defective and required RMAs; the third revision, over a year later, seems to work. There was the C2000 Atom bug before that. The Puma 6 chipset bug before that. The P67 chipset SATA 2 controller failures before that.

That’s not even getting into the corners Intel cut such as the LGA1700 bowing problem just to make the socket BOM cheaper, or the brute-forced Rocket Lake generation that was a perf regression in many workloads and by all accounts should never have been launched.

Intel has had issues delivering working networking hardware, can’t get its core Xeon chip out the door, leaving AMD free to run amok, and today GN revealed just how bad Intel’s Arc graphics drivers are for simple, basic usability. If the things I read on here about Intel having laid off a pre-silicon validation team under Krzanich are true (I already knew Intel had reduced pre-fab validation testing going into 10nm, but not the firing part), then Intel severely undermined its own internal processes for success just when it needed them the most. Intel can’t just flip a switch to put them back in place, even with an engineer as CEO again. Expertise is hard to replace, and I’d bet many didn’t go back.

Intel just posted its first net loss in over 30 years. Intel even just reduced its fab expansion plans by $4 billion, simply so it could pay $1.5 billion in increased dividends to shareholders for Q2. It may take years before Intel resolves its internal systemic issues at the rate it’s going.

1 Like

Greed often comes back to bite you eventually.

It seems Intel hasn’t even realised that, though, judging by the last paragraph.

So this is an interesting approach to attempting to explain why your shiny new thing doesn’t perform up to expectations: call it old tech…

As if that’s ever been a successful strategy :laughing:

Intel Warns Older Games May Take Performance Hit With Arc GPUs | PCMag

OTOH, the number one complaint we hear from our developer partners is the pain of legacy support…

1 Like

Meanwhile, AMD, literally out of nowhere, has fixed OpenGL support on Windows, and the drivers there now perform better in OpenGL than they do on Linux. I cannot overstate how unexpected and mind-blowing this is. For years, people have laughed at how bad an experience OpenGL was on AMD graphics. If you used OpenGL for one reason or another (such as Minecraft or video game emulators), you were forced to either (1) buy an NVIDIA GPU (this was a primary consideration for me in purchasing my RTX 3090) or (2) install Linux, because the performance and compliance were so bad on Windows.

Since OpenGL is an old and largely abandoned API in gaming, no one thought AMD would ever invest in fixing support this late in the game. Yet they did. This is the polar opposite of how Intel is now approaching their emergence into the dedicated GPU market, and it may very well lead to their downfall.
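
If you want to verify a change like this on your own machine, querying the context strings before and after a driver update is all it takes. Here’s a minimal sketch, assuming GLFW 3 is available (the file and program names are mine, not from any vendor tool; any other way of creating a GL context works the same):

```c
/*
 * A minimal sketch for checking what your installed driver reports:
 * create a hidden window just to get an OpenGL context, then print
 * the vendor, renderer, and version strings the driver exposes.
 *
 * Build (Linux): cc gl_info.c -lglfw -lGL -o gl_info
 */
#include <stdio.h>
#include <GLFW/glfw3.h>

int main(void) {
    if (!glfwInit()) return 1;

    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);   /* no on-screen window needed */
    GLFWwindow *win = glfwCreateWindow(64, 64, "gl-info", NULL, NULL);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);

    /* The version string is where a driver's OpenGL level shows up. */
    printf("Vendor:   %s\n", (const char *)glGetString(GL_VENDOR));
    printf("Renderer: %s\n", (const char *)glGetString(GL_RENDERER));
    printf("Version:  %s\n", (const char *)glGetString(GL_VERSION));

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}
```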

6 Likes