This Verge writer gets so many things wrong it makes me want to throw something at my screen.
This is the first and perhaps most egregious…
"In any case, that’s not what Apple’s doing here — this crop to 2x uses all 12 megapixels at the center of the sensor, so you’re getting a full-resolution image, just without the pixel binning tricks that are available when the full width of the 48-megapixel sensor is used."
So, to start: a center crop from a 48 MP sensor is inherently not a “full-resolution image”; it’s a reduced-megapixel capture from a section of a larger sensor.
Second, she has no clue what “pixel binning” is. For one thing, it’s an abused term that isn’t technically correct. For another, combining pixel data for various reasons, such as improving low-light capture, has been around since the dawn of digital photography.
And in fact, there was an analog equivalent in the film days, albeit a very clunky one: multiple exposures and/or stacking negatives when making a print. Something Ansel Adams was a master of.
My point is that this ■■■■ info from a presumed “expert” leads less technically savvy people to all kinds of incorrect conclusions, such as “pixel binning is bad”, though I don’t even like to perpetuate the term.
OK, end of rant, but in my defense, it’s been a while since I’ve done so.
It’s getting to where the tech press is as generally clueless as the regular press… and you are so right, there is no way for the public (laymen like myself) to know the difference. I also see it as a total lack of editorial supervision: upload, someone runs spell/grammar check (at best), and push the publish button. Editors and publishers like Bill Machrone and David Bunnell (PC Magazine) have to be rolling over in their graves…
Interesting you should bring up Machrone. I was at a press event years ago when Intel released the first Core i chips. During the Q&A, multiple press members kept asking inane and obviously misinformed questions about how Turbo Boost and multithreading worked.
At one point Machrone stood up, directed his ire at someone from another tech magazine, and in essence said, “Sit the f&&& down and let someone who actually has a clue about what they are talking about ask a relevant question.”
I wasn’t aware of that. Perhaps she left before the latest incarnations of it in smartphones, which apply a lot more AI-derived intelligence to it.
As to the GM1, an outstanding camera, especially for its time, IMHO. My semi-pro photographer brother-in-law used his for years. It was exceptionally well suited to wedding photography because it was quiet and small and produced great JPEGs on default settings.
I guess I would say it isn’t necessarily the best choice of words, but not totally wrong in the first example. For instance, I took “full-resolution” to mean that it isn’t an upscale of a lower-resolution image, not that the image covers the full width and height of the camera sensor. Likewise, the term “pixel binning” is antiquated and therefore inaccurate. There is a reason Apple makes no mention of it anywhere on their website: they don’t use it. It is AI-based, yet the tech press insists on applying other familiar and incorrectly overused technology terms to explain it.
Moreover, when cropping an image obtained from a bigger sensor, you get a tighter field of view (more apparent zoom) than when pixel binning the full-resolution image, which keeps the whole sensor’s field of view. So the two operations are not equivalent.
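To put numbers on that difference, here is a back-of-the-envelope sketch with made-up sensor and lens dimensions (not Apple’s actual specs), using the standard pinhole field-of-view formula:

```python
import math

def horizontal_fov_deg(width_mm, focal_mm):
    # Standard pinhole field-of-view formula: 2 * atan(w / 2f).
    return math.degrees(2 * math.atan(width_mm / (2 * focal_mm)))

sensor_width_mm = 8.0   # hypothetical full sensor width
focal_length_mm = 6.0   # hypothetical lens focal length

# Binning the full 48 MP readout uses the whole sensor width: full FOV.
fov_binned = horizontal_fov_deg(sensor_width_mm, focal_length_mm)

# A center crop to the middle 12 MP uses only half the sensor width:
# a narrower FOV, i.e. the 2x "zoom" that the binned image doesn't have.
fov_cropped = horizontal_fov_deg(sensor_width_mm / 2, focal_length_mm)

print(round(fov_binned, 1), round(fov_cropped, 1))  # 67.4 36.9
```

Same output pixel count in both cases, but the cropped frame sees a much narrower slice of the scene, which is exactly why the two are not interchangeable.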
The writer is really taking a shot at Intel, but without the actual “facts” to back it up.
If the writer did some basic research, they would find out for instance that smaller OLED panels are actually more expensive to produce than larger ones (up to a point).
And if you read it, you also find out that the 14-inch has Thunderbolt whereas the AMD just has USB. Yes, that does add cost, but it also adds functionality, and there are standalone TB controllers available to OEMs if they choose to include them.
TLDR: Even though they are both Acer Swifts, they are different computers intended for different buyers, full stop. So saying the cost difference is due to the “Intel Tax” is misleading and irresponsible IMHO.
Not to be pedantic or fastidious because everything else you said makes sense here, but they are comparing an OLED display-equipped Swift Edge with an IPS display-equipped Swift 5. A 16-inch OLED panel costs significantly less than a 14-inch IPS panel. That’s their point: you get a larger display that’s OLED on top of it compared to the smaller one that’s just IPS on the Swift 5.
Unless you have a source that I don’t (in my case: Samsung, Innolux, and LG), a 13–14 inch OLED panel is almost 50% more in raw cost than a comparable IPS, and the sole source I can find for a mini-LED essentially splits the difference between the two.
And I will leave the discussion about whether OLED or IPS is superior to another thread. They both have significant upsides and downsides IMHO.
EDIT: my points were that the author wasn’t making an apples-to-apples comparison, as the two systems most certainly have different target markets, and thus it’s a false comparison used to take a pot shot at Intel.
And BTW: we have tested that raw AMD chipset and it’s a competent but not outstanding performer. OTOH OEMs seem to discount AMD systems more than Intel.
I agree! I think we are both on the same page as far as smaller OLED panels costing more than larger ones. The point they were making is you get a larger panel with a costlier and (in some respects, anyway) better underlying technology (OLED, instead of IPS) with Swift Edge than the other Swift 5, which does make sense. As far as the silly “Intel tax” which I am totally on board with you on, there are other reasons such as the Thunderbolt technology costing significantly more in R&D and BOM. For example, that’s why just to get a Thunderbolt 3.0 (or USB 4.0)-equipped AMD Ryzen 7000 motherboard, you have to pay significantly more. You have to step up from a $300 motherboard to a $600 one with the same flagship chipset just to get USB 4.0/Thunderbolt 3.0. Why? There is just so much more testing involved and additional supporting circuitry required even if the processor itself has built-in USB 4.0/Thunderbolt support on the CPU package. Like you say, this isn’t Intel tax. It’s premium feature set tax. There’s no free lunch here when adding Thunderbolt support. That’s the utter stupidity of Ars Technica’s article.
Yup, more misinformed FUD. With the release of Windows 11 and its vastly improved x86 emulation (not to mention 64-bit support), we have not come across any reasonably modern app that we couldn’t run on the Pro X.
And I think I’ve mentioned this at least once before in another thread, but MS has been steadily working on improving x86 emulation, with changes in nearly every cumulative update.
In fact, our message to customers has been: if it’s been more than three months or so since you tried a specific app, you might want to try it again on a fully updated system.
No excuse, an Engadget writer should know better, but I suspect the legacy of Windows RT is to blame. It left quite an impression—of the wrong kind—and Microsoft hasn’t gotten the message out that things have changed with WOA for the better. I wouldn’t know better if I wasn’t an obsessed gadget geek who keeps up on these things.
Eh, Engadget has never been known for accuracy and detail. I’ve only ever gone there because it’s still one of the only sites reliable enough to gather gadget and technology news in a timely manner.
The ‘better’ writers had a spat* years ago and left to form The Verge and Vox. Now Engadget is mostly press release rehashes, which honestly is fine for me.
*It was rather amusing how bad the comments section got, to the extent they just disabled it.
Edit: And looking back it was the time that the entire front page (and even second page if I recall correctly) was iPad (or Apple) articles. The editor who shall not be named did not take the criticism well.