This is such a bad idea for so many reasons. Apple may not be able to block this on macOS the way they did a few years ago with an iOS app that basically wrecked iPhone X displays.
At a minimum I expect Apple would deny warranty claims if they found the user had this installed.
I get that, in theory, some believe we should be able to do what we wish with our own devices, and I agree to a point, but Apple shouldn’t be on the hook for user stupidity. Unfortunately, this is also the kind of thing that makes things just a little more expensive for the rest of us.
And before anyone accuses me of being Chicken Little, I tried this on our lab MBPro 14. It works as described, and it looked amazing when I tried it out on our patio, which gets notably harsh sunlight.
However, I also saw that the display’s thermal limits were exceeded in less than 8 minutes with this app active, and it appears to somehow bypass Apple’s management software, which would normally shut the system down to protect it.
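For a rough sense of why a thermal limit gets hit in minutes rather than hours, here is a toy lumped-capacitance thermal model in Python. Every number in it (backlight power, thermal resistance, thermal mass, ambient, the 60 °C limit) is an illustrative assumption, not an Apple spec or a measurement of the actual panel:

```python
# Toy single-node RC thermal model of a display panel (illustrative only).
# Assumed values: r_th (K/W to ambient), c_th (J/K thermal mass), power levels.
import math

def panel_temp(t_s, power_w, ambient_c=35.0, r_th=2.0, c_th=150.0):
    """Temperature of a panel modeled as one RC node:
    T(t) = T_amb + P*R * (1 - exp(-t / (R*C)))."""
    tau = r_th * c_th
    return ambient_c + power_w * r_th * (1 - math.exp(-t_s / tau))

def time_to_limit(power_w, limit_c=60.0, ambient_c=35.0, r_th=2.0, c_th=150.0):
    """Solve T(t) = limit_c for t; returns None if the steady-state
    temperature never reaches the limit at this power level."""
    steady = ambient_c + power_w * r_th
    if steady <= limit_c:
        return None
    tau = r_th * c_th
    return -tau * math.log(1 - (limit_c - ambient_c) / (power_w * r_th))

print(time_to_limit(10.0))  # nominal brightness: None (never reaches 60 °C)
print(time_to_limit(20.0))  # doubled backlight power: ~294 s, i.e. minutes
```

The point of the sketch is just that doubling backlight power roughly doubles the steady-state temperature rise, so a panel that sat comfortably below its limit at normal brightness can cross it within minutes at double drive.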
But it’s out there for anyone dumb enough…
App lets you crank the new MacBook Pro’s brightness to over 1,000 nits | Ars Technica
BTW: This reminds me of a thing a few years ago with some Hondas/Acuras, where owners discovered that the car was electronically limiting the engine’s maximum RPM. Somebody found a hack to disable it, and of course dealers then started seeing blown engines.
NO FOLKS - JUST PLAIN NO! Think of the early days of overclocking and melted silicon.
STRIKE THREE - you are OUT, Ars Technica:
“According to the FAQ on the application’s website citing Apple documentation, using Vivid isn’t likely to pose any risk to your hardware. And its impact on performance is relatively small. However, running your laptop at twice the usual brightness all the time will unsurprisingly have a large negative impact on battery life.”
Remember our thread about crappy/sloppy journalism a week or so ago? Here goes the formerly best technical site QUOTING THE DEVELOPER’S FAQ on the safety of the developer’s own utility. What a crock of sh*t!
I’d be dumb enough to try that. But I’ve been clamoring for 1,000+ nit devices for a while. It would be the dream that was too good to be true.
However, since thermal limits are the issue, could external cooling methods make this remotely practical? Or at least prolong the time before the system fails and/or melts? Like if someone tried this outside on a very cold 20°F winter day, or with a whole bunch of fans blowing at it, or with the machine set on top of a (covered) ice block or something. Or perhaps something more extreme, like liquid-nitrogen cooling. Is jury-rigging a 1,000-nit screen really any worse than overclocking to 7 GHz?
@darkmagistric Given that the MB 14 and 16 are using new tech (mini-LED), my response is partly speculation. However, since there is still an LCD layer for the pixels, I’d expect two things, both of which could occur relatively quickly.
First would be the high likelihood that individual LEDs would be driven to rapid failure. LEDs have a limited safe brightness range, and all of the HDR setups in televisions we’ve seen so far achieve their higher brightness by essentially driving the backlights and/or OLEDs in specific areas of the display at absolute maximum brightness (or even beyond), but only for very brief periods.
In other words, there is a trade-off between brightness and longevity of the LED elements, and conventionally companies have kept those levels in the “safe” range. There is not 100% agreement as to what constitutes “safe”, though, with companies like LG (allegedly the OEM behind the MB Pro’s mini-LED panel) pushing it further than, say, Samsung. And these limits have shifted over time due to advances in LED tech, such as the “purity” of the main element.
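To make that trade-off concrete, here is a toy inverse-power-law lifetime model in Python. The exponent and the idea of treating rated drive as 1.0 are illustrative assumptions for the sake of the argument, not vendor data for any real mini-LED panel:

```python
# Rough sketch of the brightness/longevity trade-off for backlight LEDs.
# The inverse-power-law form and the exponent are illustrative assumptions.

def relative_lifetime(drive_fraction, exponent=2.0):
    """Toy model: lifetime scales as (1/drive)^n, where drive_fraction = 1.0
    is the rated 'safe' drive level and n is an assumed acceleration exponent."""
    return (1.0 / drive_fraction) ** exponent

print(relative_lifetime(1.0))  # rated drive -> 1.0 (baseline lifetime)
print(relative_lifetime(2.0))  # doubled drive -> 0.25 (a quarter of the lifetime)
```

With any model of this shape, running the LEDs at double their rated drive all the time, instead of in brief HDR bursts, eats through the element’s life several times faster.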
The second, and to my mind far more important, thing would be the thermal effects on the actual LCD part of the display. Above-spec heat will, over time, do a variety of things to the integrity of the elements, including slowing their ability to change state and reducing their relative contrast range. You can see these effects over time in, for example, older LCD watches, or even something like a Nintendo Game Boy, where a “new” one looks considerably brighter, has more contrast, and changes pixels faster.
In other words, thermal effects on LCDs are cumulative, and eventually the display itself can become inert. One analogy my former boss used was cooking rice: cook it long enough and at a high enough heat, and it becomes like cement.
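That “cumulative” point can be sketched with a Miner’s-rule-style damage model in Python. The reference life and the “life halves every 10 °C above reference” rule of thumb are illustrative assumptions, not measured panel data:

```python
# Toy cumulative-damage model of LCD thermal aging (Miner's-rule style).
# All constants here are illustrative assumptions, not panel measurements.

def damage_fraction(hours_at_temp, temp_c, ref_temp_c=40.0,
                    ref_life_h=50_000.0, halving_step_c=10.0):
    """Assume rated life halves for every `halving_step_c` above the reference
    temperature; return the fraction of total life consumed by this exposure."""
    life_h = ref_life_h / (2 ** ((temp_c - ref_temp_c) / halving_step_c))
    return hours_at_temp / life_h

# Damage accumulates across sessions; the panel is "cooked" when the sum hits 1.0.
total = damage_fraction(1000, 40.0) + damage_fraction(1000, 60.0)
print(total)  # roughly 0.10: the hot hours cost 4x as much life as the cool ones
```

The mechanism the model captures is that the damage never resets: every hour spent above spec permanently spends a slice of the display’s remaining life, which is exactly why the effect shows up years later in old watches and handhelds.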
As to possibly cooling it in some way: yes, I think it’s technically possible, but it would be very difficult, as the cooling right now is always on the back side of the backlight and thus would only be 50% effective at best. And putting anything on the front side would impact light transmission and thus defeat the purpose.
As to operating at a significantly lower ambient temperature: perhaps, but that would introduce other issues, such as being hard on the battery, or even on the user, as I think operating a laptop while wearing winter clothing would be beyond most users’ expectations.
A couple of other things while I’m on the topic of heat and LCDs.
I just saw a more accessible real-world example of what heat does to LCDs when I went to an ATM just now. This ATM sits directly facing the morning sun, and when I used it, the screen looked washed out and you could actually see the pixels change.
All of the above is why display manufacturers recommend using only as much screen brightness as you really need, so that the display will have a longer useful life.
Wonder why I’ve never suffered from this problem - oh, right, you’d probably need to keep a device longer than 12-18 months.