Mac Studio (and Pro)

This is probably totally irrelevant to most on this forum since this is a heavily mobile-computing-focused community, but this is a total nope for me as a top-tier desktop computing option from Apple.

It’s cute, but it’s $2,000 for no dGPU, a 512GB SSD, and very few customization options.

So much nope… If I’d allowed myself to get sucked in back when Apple made real (but expensive) desktops, I’d be pretty irked. I’m definitely sticking with Windows+WSL and Linux.


But it IS cute, and magical, once you connect it to a Magic Keyboard ($199), Magic Mouse ($99), and 27" Studio Display with tilt, swivel, and height adjustable stand ($2299).

What’s your problem @Mesosphere?


Yes, but two things. Technically it is not their top-tier desktop; that would be the Mac Pro, though it’s arguable how desktop-friendly the Pro really is given its size.

And also, in some benchmarks the graphics cores are right up there with Nvidia’s 4060 and 4070 series chips.

The big issue is lack of software (games) optimized/written for those cores. I will concede that the user not being able to swap in their own graphics cards is a hard no for some.

I have the Studio, and it’s by far the best PC I’ve owned. In the things I do with my personal system, which is lots of photo/video/audio editing, it beats the pants off my son’s 13th-gen Core i7 desktop with 32GB RAM and an Nvidia 4070 Ti.

Horses for courses…


I didn’t spend enough time searching it appears. Thanks for the correction. The articles I came across failed to mention that one as an option for some reason.

Still, $7000…

Your son and I have similar tastes, it seems =) I have the 13th-gen i7 (13700K) and the 4070 Ti as well (but 128GB of RAM). I definitely game on my PC, but not having a CUDA-capable GPU is also a deal breaker (otherwise my GPU would probably be AMD).


I spent even higher prices on my mouse and keyboard, so I probably shouldn’t throw stones there =P

Fascinating discussion guys, but do you mind if we split this off into a Mac Studio thread? We don’t have one as yet, and actually the comparison in graphics performance to Nvidia has relevance for a lot of users here looking to switch hardware (I’m one of them :wink: ).


In my brain this went like, “A whole bunch of my old friends I knew as Windows die-hards are switching from Windows to macOS. Should I give it another thought? … Nope.”

So, that’s why I posted it in a Windows thread. Still, if we don’t have a Mac Studio thread, there probably should be one if others are interested.


Just wanted to mention, the sales for the high-end M2 chips are just starting:

Deals: Take Up to $300 Off 2023 Mac Studio, Available From $1,799 - MacRumors

To reference some of the benchmarks that @Desertlap was referring to:

(Tom’s Hardware)

Turning to a different comparison, the new Apple M2 Ultra’s 220,000 Geekbench 6 Compute scores (Metal) sit between the GeForce RTX 4070 Ti (208,340 OpenCL) and RTX 4080 (245,706 OpenCL). For a direct Geekbench 6 OpenCL comparison, the Apple M2 Ultra Open CL scores of about 155,000 are much closer to PC GPUs like the Nvidia RTX A5000 and AMD Radeon RX 6800 XT.

So the M2 Ultra is quite a capable compute-focused chip; however, it will set you back $3,700 (on sale). If you want the much cheaper M2 Max version ($1,800 on sale), you are looking at around a ~30% drop in raw performance.
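To put those sale prices against the Geekbench Compute numbers quoted above, here’s a rough points-per-dollar sketch. Note the M2 Max score is an assumption back-derived from the ~30% drop mentioned, not a measured figure:

```python
# Rough value comparison using the Geekbench 6 Compute (Metal) figures
# quoted from Tom's Hardware. The M2 Max score is ASSUMED from the
# ~30% performance drop, not a published benchmark result.

m2_ultra_score = 220_000              # Geekbench 6 Compute (Metal)
m2_ultra_price = 3_700                # sale price, USD

m2_max_score = m2_ultra_score * 0.70  # assumed ~30% lower than the Ultra
m2_max_price = 1_800                  # sale price, USD

ultra_value = m2_ultra_score / m2_ultra_price  # benchmark points per dollar
max_value = m2_max_score / m2_max_price

print(f"M2 Ultra: {ultra_value:.1f} pts/$")
print(f"M2 Max:   {max_value:.1f} pts/$")
```

On these (admittedly rough) numbers, the Max comes out well ahead on points per dollar; the Ultra premium buys absolute performance, not value.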

So it’s quite a gap, both in performance and cost. I’m still considering “the big switch” myself on the desktop side, but hearing the positive experiences of @Dellaster (Mac Mini Pro) for gaming and @Desertlap (on the M2 Ultra Mac Studio?) for pro use, I’m now more tempted than ever.


I would put an asterisk on this. This situation only applies when comparing against CPU software encoding (unfair, since PCs also have hardware encoding options that can outstrip Apple) and/or when you run into the three-stream encode limit on NVIDIA’s consumer cards (easily bypassed). You can enable Quadro-level encoding on any consumer GeForce card using this unlocking tool from GitHub. Once the codec limit is handled, NVIDIA handily takes the lead in encoding across the board. Plus NVENC offers significantly higher measured objective compression quality than Apple (e.g., in VMAF, the Apple M-series encoder scores in the low 80s whereas NVIDIA’s hardware NVENC encoder scores in the high 90s).


Gotta agree with @Desertlap on this one - out of the box experience is what matters, and if you have to use a third-party tool to unlock device performance (it was locked for a reason, right?) then it is not a one-to-one comparison (just couldn’t pull off the apples-to-Apples cliché).

That is interesting. When I was testing an M1 MBA 16/512 vs an Asus Zephyrus G14 AMD Ryzen 7 5600HS with Nvidia 3060 16/512 (which I later upgraded to 1TB because I can), the Zephyrus was better at virtually everything except heat, fans (obviously), and battery life.

At gaming? It wasn’t even close. I sold the MacBook Air for around $400 more than I paid for the Zephyrus. (And I can use my smart card to Azure VD into work.)

Edited to add: and I want it to be different. Apple actually cares about user security. I have ADP enabled, and I want MS to follow–but instead we get even more intrusion by the OS.


If you are queasy about that, you could buy an NVIDIA Quadro (or Tesla, I assume; I don’t do much encoding) card and get that performance out of the box. I run Tesla GPUs in my Linux servers, but GeForce for my desktops.


Respectfully disagreed. Doing creative work on a device in excess of $4,000 means you should be a seasoned professional if you’re even toying with those price points; otherwise, you are overpaying… grossly. And if you are said seasoned professional who will use this device to its full potential, applying the aforementioned workaround is stupid simple compared to the editing and effects work you would be doing in a cut. But if for some odd reason that simple one-minute hack is confusing, you can pay the Quadro tax as @Mesosphere indicated.


:crazy_face: :vb_wavey:


Just curious guys, what is your target price range for a primary workstation? I’m assuming a mix of various compute tasks like CAD modeling/rendering, encoding, neural networks/AI (sorry, it’s the future :stuck_out_tongue: ), and of course, gaming on the side.

Mine is about 3,000 CAD, give or take depending on the market and the specific hardware components. That works out to about $2,200, which puts the M2 Ultra firmly out of my price range, but above the base M2 Max. I’m kind of loath to give Nvidia my money right now, which is why I’m pondering the Apple switch.


Well, that’s the thing. I don’t have that set of needs for a primary workstation; I have simpler ones. So I would say $1,600 would be my max price point. For that, I am way better off going with something that has a separate graphics card right there. I would still get something with a 3060 Ti (that’s what my desktop has) as it is fairly cheap, but it is going to kick the crap out of a Mac Mini LOL.

That said, it’s somewhat a matter of semantics. Very few people are so cross-platform that it doesn’t matter which side they are on. If you need a Mac, then you know that, and gaming has to take a bit of a back seat. If you need CUDA, then you are SOL with a Mac.

So it’s a matter of looking over everything and seeing what you can live with. For instance, I cannot get Azure Virtual Desktop on a Mac (even in virtual ARM Windows) to work with my smart card. That makes it a hard stop to go with a Mac as my primary workstation, as I need to connect to my work servers using AVD.

Now if it wasn’t for that, I would have a much tougher call, but due to gaming I would still side with Windows + external graphics card.


I would leave brand politics out of it. I get an earful of this on Reddit and it is the same answer here: until AMD and Intel get their respective acts together, NVIDIA has the freedom to charge whatever the market can bear, meaning whatever their customers will tolerate.

If you are contemplating anything AI, I would urge NVIDIA, full stop. Universities and research facilities use NVIDIA if they rely on GPU compute for their modeling, and therefore all the toolsets and libraries are built for NVIDIA. Alternate solutions involving Apple hardware will be arduous, hack-ridden, or even impossible without coding your own library from scratch.

Gaming? NVIDIA also, by a long shot. They have nearly 90% of the dedicated GPU market, which gives them a substantial leg up. AMD is also a good choice for gaming since they are heavily involved in the console side of the industry, but given your other needs, they are simply out of the question. The same applies to CAD as well; NVIDIA workstations are still used heavily by the bigwig VFX and animation studios like ILM and Pixar.

This is probably not what you want to hear, but your needs align very closely with NVIDIA as your top choice here.


It depends on whether you ask me that today or ~6 months ago. Typically, over the years I’ve been in the ~$800–1,500 range (US), upgrading every ~2–3 years. With my most recent build, though, I waited around 4 years, but I really went all out, tilting much more strongly towards bang in the bang/buck ratio and wasting some money on flashy things, which I wouldn’t have done in the past.

My Current Build:

Motherboard: MSI - MAG Z790 @ $270
CPU: Core i7 13700k @ $418
GPU: PNY GeForce RTX 4070 TI @ $820
RAM: 128 GB Corsair Vengeance RGB @$330
SSD: WD 2TB NVMe SN580 @ $116
HDD: 12TB 7200 RPM Seagate Exos X16 @ $125

AIO Water Cooler: Corsair - iCUE H150i (with LCD screen, LOL) @ $280
Power Supply @ $200
Case: Segotep Phoenix E-ATX Case with Tempered Glass Side @ $180

Total Tower: $2739

Middle Display: 42" 4K LG C2 OLED TV @ $899
Right Portrait Display: 27" 1440p HP 27mq @ $259

Keyboard: SteelSeries Apex Pro TKL Wireless @ $250
Mouse: Logitech G502 X Plus HERO @ $160

Total: $4307

This doesn’t include the existing peripherals and components that I carried over from my old box like my left side 27" 1440p monitor (IIRC, @dstrauss gave me the tip for that Korean market one many years ago), some more large spinning HDDs, speakers, umbilical cables and components that connect my workstation to my entertainment center, and probably some other things I’m forgetting.

I went all out this time around like I never have before. Still, it doesn’t look bad next to the top-end Mac Studio, or especially the Mac Pro, which STARTS at $7k.


That’s a very frugal budget. I should probably go lean too, but can you blame a man for splurging a little on his main rig?

Gaming is really only a minor consideration, but for that, GPTK/CrossOver+CXPatcher will cover pretty much all my modern gaming needs, and for the retro stuff (i.e. the good stuff :wink: ), OpenEmu is da bomb!

And I happen to be one of those users who ‘ambidextrously’ wields Mac and PC, much to my surprise haha. I thought the transition to Mac would be painful, but after about a year, I find I actually like having the app variety, different ways to slice the same problem.

But let’s say if it weren’t for the Azure smart card issue, do you think you’d go with M2 Max Studio, or the M2 Pro Mini within your budget?

Agreed. Well, I’m a customer, and I’ve decided not to tolerate it…at least for this gen :wink: . (Hey they say “vote with your wallet”, if not now at peak price gouging, then when would you actually “vote”?)

And while it’s true Nvidia has the most compute capability with CUDA acceleration, I’m no serious AI researcher, more of a hobbyist; the Apple silicon native builds of the common models should be sufficient for my dabbling.

But since Nvidia is your rec, going with an RTX card, what would your target build and price range be?

$200+ just for the keyboard; $400+ before we’ve even started on the build. You know you are dealing with a serious PC user! :stuck_out_tongue:

(Actually, I shouldn’t talk, I spent 200 CAD just on a fancy-shmancy mouse, the SwiftPoint Z. And I pre-ordered the second one too.)

Thanks for your build comments. Indeed, I can see why you went traditional DIY PC…but there’s always that Apple allure tempting me.