
Video: Nvidia support was abandoned in macOS Mojave, and here's why

Nvidia Titan Xp for a PCI-E Mac Pro, supported through High Sierra

Apple's macOS Mojave is a great software update for most users. But it isn't for those using an Nvidia graphics card in their Mac Pro or in an external GPU enclosure, so let's talk about why.

Editor's note: This is a complex topic, and AppleInsider's YouTube audience asked for a condensed version of the editorial written in January about the topic. We aren't going to rehash the entire editorial here, but the transcript for the video follows.

The beauty of the modular Mac Pro, up until 2012, was that you could swap out the graphics card to keep the machine current with the latest graphics rendering technology and performance. But those who opted for Nvidia cards are now stuck on old macOS software, and that can be infuriating.

macOS Mojave dropped support for new Nvidia graphics drivers, except for a couple of outdated chipsets included in still-supported Apple laptops.

In the past couple of years, external GPUs have been on the rise, helping Macs with otherwise low graphics performance get a boost for things like video rendering and gaming. For example, the 2018 Mac Mini has serious performance potential with a 6-core i7 processor that outperforms even the best CPU in the 2018 MacBook Pro.

However, its diminutive size means it doesn't house a dedicated graphics card, so those who need the graphics performance have to resort to an eGPU. And with Nvidia drivers not seeing support in macOS Mojave, those who already own Nvidia cards are out of luck.

So why is there no support for Nvidia drivers? What caused this and what can you do about it? We'll tell you what you can do in just a minute, but let's go back in time and see how Apple and Nvidia's relationship fell apart.

The first Mac to include an Nvidia graphics processor was released in 2001, but Apple was still primarily using chips made by ATI, the company that was eventually acquired by AMD in 2006.

In 2004, the Apple Cinema Display was delayed, reportedly because of Nvidia's inability to produce the required graphics card, the GeForce 6800 Ultra DDL.

Then in 2008, Apple's MacBook Pro shipped with Nvidia graphics chips that revolutionized the MacBook by taking over the functions of the Northbridge and Southbridge controllers alongside actual graphics rendering. That chipset business prompted Intel to file a lawsuit against Nvidia, making things a bit complicated for Apple.

Nvidia processor in a 2008 MacBook Pro

Not only that, but Apple had to admit that some 2008 MacBook Pros had faulty Nvidia processors, which led to a class-action lawsuit against Nvidia and lost profits for Apple due to MacBook Pro repairs.

Around the same time, the iPhone transformed the mobile computing market and meant phones now needed GPUs, and Apple decided to go with Samsung instead. Nvidia believed that its patents also applied to mobile GPUs, so it filed patent infringement suits against Qualcomm and Samsung, trying to get them, and possibly Apple, to pay license fees.

In 2016, Apple declined to put Nvidia processors in the 15-inch MacBook Pro and instead went with AMD, publicly citing performance-per-watt concerns.

Nvidia competitor AMD recently launched the Radeon VII, a high-performance 7-nanometer GPU that reportedly has drivers on the way for macOS Mojave

And now, in 2019, there aren't any functional drivers for modern cards in Mojave at all. In October 2018, Nvidia issued a public statement saying that Apple fully controls drivers for macOS, and that it can't release a driver unless Apple approves it.

Basically, there's no major technical limitation that makes macOS Mojave incompatible with Nvidia graphics cards. Someone at Apple simply doesn't want to support Nvidia drivers, possibly because of the companies' rocky history.

For the longest time, Apple's professional apps were optimized for OpenCL, which AMD cards run efficiently, and not CUDA, the proprietary framework that Nvidia focuses on. Apple wants Apple's apps to run better; it's as simple as that.

On an Apple support page for installing Mojave on older Mac Pros, Apple states that Mojave requires a graphics card that supports Metal. Within the list of compatible graphics cards, you'll see two legacy Nvidia cards and quite a few new options from AMD.
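If you're not sure whether the card already in your Mac makes the cut, macOS can report Metal support directly. Here's a quick sketch using the built-in `system_profiler` tool — note that the command only exists on macOS, and the exact wording of the Metal line varies between macOS versions:

```shell
# List each GPU and its Metal support line, as reported by macOS.
# system_profiler only exists on macOS, so guard for it elsewhere.
if command -v system_profiler >/dev/null 2>&1; then
  # "Chipset Model" names the GPU; the "Metal" line shows support status.
  system_profiler SPDisplaysDataType | grep -E 'Chipset Model|Metal'
else
  echo "system_profiler not found: run this on macOS"
fi
```

On a Mojave-era Mac with a supported AMD card, you'd expect to see the card's name followed by a line indicating Metal is supported; a card missing from Apple's compatibility list typically shows no Metal support at all.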

Blackmagic eGPU Pro beside a MacBook Pro

So, from the looks of it, until Nvidia and Apple both relent and decide to meet each other halfway, there won't be Nvidia support for eGPUs, or for any future Mac Pro that may or may not have PCI-E slots. And with AMD's Vega 56 and 64 already supporting Metal, and Radeon VII support coming soon to macOS, it doesn't look like Apple is in a negotiating mood.

Further complicating matters, Apple is working on its own GPU technology. It looks like it's already in the iPhone, and it's just a matter of time until it makes its way to the Mac.

If you need Nvidia, you can always downgrade to High Sierra and hope that the two companies come to their senses, for users' sake.



42 Comments

derekcurrie 64 comments · 16 Years

Recent iOS devices take advantage of GPU technology recently developed by ARM for their chip designs. That's not going to translate into GPUs for Macs, which are going to continue to use Intel CPU technology (despite uninformed rumors to the contrary). But that doesn't mean Apple can't create their own GPU tech for Macs. I personally would be surprised if Apple bothered, seeing as the company has blatantly suffered from what I call Mac Malaise for over three years. (Kick 🦵 Boot 👢 Prod ⚡). Staying out of GPU patent lawsuits is of course a further concern.

ksec 1502 comments · 18 Years

derekcurrie said:
Recent iOS devices take advantage of GPU technology recently developed by ARM for their chip designs. That's not going to translate into GPUs for Macs, which are going to continue to use Intel CPU technology (despite uninformed rumors to the contrary). But that doesn't mean Apple can't create their own GPU tech for Macs. I personally would be surprised if Apple bothered, seeing as the company has blatantly suffered from what I call Mac Malaise for over three years. (Kick 🦵 Boot 👢 Prod ⚡). Staying out of GPU patent lawsuits is of course a further concern.

I don't see any part of the A11 GPU developed by ARM. 

dysamoria 3430 comments · 12 Years

Is this relevant to a gamer who, hypothetically, wants to play Windows games on a Windows partition on their Mac? If you have an eGPU supported by Windows... DOES Windows support eGPU?

curtis hannah 1834 comments · 12 Years

Apple seems to just use what's best for it at the time, and AMD has the better performance-to-power ratio right now, and Nvidia hasn't offered Apple anything to convince them otherwise (Nvidia's prices are still slightly higher).
It's weird that Apple is avoiding the external GPU support side of it, though. But they only added eGPU support less than a year ago, so it's possible they will in the next year, especially if they want the new Mac Pro to be completely modular. I wouldn't bet on it, though.

Mike Wuerthele 6906 comments · 8 Years

dysamoria said:
Is this relevant to a gamer who, hypothetically, wants to play Windows games on a Windows partition on their Mac? If you have an eGPU supported by Windows... DOES Windows support eGPU?

It does, but poorly. I've had the most success booting to a non-accelerated display, and playing the game on a second, accelerated, one.