
AMD Radeon RX 6700 XT GPU may launch at March 3 event

AMD will be expanding its Radeon RX 6000 graphics card line on March 3, with the launch thought to introduce lower-priced options that could eventually be used with the Mac Pro or eGPU solutions.

The first Radeon RX 6000-series GPUs were introduced on October 28, with the RX 6800, RX 6800 XT, and RX 6900 XT unveiled by AMD. In an announcement made on Twitter, AMD said it plans to add more cards to the range.

The tweet proclaims "the journey continues for RDNA2" at 11 A.M. Eastern on March 3. The event will be the third episode of AMD's "Where Gaming Begins" virtual event streams, and is confirmed via the tweet to include at least one new card.

Current speculation is that AMD's next card will be one that extends the range down toward the value end of the spectrum. Thought to be the Radeon RX 6700 XT, images and details of the card leaked by VideoCardz on Sunday indicate Asus is already preparing versions for sale.

The RX 6700 XT is believed to use the Navi 22 XT GPU, with a reduced 40 compute units versus the 60 to 80 units offered by the RX 6800 and RX 6900 XT. The core count will also allegedly be reduced to 2,560, well below the 3,840 of the RX 6800.
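As a rough sanity check, the two rumored figures are consistent with each other: RDNA 2 designs expose 64 stream processors per compute unit, so 40 compute units works out to 2,560 cores. A minimal sketch of the arithmetic, assuming that 64-per-CU ratio:

# RDNA 2 exposes 64 stream processors ("cores") per compute unit.
STREAM_PROCESSORS_PER_CU = 64

def stream_processors(compute_units):
    return compute_units * STREAM_PROCESSORS_PER_CU

print(stream_processors(40))  # rumored RX 6700 XT: 2,560
print(stream_processors(60))  # RX 6800: 3,840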

The card may also switch out the 16GB of GDDR6 memory for 12GB as a cost-saving measure, and use a 192-bit memory bus instead of 256-bit. Memory bandwidth may also be lower, at up to 384GB/s rather than the 512GB/s offered by its stablemates.
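The bandwidth figures likewise follow from the bus width, if the card keeps the 16Gbps GDDR6 data rate used elsewhere in the RX 6000 line; that data rate is an assumption here, not something AMD has confirmed for this card.

# Peak memory bandwidth = bus width (bits) * per-pin data rate (Gbps) / 8 bits per byte.
GDDR6_DATA_RATE_GBPS = 16  # assumed, matching the rest of the RX 6000 line

def bandwidth_gb_per_s(bus_width_bits):
    return bus_width_bits * GDDR6_DATA_RATE_GBPS / 8

print(bandwidth_gb_per_s(192))  # 384.0 GB/s, rumored RX 6700 XT
print(bandwidth_gb_per_s(256))  # 512.0 GB/s, RX 6800 and above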

It will also likely provide some power savings, with a power consumption of up to 230W undercutting the RX 6800's 250W draw and the 300W of the two higher cards.

Rumors point to a launch date in March 2021 for the card, and potentially a price below $500. For comparison, the RX 6800 was priced at $579 at launch, while the RX 6800 XT cost $649, and the RX 6900 XT was $999.

The card could become a future upgrade for Mac Pro users and those with Thunderbolt 3 eGPU enclosures, though not at the moment. macOS drivers do not yet exist for the RX 6000-series cards, but they are anticipated to arrive soon, given how support for new Radeon generations has historically been added.

Apple's decision to drop Nvidia GPU support in favor of Radeon leaves users of upgradable Macs with few video card options. The launch of more affordable top-end Radeon cards may help press Apple to support them.



11 Comments

cloudguy 4 Years · 323 comments

"with the launch thought to introduce lower-priced options that could eventually be used with the Mac Pro or eGPU solutions."

Wait what? The Apple switch from Intel CPUs and AMD GPUs to their own integrated CPUs and GPUs is supposed to take two years. It started in November 2020, meaning there is only 18 months left. While Apple is rumoured to have one final run of Intel-based Mac Pro and iMac Pro workstations on the way while they work out the bugs with the M2 and M2X CPUs that are capable of replacing the Intel Core i9 and Xeon CPUs that currently inhabit them, you would be absolutely nuts to actually go out and spend all that cash on one. Case in point: the resale value of Intel-based Macs is already plummeting! 
https://www.zdnet.com/article/the-mac-price-crash-of-2021/ 
So even the idea of buying an Intel Mac now, using it for a couple of years and then flipping it while still in "like new" status and using the cash for an Apple Silicon Mac Pro in early 2023 makes no sense, and getting the AMD eGPU with it even less so. 

Apple has finite resources and organizational focus. Realize Macs have been playing second fiddle to iPhones and iPads for a while now because those units bring in far more money. Indeed, these days Macs probably come in fourth behind services (which also generate far more revenue than Macs) and wearables (where the Apple Watch and AirPods give Cupertino rare marketplace dominance). So you would be absolutely nuts to spend $6000 or more on a device running an outdated software platform. Proof: you won't do it. No, you want everyone else to do it so Apple's market share doesn't take a nosedive. You want them to be the ones to "trust Apple" when they spend 10 seconds reading a boilerplate teleprompter speech on how they will continue to offer world-class support to their Intel-based Mac customers ... before they pivot to talking about how the future is ARM and everyone on x86 is going to be left behind. (And given that Apple is the only company with viable ARM PC and workstation products and will be for the next 3-5 years ...)

Yeah, no. Only buy Apple's last batch of Intel-based machines if you are 100% comfortable replacing macOS with Windows 10 Workstation Pro or Ubuntu on them 3 years down the line. The former isn't that much of a problem - Boot Camp will still be supported for a while - but drivers and such will continue to be a big deal for the latter. Otherwise your platform is going to be an afterthought for a company that has the vast majority of its consumers and market share elsewhere (mobile, services, wearables) and regards your obsolete hardware platform as a burden and a chore to develop for. 

aderutter 17 Years · 625 comments

Plenty of people buy Macs and write them off through depreciation in less than 3 years, so I’d say “only buy Apple Intel-based machines if you are 100% comfortable seeing them as disposable assets with practically zero resale value in 2 years’ time”.

Top of the line graphics cards are a dirt cheap upgrade to some people but I can’t see Apple considering drivers for AMD cards as a priority.

OutdoorAppDeveloper 15 Years · 1292 comments

Right now, no one is talking about Apple's upcoming discrete GPUs. Apple has not announced them but they pretty much have to be released this year. The release of Apple discrete GPUs will be an extremely important event. It is clear that Apple can compete with Intel/AMD in CPUs but how will it compare on GPUs? Apple's embedded GPUs are at best 1/10th the speed of discrete GPUs. That's actually pretty impressive for a mobile GPU built into an iPhone or iPad but it is not going to impress anyone buying an iMac, let alone a Mac Pro. The only other choice Apple has is to write Apple Silicon drivers for an AMD discrete GPU. That seems counterproductive given Apple's stated intention to build its own CPUs and GPUs going forwards.

auxio 19 Years · 2766 comments

cloudguy said:

Wait what?

... a bunch of other "Apple doesn't care about Mac users" nonsense ...

Yeah, no. Only buy Apple's last batch of Intel-based machines if you are 100% comfortable replacing macOS with Windows 10 Workstation Pro or Ubuntu on them 3 years down the line.

Put your money where your mouth is.  I'll wager you $10k that Apple supports Intel-based Macs for at least 5 years.

For professionals (including myself), it's expected that you'll need to upgrade within 3-5 years.  Heck, people don't even keep cars (a purchase around 10x more expensive than the average computer) for much longer than that these days.

zimmie 9 Years · 651 comments

OutdoorAppDeveloper said:

Right now, no one is talking about Apple's upcoming discrete GPUs. Apple has not announced them but they pretty much have to be released this year. The release of Apple discrete GPUs will be an extremely important event. It is clear that Apple can compete with Intel/AMD in CPUs but how will it compare on GPUs? Apple's embedded GPUs are at best 1/10th the speed of discrete GPUs. That's actually pretty impressive for a mobile GPU built into an iPhone or iPad but it is not going to impress anyone buying an iMac, let alone a Mac Pro. The only other choice Apple has is to write Apple Silicon drivers for an AMD discrete GPU. That seems counterproductive given Apple's stated intention to build its own CPUs and GPUs going forwards.

Your "1/10th the speed" statement is incorrect. The M1's GPU can perform 2.6 TFLOPS with eight cores. That's 325 GFLOPS per core.

The Radeon RX 5700 XT 50th Anniversary can get 10.1 TFLOPS with 40 compute units, so 253 GFLOPS per compute unit. This is actually the best performance per compute unit across the Radeon RX 5000 line.

The Radeon RX 6000 series is technically released, but they're extremely rare right now. I'm not going to count them until stores can go longer than an hour without selling out. Even so, they get 288 GFLOPS per compute unit.

GPU compute performance scales linearly with core count. Performance per watt scales a bit worse because high-core-count interconnects have to be more complicated, but not hugely so. 32 of Apple's A14/M1 GPU cores would get about 10.4 TFLOPS, beating the best AMD consumer cards you could get last generation and beating the Nvidia RTX 2080 (also 10.1 TFLOPS). That would still have low enough power draw to fit in a laptop, though admittedly, not a laptop Apple is interested in making. An iMac could easily have four times the M1's GPU.
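For readers who want to reproduce zimmie's figures, here is a minimal sketch using only the TFLOPS and core/compute-unit counts quoted in the comment; the 32-core configuration is the commenter's hypothetical, not an announced product, and the linear-scaling premise is theirs as well.

# Per-core throughput and a linear-scaling estimate, using the figures quoted above.
def gflops_per_core(tflops, cores):
    return tflops * 1000 / cores

print(gflops_per_core(2.6, 8))    # Apple M1 GPU: 325 GFLOPS per core
print(gflops_per_core(10.1, 40))  # RX 5700 XT 50th Anniversary: ~253 GFLOPS per CU

# Assuming roughly linear scaling with core count:
print(gflops_per_core(2.6, 8) * 32 / 1000)  # hypothetical 32-core Apple GPU: ~10.4 TFLOPS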