
Amazon introduces native Mac instances for AWS, powered by Intel Mac mini


At AWS re:Invent, Amazon introduced new Mac instances for its Amazon Elastic Compute Cloud (EC2), enabling developers to natively run macOS in Amazon Web Services for the first time.

Announced late Monday, the new capability harnesses Intel-powered Mac mini hardware to run on-demand macOS workloads in the AWS cloud.

Developers building apps for iPhone, iPad, Mac, Apple Watch, Apple TV, and Safari can use the service to provision and access macOS environments, dynamically scale capacity with AWS, and take advantage of pay-as-you-go pricing, Amazon says. Basically, app makers can create and test in the AWS cloud. In addition, customers can consolidate development of cross-platform Apple, Windows, and Android apps using Amazon's cloud.
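
For a sense of what that provisioning workflow looks like in practice, here is a minimal boto3 sketch (an illustration, not Amazon's official example). It assumes the mac1.metal instance type; the region, Availability Zone, AMI ID, and key pair name are placeholders. EC2 Mac instances run on Dedicated Hosts, so the sketch reserves a host before launching the instance.

```python
# Minimal sketch of provisioning an EC2 Mac instance with boto3.
# Region, Availability Zone, AMI ID, and key pair name are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# EC2 Mac instances run on Dedicated Hosts, so allocate a host first.
host = ec2.allocate_hosts(
    AvailabilityZone="us-east-1a",   # placeholder Availability Zone
    InstanceType="mac1.metal",
    Quantity=1,                      # ask for more hosts to scale out
)
host_id = host["HostIds"][0]

# Launch a macOS instance (e.g. a Mojave or Catalina AMI) onto that host.
result = ec2.run_instances(
    ImageId="ami-EXAMPLE",           # placeholder macOS AMI ID
    InstanceType="mac1.metal",
    MinCount=1,
    MaxCount=1,
    KeyName="my-key-pair",           # placeholder key pair name
    Placement={"Tenancy": "host", "HostId": host_id},
)
print("Launched", result["Instances"][0]["InstanceId"])
```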

"Apple's thriving community of more than 28 million developers continues to create groundbreaking app experiences that delight customers around the world," said Bob Borchers, Apple's VP of Worldwide Product Marketing. "With the launch of EC2 Mac instances, we're thrilled to make development for Apple's platforms accessible in new ways, and combine the performance and reliability of our world-class hardware with the scalability of AWS."

The instances are built on Mac mini hardware with a 3.2GHz Intel Core i7 and 32GB of RAM. Each Mac mini connects to Amazon's Nitro System over its built-in Thunderbolt 3 ports, delivering up to 10 Gbps of VPC network bandwidth and 8 Gbps of EBS storage bandwidth.

Amazon is entering an arena dominated by smaller companies like MacStadium and Mac Mini Vault. Developers who opt for Amazon's solution gain access to more than 200 AWS services, including Amazon Virtual Private Cloud (VPC), Amazon Elastic Block Store (EBS), Elastic Load Balancing (ELB), and Amazon Machine Images (AMIs), not to mention the sheer scalability offered by a large cloud provider.

"The speed that things happen at [other Mac mini cloud providers] and the granularity that you can use those services at is not as fine as you get with a large cloud provider like AWS," VP of EC2 David Brown told TechCrunch. "So if you want to launch a machine, it takes a few days to provision and somebody puts a machine in a rack for you and gives you an IP address to get to it and you manage the OS. And normally, you're paying for at least a month — or a longer period of time to get a discount. What we've done is you can literally launch these machines in minutes and have a working machine available to you. If you decide you want 100 of them, 500 of them, you just ask us for that and we'll make them available."

Amazon is working to integrate M1 Mac mini units into its data centers, with plans to go live sometime in the first half of 2021, according to TechCrunch. Big Sur support is also in the works, though EC2 Mac instances will be limited to macOS Mojave and Catalina at launch.

Mac instances are available On-Demand or with Savings Plans at a rate of $1.083 per hour. Supported regions include U.S. East (N. Virginia), U.S. East (Ohio), U.S. West (Oregon), Europe (Ireland), and Asia Pacific (Singapore), with more to come.
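
As a rough back-of-the-envelope from the published On-Demand rate (ignoring Savings Plans discounts and any minimum-allocation terms):

```python
# Back-of-the-envelope cost at the published On-Demand rate.
HOURLY_RATE = 1.083  # USD per hour, On-Demand

print(f"One hour: ${HOURLY_RATE:.2f}")
print(f"One day:  ${HOURLY_RATE * 24:.2f}")       # about $26
print(f"30 days:  ${HOURLY_RATE * 24 * 30:.2f}")  # about $780
```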



20 Comments

pembroke 16 Years · 228 comments

I’m not a developer and don’t really understand the implications for the end user. For example, if I have a complex video that requires heavy computing power to render, will it help me directly? Can I render across 10 machines rather than the one sitting in front of me to reduce the time to finish significantly? 

Is this something the software developers build into the software; like a switch users could trigger through clicking an option to allow access to a pipeline to cloud services which allow the spread of computations simultaneously across multiple cloud-based computers?

Rayz2016 8 Years · 6957 comments

pembroke said:
I’m not a developer and don’t really understand the implications for the end user. For example, if I have a complex video that requires heavy computing power to render, will it help me directly? Can I render across 10 machines rather than the one sitting in front of me to reduce the time to finish significantly? 

Someone will hopefully correct me if I'm wrong, but I don't think so, no.

That would need some changes to the software you're running to handle the distributed processing. I imagine in the first instance, this is going to be used by folk who want to deploy microservices in scaling environments.

I think @StrangeDays will know a bit more about this.

blastdoor 15 Years · 3594 comments

pembroke said:
I’m not a developer and don’t really understand the implications for the end user. For example, if I have a complex video that requires heavy computing power to render, will it help me directly? Can I render across 10 machines rather than the one sitting in front of me to reduce the time to finish significantly? 
Is this something the software developers build into the software; like a switch users could trigger through clicking an option to allow access to a pipeline to cloud services which allow the spread of computations simultaneously across multiple cloud-based computers?

I might be missing something, but I don’t think there are direct implications for end users. This sounds more relevant to developers who want/need to create apps for apple platforms but don’t want to buy and maintain their own Macs. Remember that to develop apps for Apple platforms, you need a Mac. But, crazy as it may sound, there do exist people who don’t want to own a Mac. 


I’ve been secretly hoping that Apple will provide an “iCloud Pro” service that provides a much better UI/UX experience for Apple platform users than is possible with AWS. That is, a seamless/transparent way for Apple pro users to access additional computational/storage resources in an Apple cloud. Apple silicon might make the economics of that work in a way that it never could with Intel, but we shall see. 

This AWS thing sounds like the bizarro world version of what I’ve been hoping for — a non-seamless way for non-Apple users to access Macs in the cloud. 

dewme 10 Years · 5775 comments

Rayz2016 said:
pembroke said:
I’m not a developer and don’t really understand the implications for the end user. For example, if I have a complex video that requires heavy computing power to render, will it help me directly? Can I render across 10 machines rather than the one sitting in front of me to reduce the time to finish significantly? 
Someone will hopefully correct me if I'm wrong, but I don't think so, no.

That would need some changes to the software you're running to handle the distributed processing. I imagine in the first instance, this is going to be used by folk who want to deploy microservices in scaling environments.

I think @StrangeDays will know a bit more about this.

Yes and no. The yes part: This is a platform as a service (PaaS) flavor of cloud computing where you purchase access to additional Mac machines (and supporting AWS infrastructure) for a period of time to do with whatever your needs dictate. Yes, if you have a big job or process pipeline to execute that would benefit from distributing it across multiple machines, you’ll need to have a way to divide the work up, parcel it out to multiple machines, and aggregate the results. There are several software development automation tools on the market that are explicitly designed for this purpose, e.g., Jenkins and Xcode Server, and some development shops build their own automation.
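
As a toy illustration of that divide-and-aggregate pattern (a hedged sketch, not tied to any particular tool or setup): assume a list of already-provisioned instance addresses and a hypothetical `render-one-chunk` command installed on each machine, and a small script simply parcels the chunks out over SSH and tallies the results.

```python
# Toy sketch: divide work across several cloud machines over SSH and
# aggregate the results. The worker addresses and the per-chunk command
# ("render-one-chunk") are hypothetical placeholders.
import subprocess
from concurrent.futures import ThreadPoolExecutor

WORKERS = ["10.0.0.11", "10.0.0.12", "10.0.0.13"]  # placeholder instance IPs
CHUNKS = [f"chunk-{i:03d}" for i in range(30)]     # pre-split pieces of work

def run_on_worker(host, chunks):
    """Run each assigned chunk on one remote machine, in order."""
    for chunk in chunks:
        subprocess.run(["ssh", host, f"render-one-chunk {chunk}"], check=True)
    return len(chunks)

# Give each worker an equal slice of the chunks, then run the slices in parallel.
slices = [CHUNKS[i::len(WORKERS)] for i in range(len(WORKERS))]
with ThreadPoolExecutor(max_workers=len(WORKERS)) as pool:
    counts = pool.map(run_on_worker, WORKERS, slices)

print(f"Finished {sum(counts)} chunks across {len(WORKERS)} machines")
```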

That’s only one type of distributed processing use case, but there are many others especially in modern software development processes, like having a pool of dedicated compile machines, build machines, automated test machines, configuration management machines, deployment machines, etc., in a pipeline such that every time a block of code gets committed to a development branch from an individual developer it is compiled/built, integrated, unit tested, and regression tested against the entire software system that it is a part of. Each of the machines participating in the individual tasks along the pipeline feeds its output to the next machine in the process. These pipelines or processes run continuously and are event driven, but can also be scheduled. Thus the term continuous integration (CI) and continuous delivery (CD).
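
For a concrete, if simplified, picture of one stage in such a pipeline, here is a hedged Python sketch of a commit-triggered "build and test" step driving xcodebuild. The scheme name and simulator destination are placeholders, and a real pipeline tool (Jenkins, Xcode Server, etc.) would orchestrate steps like these across machines.

```python
# Minimal sketch of a CI "build and test" stage for an Apple-platform app.
# The scheme name and simulator destination are placeholders.
import subprocess
import sys

SCHEME = "MyApp"                                        # placeholder scheme
DESTINATION = "platform=iOS Simulator,name=iPhone 12"   # placeholder simulator

def run_step(name, cmd):
    """Run one pipeline step and stop the pipeline if it fails."""
    print(f"--- {name} ---")
    if subprocess.run(cmd).returncode != 0:
        sys.exit(f"{name} failed; stopping the pipeline")

# Compile the app, then run its unit tests; a later stage would pick up the
# build products for regression testing or deployment.
run_step("build", ["xcodebuild", "-scheme", SCHEME,
                   "-destination", DESTINATION, "build"])
run_step("test",  ["xcodebuild", "-scheme", SCHEME,
                   "-destination", DESTINATION, "test"])
```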

There is nothing preventing an individual user from purchasing access to a cloud based machine (Mac, Windows, Linux, etc.) to do whatever they want to do with it. The inhibitor tends to be the cost and terms of service, which are typically geared towards business users with business budgets. The huge benefit of PaaS cloud computing tends to be the ability to acquire many machines in very short order, and the ability to add/delete many machines nearly instantaneously as your needs change, i.e., elasticity. Try asking your IT department to spin up 125 new Macs overnight if they have to order physical machines from Apple or a distributor. They will laugh, you will cry.

The no part: If you need to deploy micro services you may not need complete, dedicated machines to deploy your micro services. You may just need a container to run your service, which could be a shared computing resource that is hosting several “tenants” running completely isolated in their own container. If you are building out the service hosting platform as part of your solution, then sure, you could have cloud based machines for your homegrown hosting platform, but this reverts to the previous use case that I mentioned.

I don’t get the “bizarro” comment from Blastdoor. This type of PaaS cloud computing has been in very widespread use for a very long time, with Amazon’s AWS and Microsoft’s Azure being two of the major players. You may want to see if AWS offers a test drive to get a feel for how it works. Microsoft used to and may still allow you to test drive their Azure PaaS solution. There’s nothing at all bizarre about how it works. You’re sitting in front of a Mac with full access to everything you can get to via the keyboard, mouse, and monitor. Instead of sitting under your desk it is sitting in the cloud.

Nothing bizarre looking here: https://aws.amazon.com/blogs/aws/new-use-mac-instances-to-build-test-macos-ios-ipados-tvos-and-watchos-apps/

My only concern with the Mac version of EC2 is that it relies on VNC for remoting the desktop. VNC is not as secure or as performant as RDP that is used for Windows remoting. Any way you cut it, Amazon providing support for Mac in AWS is a very big deal and brings Apple another step closer to competing at the enterprise level against Windows and Linux. 
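
On the VNC point, one common workaround (a hedged sketch, not AWS guidance) is to keep the Screen Sharing port closed to the internet and tunnel it over SSH, then point a VNC client at the local end of the tunnel. The key path and instance address below are placeholders.

```python
# Sketch: forward the macOS Screen Sharing (VNC) port through an SSH tunnel
# so the remote-desktop traffic is encrypted end to end. The key path and
# instance address are placeholders.
import os
import subprocess

KEY = os.path.expanduser("~/.ssh/my-key.pem")   # placeholder private key
REMOTE = "ec2-user@203.0.113.10"                # placeholder instance address

# -L 5900:localhost:5900 exposes the instance's VNC port at localhost:5900
# on this machine; -N keeps the tunnel open without starting a remote shell.
subprocess.run(["ssh", "-i", KEY, "-N", "-L", "5900:localhost:5900", REMOTE])

# A VNC client pointed at vnc://localhost:5900 now rides inside the tunnel.
```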

bala1234 6 Years · 167 comments

pembroke said:
I’m not a developer and don’t really understand the implications for the end user. For example, if I have a complex video that requires heavy computing power to render, will it help me directly? Can I render across 10 machines rather than the one sitting in front of me to reduce the time to finish significantly? 
Is this something the software developers build into the software; like a switch users could trigger through clicking an option to allow access to a pipeline to cloud services which allow the spread of computations simultaneously across multiple cloud-based computers?

As others above have explained, this is geared more towards developers than end users. On a first read, it seems primarily targeted as an app development environment rather than a server solution, although I would guess there's nothing preventing developers from building a (distributed) multi-machine video processing solution like you're asking for using this.
Linux is much more popular and prevalent as a server solution, so there are many more tools and software available to build a solution "which allow the spread of computations simultaneously across multiple cloud-based computers"