This summer, Apple selected Maiden, N.C., as the site of its $1 billion server farm, and speculation suggested it was built to support the company's booming media business, mostly through iTunes. But in an interview with Rich Miller, editor of Data Center Knowledge, Cult of Mac discovered that Apple could be looking to create an Internet-based computing operation rivaling Google's services in scale.
The data center reportedly has 500,000 square feet of space for computers all inside one building, and Miller said that would make it one of the largest data centers in the world. Typically, he said, such large-scale operations are used by companies like Google for cloud computing. Apple's current data center in Newark, Calif., is just over 100,000 square feet.
Miller said that Apple likely chose the North Carolina location to save money, rather than for connectivity. Because the Mac-maker is more interested in cost and scale, he said, the choice also suggests a cloud computing data center. Apple received a tax break from local lawmakers on the condition that the Cupertino, Calif.-based company reach a $1 billion investment target within nine years. If the server farm remains active for three decades, the corporate tax breaks would amount to $300 million.
"In the past several years we've seen a handful of new facilities that are redefining the scope of modern data centers," Miller told Cult of Mac. "These include Microsoft's new facility in Chicago, the SuperNAP in Las Vegas and the Phoenix ONE co-location center in Phoenix. All of these facilities house at least 400,000 square feet of space. These data centers are designed to support an enormous volume of data, and reflect the acceleration of the transition to a digital economy. All those digital assets - email, images, video and now virtual machines - drive demand for more and larger data centers."
Apple already dabbles in cloud computing with its MobileMe service, which delivers push e-mail, contacts and calendars from the Internet-based "cloud" to computers and handheld devices. MobileMe also offers a suite of Web 2.0 applications that provide a desktop-like experience through a Web browser.
While Miller's cloud computing scenario is speculation, as Apple has not announced its intent for the $1 billion server farm, it's also possible Apple is simply looking to bolster its current offerings. When MobileMe first launched in July 2008, it was riddled with problems, and as a result Apple gave subscribers an extra 30 days of free service. MobileMe now comes with a 60-day free trial, while the service itself, with 20GB of online storage, costs $99 per year.
51 Comments
and i thought it was going to be for dirt computing
Cloud Computing certainly seems to be the white elephant at the moment.
I can see its benefits, but with an ADSL upload speed of about 256Kbps, it's going to be a while before it's any use to me!
"Expert speculates Apple's new data center to be for cloud computing" -- Well, duh. Obviously it's not for their accounting department.
"Apple likely chose the North Carolina location to save money, rather than for connectivity."
Apple isn't daft. Rural North Carolina does give them lower business costs than a metropolitan area, but rural North Carolina isn't rural Montana. On the east coast, there are no remote areas, even if they are rural. Maiden NC is close to Interstate 40, which connects with I-81 in the west and I-95 in the east. Research Triangle Park is three hours to the east on I-40, and Charlotte is less than an hour to the south. Charlotte is the largest city in the Carolinas. Research Triangle Park has an impressive internet infrastructure, and Maiden itself sits on a line between Atlanta and Washington DC. This is a fabulous choice, because it is low cost and, contrary to the article, does not compromise 'connectivity' one byte.
And what else can it be for?
Too late, though. Big guys in the domain (MS, Google, etc.) compete already on such a tiny thing as closeness to power generation plants...
Cloud Computing certainly seems to be the white elephant at the moment.
I can see its benefits, but with an ADSL upload speed of about 256Kbps, it's going to be a while before it's any use to me!
It makes sense when you want to start lots of replications of the same 'thing' - where the thing might be an x86 VM running a stack of software - an example might be, say, Linux, MySQL, WordPress. You get everything developed, tested and running locally, then deploy an image onto your virtual server farm.
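The build-once, replicate-many workflow described above can be sketched in Python. To be clear, `deploy_image` below is a hypothetical stand-in for whatever provisioning API a virtual server farm would expose, not a real service call:

```python
import uuid

# Hypothetical stand-in for a cloud provider's provisioning API:
# takes a machine-image ID and returns a record for one new instance.
def deploy_image(image_id: str) -> dict:
    return {
        "instance_id": uuid.uuid4().hex[:8],
        "image_id": image_id,
        "state": "running",
    }

# The image is built and tested locally (e.g. Linux + MySQL + WordPress);
# scaling out is then just deploying the same image as many times as needed.
def scale_out(image_id: str, count: int) -> list:
    return [deploy_image(image_id) for _ in range(count)]

if __name__ == "__main__":
    fleet = scale_out("wordpress-stack-v1", 10)
    print(len(fleet), "instances of", fleet[0]["image_id"], "running")
```

The point of the pattern is that the unit of deployment is the whole tested image, so every replica is identical by construction.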
Another would be running a Windows app remotely. Currently, most firms maintain their own Citrix servers to do this - you log in and the Citrix server starts up an instance of Windows to run your app, transmitting the screen drawing commands in very low bandwidth form to your local/home PC or Mac. With the current Citrix client the difference between running a Windows app in Parallels and via Citrix on a Mac isn't that huge.
Anyway, some people are suggesting that a solution like this is probably the best way to deal with legacy software - Mac users do it with Parallels, Windows 7 users will have support for XP in a VM also (hardware willing).
So, a thought: how would you ensure users have the ability to run occasional x86 or Windows software on a different platform or CPU architecture? How many Windows licences would you need to provide 100 Mac users with the ability to remotely execute the occasional Windows app? It's an idea.
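A rough answer to the licence question turns on concurrency rather than head-count. The figures in this sketch (30 minutes of Windows use per user spread over an 8-hour day, with a 2x headroom factor for peaks) are illustrative assumptions, not numbers from the article:

```python
import math

def licences_needed(users: int, minutes_per_user: float,
                    window_hours: float, headroom: float = 2.0) -> int:
    """Estimate concurrent Windows licences for occasional remote use.

    Assumes usage is spread evenly over the working window, then
    multiplies by a headroom factor to absorb peak overlap.
    """
    avg_concurrent = users * (minutes_per_user / 60.0) / window_hours
    return math.ceil(avg_concurrent * headroom)

# 100 Mac users, ~30 min of Windows apps each, 8-hour day, 2x peak headroom:
print(licences_needed(100, 30, 8))
```

Under these assumptions the answer comes out around a dozen concurrent licences - far fewer than 100 - which is the economic argument for hosting the Windows instances centrally instead of licensing every desktop.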
Personally, I think it's more likely to relate to a serious move into video delivery - i.e. if iTunes became to video what it is to music, they'd need a lot more depth of catalogue, and the ability to serve it - and I think the holy grail in that market is going to be deep catalogue and streaming, rather than mainstream catalogue and pay-to-own.