Apple has pulled the social media app Parler from the App Store after the service failed to provide an adequate content moderation plan — and Amazon is pulling AWS hosting from the social network as well.
The takedown has removed the app from view in the App Store, with it no longer appearing in searches, following Apple's demand for change. New downloads of the app are no longer possible until the app is reinstated, though existing installations will still be able to access the service as normal.
Google pulled the app from the Google Play Store within hours of Apple's announcement, making the app unavailable to download to Android devices via that digital storefront.
On Friday, Apple contacted the developers behind Parler about complaints it received regarding the app's content and its use, including how it was allegedly employed to "plan, coordinate, and facilitate the illegal activities in Washington D.C.," an email from the iPhone maker said. In addition to its alleged use in planning the storming of the U.S. Capitol, which led to the "loss of life, numerous injuries, and the destruction of property," Apple believed the app was continuing to be used to plan "yet further illegal and dangerous activities."
Apple gave Parler 24 hours to make changes to the app to more effectively moderate content posted by users, or face removal from the App Store until the changes were implemented.
Shortly before 8 P.M. Eastern Time, almost an hour after the deadline, the app was removed from the App Store.
In a statement, Apple said, "We have always supported diverse points of view being represented on the App Store, but there is no place on our platform for threats of violence and illegal activity. Parler has not taken adequate measures to address the proliferation of these threats to people's safety. We have suspended Parler from the App Store until they resolve these issues."
Parler bills itself as a "non-biased, free speech social media focused on protecting user's rights," and has become the online home for conservatives and radicals who have been kicked off mainstream social networks like Facebook and Twitter. In recent months, the app had gained a reputation as a safe haven for conspiracy theorists and far-right extremists, including people who called for protests and violence after the latest U.S. presidential election.
While Parler believes it is a "neutral town square that just adheres to the law," as Parler CEO John Matze put it in comments quoted by Apple in the email, Apple insists Parler is "in fact responsible for all the user-generated content present on [the] service," and must ensure the app meets App Store requirements for user safety and protection. "We won't distribute apps that present dangerous or harmful content," Apple wrote to Parler.
Parler's CEO responded to the initial email by declaring that the standards applied to the app are not applied to other entities, including Apple itself. An earlier post from the CEO said, "We will not cave to pressure from anti-competitive actors! We will and have enforced our rules against violence and illegal activity. But we won't cave to politically motivated companies and those authoritarians who hate free speech!"
In a second email explaining the removal of Parler, Apple's App Review Board said it had received a response from Parler's developers but had determined the measures described were "inadequate to address the proliferation of dangerous and objectionable content on your app."
Apple gave two reasons for the decision, the primary one being insufficient moderation to "prevent the spread of dangerous and illegal content," including "direct threats of violence and calls to incite lawless action."
Apple also objected to Parler describing its moderation plan as being "for the time being," which indicated the measures would be limited in duration rather than ongoing. Citing the need for "robust content moderation plans," Apple added, "A temporary 'task force' is not a sufficient response given the widespread proliferation of harmful content."
Apple's threat came amid a wider effort by tech companies and social media services to cut off access to accounts operated by activists, organizations, and political leaders linked to the Capitol Hill attack. This includes President Donald Trump, who was suspended from both Twitter and Facebook for his inflammatory messaging to followers.
The full letter from Apple to Parler follows:
Thank you for your response regarding dangerous and harmful content on Parler. We have determined that the measures you describe are inadequate to address the proliferation of dangerous and objectionable content on your app. Parler has not upheld its commitment to moderate and remove harmful or dangerous content encouraging violence and illegal activity, and is not in compliance with the App Store Review Guidelines.
In your response, you referenced that Parler has been taking this content "very seriously for weeks." However, the processes Parler has put in place to moderate or prevent the spread of dangerous and illegal content have proved insufficient. Specifically, we have continued to find direct threats of violence and calls to incite lawless action in violation of Guideline 1.1 - Safety - Objectionable Content.
Your response also references a moderation plan "for the time being," which does not meet the ongoing requirements in Guideline 1.2 - Safety - User Generated Content. While there is no perfect system to prevent all dangerous or hateful user content, apps are required to have robust content moderation plans in place to proactively and effectively address these issues. A temporary "task force" is not a sufficient response given the widespread proliferation of harmful content.
For these reasons, your app will be removed from the App Store until we receive an update that is compliant with the App Store Review Guidelines and you have demonstrated your ability to effectively moderate and filter the dangerous and harmful content on your service.
Update January 9, 11:00 PM: Amazon has informed Parler that it will suspend AWS service as well. It isn't clear if there is an alternate host for the service.
"Recently, we've seen a steady increase in this violent content on your website, all of which violates our terms," the email announcing the deadline from Amazon reads. "It's clear that Parler does not have an effective process to comply with the AWS terms of service."
Users on Parler have already threatened Tim Cook, Jeff Bezos, Apple Park, and "some AWS Data Centers" with violence.
The letter from Amazon to Parler, obtained by BuzzFeed News, reads as follows:
Thank you for speaking with us earlier today. As we discussed on the phone yesterday and this morning, we remain troubled by the repeated violations of our terms of service. Over the past several weeks, we've reported 98 examples to Parler of posts that clearly encourage and incite violence. Here are a few examples below from the ones we've sent previously.
Recently, we've seen a steady increase in this violent content on your website, all of which violates our terms. It's clear that Parler does not have an effective process to comply with the AWS terms of service. It also seems that Parler is still trying to determine its position on content moderation. You remove some violent content when contacted by us or others, but not always with urgency. Your CEO recently stated publicly that he doesn't "feel responsible for any of this, and neither should the platform."
This morning, you shared that you have a plan to more proactively moderate violent content, but plan to do so manually with volunteers. It's our view that this nascent plan to use volunteers to promptly identify and remove dangerous content will not work in light of the rapidly growing number of violent posts. This is further demonstrated by the fact that you still have not taken down much of the content that we've sent you. Given the unfortunate events that transpired this past week in Washington, D.C., there is serious risk that this type of content will further incite violence.
AWS provides technology and services to customers across the political spectrum, and we continue to respect Parler's right to determine for itself what content it will allow on its site. However, we cannot provide services to a customer that is unable to effectively identify and remove content that encourages or incites violence against others. Because Parler cannot comply with our terms of service and poses a very real risk to public safety, we plan to suspend Parler's account effective Sunday, January 10th, at 11:59PM PST. We will ensure that all of your data is preserved for you to migrate to your own servers, and will work with you as best as we can to help your migration.
- AWS Trust & Safety Team