
Telegram was pulled because of child pornography, says Apple's Phil Schiller

The "inappropriate content" that saw Telegram briefly disappear from the App Store last week was child pornography, Apple's marketing chief explained in response to a customer question.

"The Telegram apps were taken down off the App Store because the App Store team was alerted to illegal content, specifically child pornography, in the apps," said Phil Schiller in an email seen by 9to5Mac. "After verifying the existence of the illegal content the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children)."

Apple worked with the Telegram team to get the pornography removed and the people who posted it banned, Schiller continued. Telegram and Telegram X were only allowed back once this was completed and controls were put in place to prevent a repeat incident.

"We will never allow illegal content to be distributed by apps in the App Store and we will take swift action whenever we learn of such activity. Most of all, we have zero tolerance for any activity that puts children at risk — child pornography is at the top of the list of what must never occur. It is evil, illegal, and immoral," Schiller added.

It's unclear what safeguards might have been put in place. Telegram is primarily a personal messaging app, though it does allow group chats with up to 30,000 people. One possibility is that Apple and Telegram took steps to block any groups that might be sharing child pornography.



31 Comments

maestro64 19 Years · 5029 comments

Okay, I do not use Telegram, and I'm not 100% sure what it does, but how did Apple determine that users were posting stuff they should not have been? The app says it is encrypted communications. Does Apple have a back door? Also, does Apple monitor its own messaging app for illegal content that users may be transmitting to one another? They obviously did not catch Anthony Weiner.

zer0her0 9 Years · 24 comments

It's a misunderstood piece of the app: Telegram has end-to-end encrypted messaging, but that's not the default for one-to-one messages or group chats. So I'm guessing there was a group chat being used for this sort of thing. Telegram has moderated such illegal activity in the past (there was a brouhaha a couple of years back about ISIS possibly using group chats for propaganda and recruitment, which were reviewed by the Telegram team and removed), so I'm not sure why Apple stepped in this time, specifically by pulling the app instead of reaching out to Telegram first. That being said, I applaud the effort to stop this as soon as possible.

racerhomie3 7 Years · 1264 comments

maestro64 said:
Okay, I do not use Telegram, and I'm not 100% sure what it does, but how did Apple determine that users were posting stuff they should not have been? The app says it is encrypted communications. Does Apple have a back door? Also, does Apple monitor its own messaging app for illegal content that users may be transmitting to one another? They obviously did not catch Anthony Weiner.

lol. They used a third-party plugin. Those are not encrypted.

racerhomie3 7 Years · 1264 comments

For those folks wondering, a third-party plugin was most likely used to spread the child porn. I don't think those are encrypted.

boltsfan17 12 Years · 2294 comments

maestro64 said:
Okay, I do not use Telegram, and I'm not 100% sure what it does, but how did Apple determine that users were posting stuff they should not have been? The app says it is encrypted communications. Does Apple have a back door? Also, does Apple monitor its own messaging app for illegal content that users may be transmitting to one another? They obviously did not catch Anthony Weiner.

It was a plug-in for Telegram that provided access to the illegal content.