The "inappropriate content" that saw Telegram briefly disappear from the App Store last week was child pornography, Apple's marketing chief explained in response to a customer question.
"The Telegram apps were taken down off the App Store because the App Store team was alerted to illegal content, specifically child pornography, in the apps," said Phil Schiller in an email seen by 9to5Mac. "After verifying the existence of the illegal content the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children)."
Apple worked with the Telegram team to get the pornography removed and the people who posted it banned, Schiller continued. Telegram and Telegram X were only allowed back once this was completed and controls were put in place to prevent a repeat incident.
"We will never allow illegal content to be distributed by apps in the App Store and we will take swift action whenever we learn of such activity. Most of all, we have zero tolerance for any activity that puts children at risk — child pornography is at the top of the list of what must never occur. It is evil, illegal, and immoral," Schiller added.
It's unclear exactly what safeguards were put in place. Telegram is primarily a personal messaging app, though it does allow group chats with up to 30,000 people. One possibility is that Apple and Telegram took steps to block any groups found to be sharing child pornography.