Apple says it temporarily removed messaging app Telegram from its App Store because it was being used to distribute child pornography.
The Telegram and Telegram X apps were pulled last Thursday due to “inappropriate content.” At the time, Telegram did not elaborate, but 9to5Mac reports that the apps were removed because images depicting child sex abuse were being distributed over the service.
Following fixes and updates, normal service was restored, with both Telegram apps available for download again less than 24 hours after they were removed.
Apple’s SVP of marketing, Phil Schiller, confirmed the problem in an email sent to users, which was obtained by 9to5Mac.
“The Telegram apps were taken down off the App Store because the App Store team was alerted to illegal content, specifically child pornography, in the apps. After verifying the existence of the illegal content the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children),” he wrote.
“The App Store team worked with the developer to have them remove this illegal content from the apps and ban the users who posted this horrible content. Only after it was verified that the developer had taken these actions and put in place more controls to keep this illegal activity from happening again were these apps reinstated on the App Store.”
It’s unclear if any arrests have been made in connection with the incident.
It’s also unknown exactly how the illegal imagery was shared. Telegram, like WhatsApp, boasts end-to-end encryption, making it unlikely that users were directly sharing the images through the apps themselves. It’s instead thought that third-party plug-ins were used to distribute the content, something Telegram CEO Pavel Durov has said carries an element of risk.
The NCMEC’s John Shehan said that any users who inadvertently come across child abuse images on Telegram or any other service or site should report them immediately.
“We provide CyberTipline reports to law enforcement in the United States and more than 100 countries. If a user comes across an image depicting the exploitation of a child we ask that they do not reshare/repost the image. They should report the image to the social media platform as well as to the NCMEC CyberTipline.”
The UK’s Internet Watch Foundation, which operates a service similar to the NCMEC’s CyberTipline, noted in 2014 that instances of legitimate, legal sites being hacked to act as “redirectors” to illegal sexual abuse imagery were on the rise. This may well be another example of that trend, if otherwise innocuous and legitimate plug-ins were used to distribute the images.