Telegram ignored outreach from child safety watchdogs before CEO’s arrest, groups say

Before Telegram’s CEO was arrested in France, the app had gained a reputation for ignoring advocacy groups fighting child exploitation.

Three of those groups, the U.S.-based National Center for Missing & Exploited Children (NCMEC), the Canadian Centre for Child Protection and the U.K.-based Internet Watch Foundation, all told NBC News that their outreach to Telegram about child sexual abuse material, often shorthanded as CSAM, on the platform has largely been ignored.

Pavel Durov, a co-founder and the CEO of Telegram, a messaging and news app that’s widely used in former Soviet countries and has become increasingly popular with the U.S. far right and groups banned from other platforms, remains in the custody of French authorities, who arrested him Saturday. 

The Paris prosecutor, who hasn’t announced charges, said Monday that Durov was arrested as part of an investigation into an unnamed person. The claims against the person include “complicity” in illegal transactions and possessing and distributing child sexual abuse material, the prosecutor said in a statement.

Telegram wrote in a statement on X that it abides by European Union laws. It said that Durov has “nothing to hide” and that it is “absurd to claim that a platform or its owner are responsible for abuse of that platform.”

Telegram has long branded itself as relatively unmoderated and unwilling to work with law enforcement. Durov said in April it had 900 million regular users.

John Shehan, senior vice president of NCMEC’s Exploited Children Division & International Engagement, said he was encouraged by France’s decision to arrest Durov because Telegram has been such a haven for CSAM.

“Telegram is truly in a league of their own as far as their lack of content moderation or even interest in preventing child sexual exploitation activity on their platform,” he said.

“It is encouraging to see the French government, French police, taking some action to potentially rectify this type of activity,” Shehan said.

Telegram’s website says it never responds to any reports of any kind of illegal activity in private or group chats, “even if reported by a user.” It also says that unlike other major tech platforms, which routinely comply with court orders and warrants for user data, “we have disclosed 0 bytes of user data to third parties, including governments.”

NBC News asked Telegram to comment on the groups’ claims that their efforts to flag CSAM have been ignored. In a statement, Telegram spokesperson Remi Vaughan didn’t address their comments but said the platform “actively moderates harmful content on its platform including child abuse materials.”

“Moderators use a combination of proactive monitoring of public parts of the platform, AI tools, and user reports to remove content that breaches Telegram’s terms of service,” Vaughan said. Telegram maintains a channel that gives daily updates on how many groups and channels have been reported for child abuse, and it claims thousands of public groups are banned daily.

In a report last year on platforms’ enforcement of CSAM, the Stanford Internet Observatory noted that while Telegram says it’s against its rules to share CSAM in public channels, it is the only major tech platform whose privacy policy doesn’t explicitly prohibit CSAM or grooming of children in its private chats.

By law, U.S.-based platforms are required to work with NCMEC, which runs the world’s largest international coordination center among law enforcement, social media platforms and tipsters to flag confirmed abuse material so it can be taken down rapidly. Telegram is based in Dubai in the United Arab Emirates, which Durov, who was born in the former Soviet Union, has claimed is a neutral country that doesn’t make his platform beholden to any government.

But major tech companies outside the U.S., including TikTok, which is owned by the Chinese company ByteDance; U.K.-based Fenix, which owns OnlyFans; and the Canadian conglomerate Aylo, which owns Pornhub, all remove CSAM that NCMEC flags, Shehan said.

Telegram offers what it describes as an option to encrypt private messages end to end, meaning only users, not the platform, can read them. But while other end-to-end encrypted messaging services, like WhatsApp, allow users to report and forward illegal content, Telegram doesn’t have such an option.

NCMEC has received 570,000 reports of CSAM on Telegram since the app launched in 2013, Shehan said.

“They’ve been really, really clear on the team that they have no interest. We sporadically reach out, but it’s not frequent anymore,” he said. “They don’t respond at all.”

A spokesperson for the U.K.’s Internet Watch Foundation, an independent nonprofit organization that works to curb the spread of CSAM, said that it had made repeated attempts to work with Telegram over the past year but that Telegram has refused to use “any of its services to block, prevent and disrupt the sharing of child sexual abuse imagery.”

“There’s no excuse,” said the group’s deputy CEO, Heidi Kempster. “All platforms have it within their gift to do something, now, to prevent the spread of child sexual abuse imagery. We have the tools, we have the data, and any failure to stop this known content from proliferating is an active and deliberate choice.”

Stephen Sauer, who directs Canada’s national CSAM tip line at the Canadian Centre for Child Protection, said in an emailed statement that not only has Telegram ignored its attempts to flag CSAM, but abuse material on the platform has also become more prevalent.

“Based on our observations, Telegram’s platform is increasingly being used to make CSAM available to offenders. In many cases, we see Telegram links or accounts advertised on web forums and even on U.S.-based mainstream social media platforms that act as a funnel to drive traffic to illegal Telegram-based content,” he said.

“Telegram’s moderation practices are completely opaque — we really have no sense of how they operate. Likewise, we receive no confirmation or feedback from the company on the moderation outcome when we do report content into them. More importantly, it does not appear the platform itself is taking adequate proactive steps to curb the spread of CSAM on their service despite its known use for facilitating the exchange of this type of material,” Sauer said.