In August 2024, Telegram’s CEO Pavel Durov was arrested in Paris on charges relating to the platform’s alleged inaction against various forms of criminal enterprise.
Drug and human trafficking, fraud, and political extremism have proliferated on Telegram, prompting governments to impose restrictions on the app. Telegram employs a strikingly small moderation team for a platform of its size, with only around 50 full-time moderators. Apps with comparable user counts, such as X (formerly Twitter), employ around 1,500.
Telegram’s lack of moderators can be explained by cost-cutting or by Durov’s commitment to free speech. Moderation is now at the forefront of Durov’s mind, as he wants to ‘turn moderation from an area of criticism into one of praise’. If Telegram wants to keep Durov out of jail, it must hire more moderators to ease rising pressure from the EU and elsewhere.
Unmonitored social media has its perks and downsides
Telegram’s commitment to free speech is rooted in genuine conviction: Pavel and his brother Nikolai founded the app in response to Kremlin efforts to restrict freedom of expression in Russia. With almost a billion users worldwide, Telegram has thrived in authoritarian countries as a platform for independent journalism. The documentation of war crimes in Ukraine, the 2020 protests in Belarus, and the Iranian demonstrations of 2018 are all examples of the benefits of Telegram’s uncensored stance.
However, unmonitored social media sites often become safe havens for crime and political extremism. Drug trafficking occurs on Telegram at scale, and other forms of crime are commonplace. Extremists used Telegram as a safe space to organise the January 6 Capitol insurrection, and rioters used it to coordinate unrest after the 2024 Southport stabbing.
Attempts to moderate Telegram have had limited success. After the October 7, 2023, attacks in Israel, Hamas leaders used Telegram channels to spread images of the aftermath. According to WIRED, restrictions on these channels “reportedly didn’t do much restricting”, with the app continuing to be used by extremists on both sides.
Governments are losing patience with Telegram
Telegram is rapidly approaching the EU’s 45 million user threshold under the Digital Services Act (DSA). Once it crosses that line, the company will have to disclose how its algorithm works, be transparent about how content moderation decisions are made, and show how advertisers target users.
The more concerning aspect of the DSA for Telegram’s leadership is a provision whereby “as soon as anybody on the internet flags any content as potentially illegal, liability kicks in and would require the hosting company to ‘expeditiously’ remove or disable access to the content in question”. Given the extent of criminal activity on Telegram, Durov will have to ramp up content moderation in some way to avoid being held liable for those crimes.
Durov’s arrest, its consequences, and the future for Telegram
The French authorities’ arrest of Durov is an unprecedented move and an indication of the future direction of EU social media moderation policy. It comes as Telegram faces legal issues in South Korea and calls from governments and regulators, including in the EU and UK, for a crackdown on violent speech online.
The solution for Telegram is quite clear: hire more moderators. For an app the size of Telegram, 50 content moderators is simply too few. Hiring more moderators and enforcing its policies on crime would ease pressure from the EU, at least temporarily, while preserving privacy for the users who genuinely need it.
Many argue that content moderation is a strong candidate for automation. It is one of the most mentally strenuous jobs on the planet, and moderators have filed multiple lawsuits over the toll of the work. Automation could assist workers by automatically removing hate speech or calls to violence, but skilled people remain necessary to judge irony, nuance, and in-depth political commentary. Telegram frequently claims that it has nothing to hide, yet it goes to great lengths to complicate government access to its servers and data, even when presented with evidence of criminal activity. If Durov wants to keep making trips to Europe on his French passport, he must moderate Telegram more thoroughly to ease governmental pressure.