What the Telegram founder’s arrest means for the regulation of social media firms


So we’ve entered a world in which the CEOs of major social networks are arrested and detained. That’s quite a shift – and it didn’t come in a way anyone was expecting. From Jennifer Rankin in Brussels:

French judicial authorities on Sunday extended the detention of the Russian-born founder of Telegram, Pavel Durov, after his arrest at a Paris airport over alleged offences related to the messaging app.

When this phase of detention ends, the judge can decide to free him or press charges and remand him in further custody.

French investigators had issued a warrant for Durov’s arrest as part of an inquiry into allegations of fraud, drug trafficking, organised crime, promotion of terrorism and cyberbullying.

Durov – a citizen of France, the United Arab Emirates, Saint Kitts and Nevis, and Russia, the country of his birth – was arrested as he stepped off his private jet after returning from Azerbaijan’s capital, Baku. On Sunday evening, Telegram issued a statement:

⚖️ Telegram abides by EU laws, including the Digital Services Act – its moderation is within industry standards and constantly improving.

✈️ Telegram’s CEO Pavel Durov has nothing to hide and travels frequently in Europe.

😵‍💫 It is absurd to claim that a platform or its owner are responsible for abuse of that platform.

On Monday, French authorities said that Durov’s arrest was part of an investigation into cybercrime:

The Paris prosecutor, Laure Beccuau, said the investigation concerned crimes related to illicit transactions, child sexual abuse, fraud and the refusal to communicate information to authorities.

On the face of it, the arrest seems a sharp break from the norm. Governments have exchanged strong words with messaging platform providers before, but rarely have there been arrests. When platform operators do get arrested, as in the cases of Ross Ulbricht for Silk Road and Kim Dotcom for Megaupload, it tends to be because authorities can argue that the platform wouldn’t even exist were it not for crime.

Telegram has long operated as a low-moderation service, in part because of its roots as a chat app rather than a social network, in part because of Durov’s own experience in dealing with the Russian censors, and in part – many have alleged – because it is simply cheaper to have few moderators and less hands-on control of your platform.

But even if a soft-touch moderation team might open a company up to fines under laws like the UK’s Online Safety Act and the EU’s Digital Services Act, it’s rare for it to lead to personal charges – and rarer still for those charges to result in an executive being remanded in custody.

Encryption

But there is one quirk about Telegram that means it’s in a somewhat different position to peers such as WhatsApp and Signal: the service is not end-to-end encrypted.

WhatsApp, Signal and Apple’s iMessage are built from the ground up to prevent anyone other than the intended recipient from reading content shared on the services. That includes the companies that run the platforms – as well as any law enforcement that might request their help.

That design has caused no end of friction between some of the largest tech companies in the world and the governments that regulate them but, for the time being, the tech companies appear to have won the main fight. No one is seriously demanding end-to-end encryption be outlawed any more, with regulators and critics instead calling for approaches such as “client-side scanning” to try to police messaging services another way.
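For the uninitiated, “client-side scanning” means checking content on the user’s own device, before it is encrypted, against fingerprints of known illegal material. Here’s a deliberately toy Python sketch of the general idea: it is my own illustration, not any regulator’s or platform’s actual design, which would use perceptual image hashes and far more careful matching than a plain hash lookup.

import hashlib

# Hypothetical blocklist of fingerprints of known illegal material, shipped to the client.
# The single entry below is just the MD5 of b"hello", so this demo has something to match.
KNOWN_BAD_HASHES = {"5d41402abc4b2a76b9719d911017c592"}

def cleared_to_send(content: bytes) -> bool:
    # Runs on the device before encryption; True means the message may be sent.
    return hashlib.md5(content).hexdigest() not in KNOWN_BAD_HASHES

outgoing = b"hello"
if cleared_to_send(outgoing):
    print("No match: hand the message to the encryption layer and send it")
else:
    print("Match: block or report the message before it is ever encrypted")

The point, and the controversy, is that the check happens before encryption: messages stay end-to-end encrypted in transit, but the device itself becomes the checkpoint.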

Telegram is different. The service does offer end-to-end encryption, through a little-used opt-in feature called “secret chats” but, by default, conversations are encrypted only insofar as they can’t be read by any random person connected to your wifi network. To Telegram itself, any message sent outside a “secret chat” – which includes every group chat, and every message and comment on one of the service’s broadcast “channels” – is effectively in the clear.
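To make that distinction concrete, here is a minimal Python sketch using the PyNaCl library. It is a conceptual illustration only, not Telegram’s actual MTProto protocol or Signal’s: the point is the difference between an operator that merely relays ciphertext it has no key for and one that holds the key itself.

import nacl.utils
from nacl.public import PrivateKey, Box
from nacl.secret import SecretBox

# End-to-end model (Signal, WhatsApp, Telegram's opt-in "secret chats"):
# Alice encrypts straight to Bob's public key; the operator relays ciphertext it cannot read.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"meet at noon")
print(Box(bob_key, alice_key.public_key).decrypt(ciphertext))  # only Bob can do this

# Server-readable model (conceptually, Telegram's default cloud chats):
# the message is encrypted in transit and at rest, but with a key the operator holds.
operator_key = nacl.utils.random(SecretBox.KEY_SIZE)
stored = SecretBox(operator_key).encrypt(b"meet at noon")
print(SecretBox(operator_key).decrypt(stored))  # the operator can read it at will

The practical consequence is the one the rest of this piece turns on: an operator in the second position can, in principle, comply with a lawful request for message content; an operator in the first cannot.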

That product decision marks Telegram out as distinct from its peers. But, oddly, the company’s marketing implies the distinction is almost exactly the opposite. Cryptography expert Matthew Green:

Telegram CEO Pavel Durov has continued to aggressively market Telegram as a “secure messenger.” Most recently he issued a scathing criticism of Signal and WhatsApp on his personal Telegram channel, implying that those systems were backdoored by the US government, and only Telegram’s independent encryption protocols were really trustworthy.

It no longer feels amusing to see the Telegram organization urge people away from default-encrypted messengers, while refusing to implement essential features that would widely encrypt their own users’ messages. In fact, it’s starting to feel a bit malicious.

Can’t v won’t

Paper planes placed outside the French embassy in Moscow in support of Pavel Durov after his arrest in France. Photograph: Yulia Morozova/Reuters

The result of this mismatch between Telegram’s technology and its marketing is an unfortunate one. The company – and Durov personally – markets its app to people who are concerned that WhatsApp and even Signal, the gold standard of secure messengers, aren’t secure enough for their needs, and specifically aren’t secure enough against the US government.

At the same time, if a government comes knocking at Telegram’s door asking for information on a wrongdoer, real or perceived, Telegram doesn’t have the same protection that its peers do. An end-to-end encrypted service can sincerely tell law enforcement that it can’t help them. In the long run, that tends to create a fairly hostile atmosphere, but it also turns the conversation into a general one about principles of privacy versus policing.

Telegram, by contrast, has to pick. Either it helps law enforcement, or it ignores them, or it actively says it won’t cooperate. That’s no different from the options facing the vast majority of companies online, from Amazon to Zoopla – but only Telegram has a user base made up of people who want security against law enforcement.

Every time Telegram says “yes” to police, it pisses off that user base. Every time it says “no”, it plays a game of chicken with law enforcement.

The contours of the disagreement between France and Telegram will inevitably be crushed down into a conversation about “content moderation”, with supporters clustering accordingly (Elon Musk has already waded in, tweeting “#FreePavel”). But that conversation is normally about material posted in public: about what X or Facebook should or shouldn’t do to manage the discourse on their sites. Private and group messaging services are a fundamentally different offering, which is why the end-to-end encrypted mainstream services exist at all. But in trying to straddle both markets, Telegram may have ended up with the defences of neither.

Last call for questions

My last day at the Guardian is rapidly approaching and next week’s email is being turned over to you, the readers. If there’s a question you’ve wanted to know the answer to, something that’s been niggling at the back of your mind for years, or if you’re just nosy about the inner workings of Techscape, hit reply to this email or contact me directly at alex.hern@theguardian.com. Ask me anything.

If you want to read the complete version of the newsletter please subscribe to receive TechScape in your inbox every Tuesday.


