Messaging app makers' dilemma: Keeping comms private and funding open source

Not upsetting law enforcement over end-to-end encryption, and finding a sustainable way to fund open source development, are challenges facing messaging giants and minnows alike. Telegram cleanup? Announcing that the platform was removing content from those who "violated our terms of service," Telegram chief Durov said on his …

  1. Tron Silver badge

    You can't do this stuff commercially any more.

    Western governments are switching to the Chinese model and will expect a back door in absolutely everything.

    Whether messaging or social media, the government will simply lock you up for 'enabling', just as they lock up Royal Mail CEOs every time a letter bomb goes off. No, wait...

    The core software will have to be a free product released anonymously and widely circulated (on physical media). You may be able to sell products that plug in to its documented architecture to monetise it, with adverts etc, but the state, looking ahead, will expect access to absolutely everything anyone does online. They are using Orwell's 'Nineteen Eighty-Four' as a manual.

    As with Brexit, there will always be enough gullible people to join the mob and demand whatever the government wants them to. One child with hurt feelings is one too many. Vote now for universal surveillance, censorship and a ban on encryption to end this madness! And they will. It's taken the Empire this long to strike back because there aren't many spare IQ points in government. But they got there in the end.

    The same thing happened in the UK in 1737. Walpole's corrupt, incompetent regime were ridiculed nightly on the stage. They brought in state licensing of drama. The censorship lasted until 1968.

    Western governments are as good at censorship as the Chinese, but they have to work out how to do it on the sly, so they can continue to pretend to be morally superior to dictatorships.

    1. Dan 55 Silver badge

      Re: You can't do this stuff commercially any more.

      Telegram is not encrypted in any meaningful sense. If you buy a new device and log onto Telegram then you get all your channels, groups, and 1-1 chat history back on the new device. If Telegram has all the data and is served with a warrant, it's got to hand over all the data. This is not a back door.

      The only thing you don't get back on a new device is E2E-encrypted "secret chats", a feature buried in the UI which requires one of the parties in a 1-1 chat to say something like "can we switch to an E2E-encrypted chat?" That's an obvious red flag in itself, and will prompt the police or courts to serve a warrant for one or both devices once they have been identified.

      The reason Telegram became a den of vice and iniquity is not encryption, which its users probably aren't using anyway; it's Telegram's policy of ignoring requests from police and courts. France has forced it to join the real world, at least for a while.
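      The distinction above can be sketched in code. This is a toy model, not Telegram's actual MTProto; all names are invented and the "cipher" is a deliberately simple XOR keystream, purely to show why a warrant served on the server recovers cloud chats but not secret chats:

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher (XOR with a SHA-256 keystream) -- illustration only."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

class Server:
    """Stand-in for a Telegram-style server (hypothetical API)."""
    def __init__(self):
        self.cloud_key = secrets.token_bytes(32)  # the SERVER holds this key
        self.stored = {}

    def store_cloud_chat(self, chat_id, plaintext):
        # "Cloud" chat: encrypted at rest, but with a key the server itself
        # holds -- so a warrant served on the server recovers the plaintext.
        self.stored[chat_id] = keystream_xor(self.cloud_key, plaintext)

    def store_secret_chat(self, chat_id, ciphertext):
        # E2E "secret" chat: the clients agreed a key the server never saw;
        # the server can only store and relay opaque ciphertext.
        self.stored[chat_id] = ciphertext

    def comply_with_warrant(self, chat_id, is_cloud):
        blob = self.stored[chat_id]
        return keystream_xor(self.cloud_key, blob) if is_cloud else blob

server = Server()
server.store_cloud_chat("cloud", b"meet at noon")

client_key = secrets.token_bytes(32)  # exists only on the two devices
server.store_secret_chat("secret", keystream_xor(client_key, b"meet at noon"))

print(server.comply_with_warrant("cloud", True))    # b'meet at noon'
print(server.comply_with_warrant("secret", False))  # opaque bytes only
```

      The point is architectural, not cryptographic: whoever holds the key answers the warrant.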

    2. David 164

      Re: You can't do this stuff commercially any more.

      Actually, that's the US model, which is essentially the Five Eyes model; they have been demanding backdoors into everything for decades.

    3. This post has been deleted by its author

  2. Tron Silver badge

    PS.

    We really need to start producing distributed software on a distributed internet.

  3. Anonymous Coward

    "Telegram chief Durov said on his own channel this week that "We won't let bad actors jeopardize the integrity of our platform for almost a billion users.""

    Well that's Steven Seagal screwed then...

  4. Adair Silver badge

    Personally...

    I've always been happy to pay a nominal subscription rate to services I value.

    Ideally (maybe it exists - please tell me) I would use a hub payment service where I could allocate a share of my monthly subscription among the services I support.

    Instead, there only seem to be very random and unwieldy ways to pay small amounts to a range of different, and changing, software/service providers. That doesn't encourage support.

    1. Phil Koenig Bronze badge

      Subscription fees

      "Micropayments" were a hot topic decades ago in the online community: a way to make it easier to fund online businesses and resources without having to pay a giant one-off fee for something one rarely uses.

      But the corporate marketroids eventually convinced the world of e-commerce to propagate a childish lie instead: that everything online was actually "free"!

      And concurrently, they found it more lucrative to monetize every click, glance, email and life-event by blanketing the online world (and the offline one too, as it turns out) with a vast panoply of trackers using obfuscated JavaScript, served from internet domains hidden behind "privacy shields" (originally meant to protect individual domain-owners from the likes of such zealous marketeers).

      And it was.

      May the last shred of personal privacy rest in peace.

  5. Anonymous Coward

    Ironically, I don't have much problem with the police having warrant access to my messages.

    What I don't like is companies using my chats to increase my insurance premiums or to decide they can charge me more for services, or selling anything and everything to Palantir et al, so that the police don't need to even bother with a warrant.

    The solution is simply to have privacy law that makes it completely illegal to do anything with my messages, except deliver them.

  6. StrangerHereMyself Silver badge

    Plausible deniability

    The use of E2EE allows chat application makers to claim "plausible deniability" since they don't know what's being communicated on their platforms. This is why Apple and WhatsApp (Meta) have gone that route, to be rid of pressure groups always demanding they "do more" against CSAM, grooming and sexting.

    I predict Telegram will now go the full monty with E2EE. First order of business is to get Durov out of jail, and they're attempting that by making all sorts of concessions, probably ratting out everyone the LEAs point their fingers at.

    1. Phil Koenig Bronze badge

      Re: Plausible deniability

      StrangerHereMyself wrote:

      This is why Apple and WhatsApp (Meta) have gone that route, to be rid of pressure groups always demanding they "do more" against CSAM, grooming and sexting.

      Except Apple did nothing of the sort.

      Yes, they implemented E2EE in some parts of their platform, while simultaneously adding what amounts to back-doors for law enforcement.

      First is iCloud. While various parts of iCloud can in theory be end-to-end encrypted, standard iCloud backups are not (unless you opt in to Advanced Data Protection).

      And since iCloud backups are enabled by default, and most people (especially Apple users) don't diddle around with the defaults, when Big Brother comes a-knockin', all Apple has to do is retrieve one of those nice iCloud backups of yours (to which it retains the decryption key) and hand it over to the authorities to peruse. Neat.

      Furthermore, back in 2021 Apple proposed and prepared to deploy - to much outcry amongst various tech and digital privacy advocates - what probably amounts to the single most intrusive mass personal-data snooping regime ever proposed for a personal digital platform.

      This diabolical plan would silently and continuously scan, on the user's own device, every photo destined for iCloud Photos and compute unique digital hashes ("fingerprints") of each one. Apple would then accept from various "advocacy groups" (claiming to be devoted to "saving the children", or whatever other heartstring-pulling mission label most effectively disables people's logical powers of discrimination) large lists of such hashes supposedly matching illegal content discovered in the wild.

      And if a "match was found", Apple would then work with the relevant advocacy group and law enforcement to investigate the owner of the device upon which one or more "matches" were found. Across Apple's entire userbase.

      Needless to say, abuse of such an "Eye of Sauron" style mass public surveillance mechanism would be relatively trivial.
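      The abuse problem is visible even in a caricature. Note the real proposal used "NeuralHash", a perceptual hash, plus a match threshold and threshold secret-sharing before human review; this sketch substitutes exact SHA-256 set membership, and every file name and list entry in it is invented. The structural issue survives the simplification: whoever controls the opaque hash list controls what gets flagged.

```python
import hashlib

def file_hash(data: bytes) -> str:
    # Toy stand-in: exact SHA-256. The actual proposal used a perceptual hash
    # designed to match visually similar images, not identical bytes.
    return hashlib.sha256(data).hexdigest()

def scan_device(files: dict, submitted_hashes: set) -> list:
    """Flag every stored file whose hash appears on the opaque external list."""
    return [path for path, data in files.items()
            if file_hash(data) in submitted_hashes]

# The device owner can never inspect what the hash list actually targets:
blocklist = {
    file_hash(b"known-illegal-image-bytes"),
    file_hash(b"politically-inconvenient-leaflet"),  # silent scope creep
}

device = {
    "photos/cat.jpg": b"cat picture bytes",
    "docs/leaflet.pdf": b"politically-inconvenient-leaflet",
}

print(scan_device(device, blocklist))  # ['docs/leaflet.pdf']
```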

      Orwell could not have imagined a more dystopian plan.

      Luckily, the global outcry over this proposal was so heated that Apple was forced first to step back, and ultimately, in 2023, to abandon its plans to deploy the system.

      But the reputational stain of even proposing such a scheme will stay with Apple for years to come.


      Apple’s Decision to Kill Its CSAM Photo-Scanning Tool Sparks Fresh Controversy (WIRED, 2023-09)

      Apple CSAM controversy continues: Charity says company under-reporting (9 to 5 Mac, 2024-07)

      1. StrangerHereMyself Silver badge

        Re: Plausible deniability

        They made this Orwellian concession because they were under constant pressure from save-the-children organizations to do more against CSAM. I guess they panicked, and some dull knife in the drawer came up with this plan without ever thinking through its implications.

        Yes, it hurt their reputation but not fatally so.

        BTW, iCloud can be E2EE (if you enable Advanced Data Protection) and iMessage already is. Notice that since they implemented this, the vocal calls from sexual abuse organizations have gone quiet.

  7. Arthur Daily

    Orwell's 1984 is coming to you

    Telegram was never secure. The problem with rules and takedown requests is that they are not visible. "Illegal" and "criminal" get muddled together with PC correctness, people taking offence, parodies, or just plain presidential argy-bargy suppressing the opposition. And yes, the UK is an offender with overreach, while the USA is working on qualifying the 1st Amendment and getting web newspapers and TikTok banned.

    The young generation will work it out, via VPNs and private encryption on top of the ISP's. Signal is open source, and its frequent key rotations on top of a paranoid design are totally robust, at least until some hardware management engine (Intel's ME, say) or bad OS decides to exfiltrate private keys secretly. There is no secure phone either.

    The upshot is that the new young generation is no longer swallowing political BS, and the politicians do not like it. Most ISPs since 1990 freely ratted out the worst of the worst when the vomit nerve was triggered. Here come the Stasi, again. 1939 has been forgotten.

  8. GeekyOldFart

    I've been using encryption and digital signatures since early alphas of PGP. I'd prefer all messaging traffic to use E2EE by default, with only myself and the recipient(s) able to read the traffic.

    There is nothing to stop the cops turning up at my doorstep with a warrant for my private keys, which wouldn't bother me overmuch as I'm a pretty law-abiding guy. But the point is that such a warrant requires cause to be issued. It can be challenged in the courts. I can, should I wish to, have MY lawyer breathing down their necks as they use those keys, making sure they are only used to look at material covered by that warrant.

    Back doors and technical channels into services I use do not have that safeguard. You execute a warrant, it's public info. Pull a data dump from a provider without one, not so much. Quis custodiet ipsos custodes?

  9. Catkin Silver badge

    Paradox of security

    -criminals use E2EE to facilitate crimes

    -E2EE cannot be completely banned without seriously limiting the Internet

    -not having E2EE puts me at greater risk from criminals

    -politicians admit they cannot entirely combat crime if criminals use E2EE

    Therefore, banning E2EE will not make me safer. If it is possible to backdoor encrypted communications yet keep them safe from criminals, the politicians proposing such measures can demonstrate it for a few years by using it themselves for their most sensitive communications. In fact, go one better: they can put cameras around their homes (even in the bedroom and bathroom), with the stream secured only by a key held under the same conditions they propose for our escrowed keys, and otherwise accessible to anyone.

    1. Anonymous Coward

      Re: Paradox of security

      >Therefore, banning E2EE will not make me safer.

      What you have written is plausible-sounding, but based on nothing.

      Whether you are better or worse off without E2EE depends, in each particular case, on the sum of (probability * cost) over the myriad ways you can be harmed: by criminal acts the police might thwart given access, versus criminal acts that E2EE itself might thwart.

      Without the numbers, the conclusion rests on nothing at all, other than the prevalent ideas that (a) limiting encryption is impossible, (b) bad things will happen, and (c) if E2EE is banned in one arena it will somehow become unavailable in another.
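      The sum being described is just an expected-value calculation. A minimal sketch, where every probability and cost is invented purely to show the shape of the argument, not to support either side:

```python
def expected_harm(scenarios):
    """Sum of probability * cost over a list of (probability, cost) pairs."""
    return sum(p * cost for p, cost in scenarios)

# All numbers below are made up; the structure is the point.
with_e2ee = [
    (0.001, 50_000),  # fraud the police might have stopped with message access
    (0.010,  2_000),  # scams that slip through because warrants go dark
]
without_e2ee = [
    (0.005, 10_000),  # criminals exploiting intercepted or leaked messages
    (0.030,  1_000),  # insurers/data brokers misusing message contents
]

print(expected_harm(with_e2ee), expected_harm(without_e2ee))
# Which side is larger depends entirely on the numbers plugged in.
```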

      For example, it is commonly suggested that banning E2EE will put banking at risk.

      This is bilge, as there is no connection between forcing WhatsApp to allow police to read messages and prohibiting banking apps from using E2EE, which nobody has ever suggested.

      This is not to say that banning E2EE is a good or a bad idea, just that your assertions about it are baseless. Obviously, baseless assertions are not going to convince any government that an idea is poor; in fact they are generally an indication that the idea is OK, because if it weren't, the opponents would bring up strong arguments, not baseless ones.

      1. Phil Koenig Bronze badge

        Re: Paradox of security

        Cowardly Poster wrote:

        What you have written is plausible-sounding, but based on nothing.

        The idea that it makes us all safer to trust the government to snoop on anything they want to, whenever they want to, is a well-known fallacy.

        In effect, banning strong e2ee is asking for exactly that.

        Asking for some sort of government "backdoor" to any form of strong encryption that can be deployed by citizens is the same.

        Here in the USA I remember well the mid-1990s controversy over the USGOV's proposed "Clipper Chip", which would have required device makers to use a government-developed encryption chip containing a "backdoor" that supposedly allowed only the government to access communications that would be impossible for anyone else to decrypt. The technology was not publicly documented (it was actually classified Secret); the rationale given was to prevent "bad actors" from learning how to bypass it.

        Sounds legit, right? Well not to a cryptographer.

        Cryptographers and other technology experts explained at the time why this type of system created a false sense of security. Among other things, in cryptography the last thing you want is an encryption scheme or protocol that cannot be publicly inspected and tested for flaws before anyone relies on it for sensitive data.

        So, on top of the logical and practical flaws in such a "nanny system": once USGOV backed down on this ill-advised scheme and the technology was later declassified, researchers corroborated their suspicions and found a pile of technical flaws in it, including tactics that could completely bypass the central "key escrow" function.
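        The best-known of those bypass tactics, Matt Blaze's 1994 LEAF attack, can be caricatured in code. This is a toy model, not Skipjack or the real LEAF format, and every name in it is invented; the point is only that a 16-bit check leaves roughly 65,536 random tries between a rogue device and a LEAF that validates while escrowing garbage:

```python
import hashlib
import secrets

_CHIP_SECRET = secrets.token_bytes(16)  # known only inside the tamper-proof chip

def _checksum16(field: bytes) -> int:
    # Keyed 16-bit check: outsiders cannot compute it directly, but anyone can
    # submit a candidate LEAF to a chip and observe whether it validates.
    return int.from_bytes(hashlib.sha256(_CHIP_SECRET + field).digest()[:2], "big")

def chip_make_leaf(escrow_field: bytes) -> bytes:
    """Honest chip: escrow field + valid 16-bit checksum (heavily simplified)."""
    return escrow_field + _checksum16(escrow_field).to_bytes(2, "big")

def chip_validate(leaf: bytes) -> bool:
    field, check = leaf[:-2], int.from_bytes(leaf[-2:], "big")
    return _checksum16(field) == check

def forge_leaf() -> bytes:
    """Rogue device: random garbage passes a 16-bit check after ~65,536 tries."""
    while True:
        candidate = secrets.token_bytes(16 + 2)  # junk field + junk checksum
        if chip_validate(candidate):
            return candidate

honest = chip_make_leaf(b"real-escrowed-session-key")
bogus = forge_leaf()
assert chip_validate(honest) and chip_validate(bogus)
# 'bogus' is accepted by the system, but its escrow field is useless garbage.
```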

        Fear of exactly this kind of government control over cryptography is what drove people like Phil Zimmermann to release PGP, in 1991, two years before Clipper was even announced, building on the public-key cryptography invented by Diffie, Hellman and the MIT trio behind RSA back in the 1970s. The Clipper outrage only swelled the ranks of the code rebels.

        30 years later, the world has not fallen apart and been taken over by criminals empowered by encryption. In the real world, the key to getting the upper hand against the bad guys is often a lot simpler than the spooks would like you to believe:

        xkcd: $5 wrench

        Crypto: How the Code Rebels Beat the Government – Saving Privacy in the Digital Age (book review)
