• Deello@lemm.ee · 4 months ago

      I mean, yes, but that’s like saying Bitcoin is used by criminals to buy drugs and weapons. The problem is that crime isn’t the only use for these tools.

      • stonerboner@lemmynsfw.com · 4 months ago

        Yep. The issue is that they put out a tool that does some good things, but is also heavily adopted by criminals who piggyback on it.

        Should we let child abuse just proliferate with these tools because there’s so much need for privacy? How do you weed out the bad without kneecapping the good? There’s no good answer here. The same features that make the tech work for good enable the bad parts, too.

        Continuing to develop it requires a certain level of knowledge and acceptance of the bad parts. It’s a catch-22: law enforcement has to choose between sacrificing privacy and allowing a tool to exist that proliferates child abuse material and other ills.

        There are valid arguments for the importance of privacy, and valid arguments for making sure these crimes don’t have a safe haven. Action toward either end will hurt some people and enrage others.

    • pedroapero@lemmy.ml · 4 months ago

      No, every service provider must remove infringing content when it is reported. Telegram doesn’t do that.