• Minotaur@lemm.ee · +151/−31 · 9 months ago

    I really don’t like cases like this, nor do I like how much the legal system seems to be pushing “guilty by proxy” rulings for a lot of school shooting cases.

    It just feels very very very dangerous and ‘going to be bad’ to set this precedent where, when someone commits an atrocity, essentially every person and thing they interacted with can be held accountable with nearly the same weight as if they had committed the crime themselves.

    Obviously some basic civil responsibility is needed. If someone says “I am going to blow up XYZ school here is how”, and you hear that, yeah, that’s on you to report it. But it feels like we’re quickly slipping into a point where you have to start reporting a vast amount of people to the police en masse if they say anything even vaguely questionable simply to avoid potential fallout of being associated with someone committing a crime.

    It makes me really worried. I really think the internet has made it easy to be able to ‘justifiably’ accuse almost anyone or any business of a crime if a person with enough power / the state needs them put away for a time.

    • Zak@lemmy.world · +43/−1 · 9 months ago

      I think the design of media products around maximally addictive individually targeted algorithms in combination with content the platform does not control and isn’t responsible for is dangerous. Such an algorithm will find the people most susceptible to everything from racist conspiracy theories to eating disorder content and show them more of that. Attempts to moderate away the worst examples of it just result in people making variations that don’t technically violate the rules.
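      The feedback loop described above can be sketched in a few lines. This is a toy illustration (not any platform's actual code, and the topic names are made up): rank items by how much the user has engaged with each topic, and reinforce on every engagement, so whatever a user lingers on is amplified.

```python
import random

def recommend(user_weights, items):
    """Pick the item whose topic the user has engaged with most.

    items: {item_id: topic}. Tiny random noise breaks ties among
    topics the user has never engaged with.
    """
    return max(items, key=lambda i: user_weights.get(items[i], 0.0)
                                    + random.random() * 0.01)

def update(user_weights, topic, engaged):
    """Naive reinforcement: engagement strictly increases a topic's weight."""
    if engaged:
        user_weights[topic] = user_weights.get(topic, 0.0) + 1.0

user = {}
items = {1: "sports", 2: "conspiracy", 3: "cooking"}

# One engagement with "conspiracy" and it dominates every future pick,
# because nothing in the loop ever pushes the weight back down.
update(user, "conspiracy", engaged=True)
picks = [items[recommend(user, items)] for _ in range(5)]
```

      There is no dampening term here, which is the point of the sketch: a pure engagement-maximizing objective has no mechanism for deciding that more of the same is harmful.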

      With that said, laws made and legal precedents set in response to tragedies are often ill-considered, and I don’t like this case. I especially don’t like that it includes Reddit, which was not using that type of individualized algorithm to my knowledge.

      • rambaroo@lemmynsfw.com · +4/−2 · 9 months ago

        Reddit is the same thing. They intentionally enable and cultivate hostility and bullying there to drive up engagement.

      • deweydecibel@lemmy.world · +7/−6 · 9 months ago

        > Attempts to moderate away the worst examples of it just result in people making variations that don’t technically violate the rules.

        The problem then becomes if the clearly defined rules aren’t enough, then the people that run these sites need to start making individual judgment calls based on…well, their gut, really. And that creates a lot of issues if the site in question could be held accountable for making a poor call or overlooking something.

        The threat of legal repercussions hanging over them is going to make them default to the most strict actions, and that’s kind of a problem if there isn’t a clear definition of what things need to be actioned against.

        • rambaroo@lemmynsfw.com · +6/−3 · 9 months ago

          Bullshit. There’s no slippery slope here. You act like these social media companies just stumbled onto algorithms. They didn’t, they designed these intentionally to drive engagement up.

          Demanding that they change their algorithms to stop intentionally driving negativity and extremism isn’t dystopian at all, and it’s very frustrating that you think it is. If you choose to do nothing about this issue I promise you we’ll be living in a fascist nation within 10 years, and it won’t be an accident.

        • VirtualOdour@sh.itjust.works · +3/−5 · 9 months ago

          It’s the chilling effect they use in China: don’t make it clear what will get you in trouble, and people become too scared to say anything.

          Just another group looking to control expression by the back door

          • rambaroo@lemmynsfw.com · +9/−2 · edited · 9 months ago

            There’s nothing ambiguous about this. Give me a break. We’re demanding that social media companies stop deliberately driving negativity and extremism to get clicks. This has fuck all to do with free speech. What they’re doing isn’t “free speech”, it’s mass manipulation, and it’s very deliberate. And it isn’t disclosed to users at any point, which also makes it fraudulent.

            It’s incredibly ironic that you’re accusing people of an effort to control expression when that’s literally what social media has been doing since the beginning. They’re the ones trying to turn the world into a dystopia, not the other way around.

    • Arbiter@lemmy.world · +21/−1 · 9 months ago

      Yeah, but algorithmic delivery of radicalizing content seems kinda evil though.

    • rambaroo@lemmynsfw.com · +21/−4 · edited · 9 months ago

      I don’t think you understand the issue. I’m very disappointed to see that this is the top comment. This wasn’t an accident. These social media companies deliberately feed people the most upsetting and extreme material they can. They’re intentionally radicalizing people to make money from engagement.

      They’re absolutely responsible for what they’ve done, and it isn’t “by proxy”, it’s extremely direct and deliberate. It’s long past time that courts held them liable. What they’re doing is criminal.

      • Minotaur@lemm.ee · +5/−1 · 9 months ago

        I do. I just very much understand the extent to which the justice system will take decisions like this and use them to accuse any person or business (including you!) of a crime that they can then “prove” they were at fault for.

    • WarlordSdocy@lemmy.world · +14/−1 · 9 months ago

      I think the distinction here is between people and businesses. Is it the fault of people on social media for the acts of others? No. Is it the fault of social media for cultivating an environment that radicalizes people into committing mass shootings? Yes. The blame here is on the social media companies for not doing more to stop the spread of this kind of content. Because yes, even though that won’t stop this kind of content from existing, making it harder to access and find will at least reduce the number of people who go down this path.

      • rambaroo@lemmynsfw.com · +6/−1 · 9 months ago

        I agree, but I want to clarify. It’s not about making this material harder to access. It’s about not deliberately serving that material to people who weren’t looking it up in the first place in order to get more clicks.

        There’s a huge difference between a user looking up extreme content on purpose and social media serving extreme content to unsuspecting people because the company knows it will upset them.

    • morrowind@lemmy.ml · +13/−5 · 9 months ago

      Do you not think if someone encouraged a murderer they should be held accountable? It’s not everyone they interacted with, there has to be reasonable suspicion they contributed.

      Also I’m pretty sure this is nothing new

      • deweydecibel@lemmy.world · +8/−2 · 9 months ago

        Depends on what you mean by “encouraged”. That is going to need a very precise definition in these cases.

        And the point isn’t that people shouldn’t be held accountable; it’s that there are a lot of gray areas here, and we need to be careful how we navigate them. Irresponsible rulings or poorly implemented laws can destabilize everything that makes the internet worthwhile.

      • Minotaur@lemm.ee · +5/−2 · 9 months ago

        I didn’t say that at all, and I think you know I didn’t unless you really didn’t actually read my comment.

        I am not talking about encouraging someone to murder. I specifically said that in overt cases there is some common sense civil responsibility. I am talking about the potential for the police to break down your door because you Facebook messaged a guy you’re friends with what your favorite local gun store was, and that guy also happens to listen to death metal and take antidepressants and the state has deemed him a risk factor level 3.

        • morrowind@lemmy.ml · +3/−1 · 9 months ago

          I must have misunderstood you then, but this still seems like a pretty clear case where the platforms (not even individual people) did encourage him. I don’t think there’s any new precedent being set here.

          • Minotaur@lemm.ee · +1 · 9 months ago

            Rulings often start at the corporation / large major entity level and work their way down to the individual. Think piracy laws. At first, only giant, clear bootlegging operations were really prosecuted for that, and then people torrenting content for profit, and then people torrenting large amounts of content for free - and now we currently exist in an environment where you can torrent a movie or whatever and probably be fine, but also, if the criminal justice system wants to, they can (and have) easily hit anyone who does with a charge for tens of thousands of dollars or years of jail time.

            Will it happen to the vast majority of people who torrent media casually? No. But we currently exist in an environment where if you get unlucky enough or someone wants to punish you for it enough, you can essentially have this massive sentence handed down to you almost “at random”.

        • Ð Greıt Þu̇mpkin@lemm.ee · +6/−2 · 9 months ago

          Is there currently a national crisis of Jacobins kidnapping oligarchs and beheading them in public I am unaware of?

        • rambaroo@lemmynsfw.com · +5/−4 · 9 months ago

          Literally no one suggested that end users should be arrested for jokes on the internet. Fuck off with your attempts at trying to distract from the real issue.

    • Socsa@sh.itjust.works · +7/−1 · 9 months ago

      This wasn’t just a content issue. Reddit actively banned people for reporting violent content too much. They literally engaged with and protected these communities, even as people yelled that they were going to get someone hurt.

    • deweydecibel@lemmy.world · +6/−1 · 9 months ago

      Also worth remembering, this opens up avenues for lawsuits on other types of “harm”.

      We have states that have outlawed abortion. What do those sites do when those states argue social media should be “held accountable” for all the women who are provided information on abortion access through YouTube, Facebook, reddit, etc?

    • Ð Greıt Þu̇mpkin@lemm.ee · +5/−2 · 9 months ago

      I dunno about social media companies but I quite agree that the party who got the gunman the gun should share the punishment for the crime.

      Firearms should be titled and insured, the owner should have an imposed duty to secure them, and the owner ought to face criminal penalties if a firearm titled to them was used by someone else to commit a crime. Either they handed a killer a loaded gun or they inadequately secured a firearm which was then stolen and used in a crime; either way they failed their responsibility to society as a firearm owner and must face consequences for it.

      • solrize@lemmy.world · +4/−1 · 9 months ago

        This guy seems to have bought the gun legally at a gun store, after filling out the forms and passing the background check. You may be thinking of the guy in Maine whose parents bought him a gun when he was obviously dangerous. They were just convicted of involuntary manslaughter for that, iirc.

          • solrize@lemmy.world · +3/−2 · edited · 9 months ago

            Well you were talking about charging the gun owner if someone else commits a crime with their gun. That’s unrelated to this case where the shooter was the gun owner.

            The lawsuit here is about radicalization but if we’re pursuing companies who do that, I’d start with Fox News.

      • Minotaur@lemm.ee · +2/−2 · 9 months ago

        If you lend your brother, who you know is on antidepressants, a long extension cord he tells you is for his back patio - and he hangs himself with it, are you ready to be accused of being culpable for your brother’s death?

        • jkrtn@lemmy.ml · +2 · 9 months ago

          Oh, it turns out an extension cord has a side use that isn’t related to its primary purpose. What’s the analogous innocuous use of a semiautomatic handgun?

          • Minotaur@lemm.ee · +1/−1 · 9 months ago

            Self defense? You don’t have to be a 2A diehard to understand that it’s still a legal object. What’s the “innocuous use” of a VPN? Or a torrenting client? Should we imprison everyone who ever sends a link about one of these to someone who seems interested in their use?

            • jkrtn@lemmy.ml · +1 · 9 months ago

              You’re deliberately ignoring the point that the primary use of a semiautomatic pistol is killing people, whether self-defense or mass murder.

              Should you be culpable for giving your brother an extension cord if he lies that it is for the porch? Not really.

              Should you be culpable for giving your brother a gun if he lies that he needs it for self defense? IDK the answer, but it’s absolutely not equivalent.

              It is a higher level of responsibility: you know lives are in danger if you give them a tool for killing. I don’t think it’s unreasonable if there is a higher standard for loaning it out or leaving it unsecured.

              • Minotaur@lemm.ee · +1/−1 · edited · 9 months ago

                “Sorry bro. I’d love to go target shooting with you, but you started taking Vyvanse 6 months ago and I’m worried if you blow your brains out the state will throw me in prison for 15 years”.

                Besides, you’re ignoring the point. This article isn’t about a gun, it’s about basically “this person saw content we didn’t make on our website”. You think that won’t be extended to general content sent from one person to another? That if you send some pro-Palestine articles to your buddy, and then a year or two later your buddy gets busted at an anti-Zionist rally, now you’re a felon because you enabled that? Boy, that would be an easy way for some hypothetical future administrations to control speech!!

                You might live in a very nice bubble, but not everyone will.

                • jkrtn@lemmy.ml · +1 · 9 months ago

                  So you need a strawman argument transitioning from loaning a weapon unsupervised to someone we know is depressed. Now it is just target shooting with them, so distancing the loan aspect and adding a presumption of using the item together.

                  This is a side discussion. You are the one who decided to write strawman arguments relating guns to extension cords, so I thought it was reasonable to respond to that. It seems like you’re upset that your argument doesn’t make sense under closer inspection and you want to pull the ejection lever to escape. Okay, it’s done.

                  The article is about a civil lawsuit, nobody is going to jail. Nobody is going to be able to take a precedent and sue me, an individual, over sharing articles to friends and family, because the algorithm is a key part of the argument.

        • rambaroo@lemmynsfw.com · +3/−3 · edited · 9 months ago

          Knowingly manipulating people into suicide is a crime and people have already been found guilty of doing it.

          So the answer is obvious. If you knowingly encourage a vulnerable person to commit suicide, and your intent can be proved, you can and should be held accountable for manslaughter.

          That’s what social media companies are doing. They aren’t loaning you extremist ideas to help you. That’s a terrible analogy. They’re intentionally serving extreme content to drive you into more and more upsetting spaces, while pretending that there aren’t any consequences for doing so.

        • Ð Greıt Þu̇mpkin@lemm.ee · +1/−5 · 9 months ago

          Did he also use it as improvised ammunition to shoot up the local elementary school with the cord, to warrant it being considered a firearm?

          I’m more confused where I got such a lengthy extension cord from! Am I an event manager? Do I have generators I’m running cable from? Do I get to meet famous people on the job? Do I specialize in fairground festivals?

          • Minotaur@lemm.ee · +1/−1 · 9 months ago
            9 months ago

            …. Aside from everything else, are you under the impression that a 10-15 ft extension cord is an odd thing to own…?

    • jumjummy@lemmy.world · +2/−1 · 9 months ago

      And ironically the gun manufacturers or politicians who support lax gun laws are not included in these “nets”. A radicalized individual with a butcher knife can’t possibly do as much damage as one with a gun.

  • Socsa@sh.itjust.works · +59/−3 · 9 months ago

    Please let me know if you want me to testify that reddit actively protected white supremacist communities and even banned users who engaged in direct activism against these communities

  • PorkSoda@lemmy.world · +51/−1 · edited · 9 months ago

    Back when I was on reddit, I subscribed to about 120 subreddits. Starting a couple years ago though, I noticed that my front page really only showed content for 15-20 subreddits at a time and it was heavily weighted towards recent visits and interactions.

    For example, if I hadn’t visited r/3DPrinting in a couple weeks, it slowly faded from my front page until it disappeared altogether. It was so bad that I ended up writing a browser automation script to visit all 120 of my subreddits at night and click the top link. This ended up giving me a more balanced front page that mixed in all of my subreddits and interests.

    My point is these algorithms are fucking toxic. They’re focused 100% on increasing time on page and interaction with zero consideration for side effects. I would love to see social media algorithms required by law to be open source. We have a public interest in knowing how we’re being manipulated.
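    The workaround described above is easy to picture. This is a minimal sketch of that kind of script, not the commenter's actual code: the subreddit list, URL format, and helper names are assumptions, and a real version would drive a browser (e.g. with Selenium or Playwright) to load each page and click the top post.

```python
def subreddit_urls(subreddits):
    """Build a daily top-posts URL for each subscribed subreddit.

    The old.reddit.com URL scheme is assumed here for illustration.
    """
    return [f"https://old.reddit.com/r/{name}/top/?t=day" for name in subreddits]

def pick_top_link(listing):
    """Given (title, score) pairs scraped from a page, return the
    highest-scored title - the 'top link' the script would click."""
    return max(listing, key=lambda post: post[1])[0]

if __name__ == "__main__":
    subs = ["3DPrinting", "AskHistorians", "woodworking"]  # stand-in for ~120 subs
    for url in subreddit_urls(subs):
        # A real version would open each URL in an automated browser,
        # scrape the listing, and click pick_top_link(...) on the page,
        # registering a nightly "visit" for every subscription.
        print(url)
```

    The point of the nightly loop is simply to keep every subscription's recency signal fresh, so the recency-weighted ranking described above stops starving the less-visited subreddits.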

    • Fedizen@lemmy.world · +12 · 9 months ago

      I used the Google News phone widget years ago and clicked on a giant-asteroid article, and for whatever reason my entire feed became asteroid/meteor articles. It’s also just such a dumb way to populate feeds.

      • Corhen@lemmy.world · +4 · 9 months ago

        That’s why I always use YouTube by subscriptions first, then only delve into the regular front page if there’s nothing interesting in my subscriptions.

  • Krudler@lemmy.world · +35/−1 · edited · 9 months ago

    I just would like to show something about Reddit. Below is a post I made about how Reddit was literally harassing and specifically targeting me, after I let slip in a comment one day that I was sober - I had previously never made such a comment because my sobriety journey was personal, and I never wanted to define myself or pigeonhole myself as a “recovering person”.

    I reported the recommended subs and ads to Reddit Admins multiple times and was told there was nothing they could do about it.

    I posted a screenshot to DangerousDesign and it flew up to like 5K+ votes in like 30 minutes before admins removed it. I later reposted it to AssholeDesign where it nestled into 2K+ votes before shadow-vanishing.

    Yes, Reddit and similar are definitely responsible for a lot of suffering and pain at the expense of humans in the pursuit of profit. After it blew up and front-paged, “magically” my home page didn’t have booze-related ads/subs/recs any more! What a total mystery how that happened /s

    The post in question, and a perfect “outing” of how Reddit continually tracks and tailors the User Experience specifically to exploit human frailty for their own gains.

    Edit: Oh and the hilarious part that many people won’t let go (when shown this) is that it says it’s based on my activity in the Drunk reddit which I had never once been to, commented in, posted in, or was even aware of. So that just makes it worse.

    • mlg@lemmy.world · +13 · 9 months ago

      It’s not Reddit if posts don’t get nuked or shadowbanned by literal sitewide admins.

      • Krudler@lemmy.world · +6 · 9 months ago

        Yes I was advised in the removal notice that it had been removed by the Reddit Administrators so that they could keep Reddit “safe”.

        I guess their idea of “safe” isn’t 4+ million users going into their privacy panel and turning off exploitative sub recommendations.

        Idk though I’m just a humble bird lawyer.

    • KairuByte@lemmy.dbzer0.com · +7 · edited · 9 months ago

      Yeah this happens a lot more than people think. I used to work at a hotel, and when the large sobriety group got together yearly, they changed bar hours from the normal hours, to as close to 24/7 as they could legally get. They also raised the prices on alcohol.

  • The_Tired_Horizon@lemmy.world · +18 · 9 months ago

    I gave up reporting on major sites where I saw abuse. Stuff that, if you’d said it in public with witnesses, you’d be investigated for. Twitter was also bad for responding to reports with “this doesn’t break our rules” when a) it clearly did and b) it probably broke a few laws.

    • Alien Nathan Edward@lemm.ee · +12 · edited · 9 months ago

      I gave up after I was told that people DMing me photographs of people committing suicide was not harassment but me referencing Yo La Tengo’s album “I Am Not Afraid Of You And I Will Beat Your Ass” was worthy of a 30 day ban

      • Panda (he/him)@lemmy.dbzer0.com · +4 · 9 months ago

        I remember one time somebody tweeted asking what the third track off Whole Lotta Red was, and I watched at least 50 people get perma’d before my eyes.

        The third track is named Stop Breathing.

      • The_Tired_Horizon@lemmy.world · +3 · 9 months ago

        On youtube I had a persistent one who only stopped threatening to track me down and kill me (for a road safety video) when I posted the address of a local police station and said “pop in, any time!”

        • otp@sh.itjust.works · +1/−1 · 9 months ago

          Americans online regularly tell me that that’s protected free speech down there! Haha

  • Scott@sh.itjust.works · +23/−6 · 9 months ago

    Excuse me what in the Kentucky fried fuck?

    As much as everyone says “fuck these big guys” all day, this hurts everyone.

  • Fedizen@lemmy.world · +21/−5 · 9 months ago

    media: Video games cause violence

    media: Weird music causes violence.

    media: Social media could never cause violence this is censorship (also we don’t want to pay moderators)

    • Eximius@lemmy.world · +4/−2 · edited · 9 months ago

      Since “media” (which you define by the tropes of unsubstantiated news outlets) couldn’t sensibly refer to a forum like Reddit or even Facebook, this makes no sense.

  • Melllvar@startrek.website · +13 · 9 months ago

    I think there’s definitely a case to be made that recommendation algorithms, etc. constitute editorial control and thus the platform may not be immune to lawsuits based on user posts.

  • casual_turtle_stew_enjoyer@sh.itjust.works · +13 · 9 months ago

    I will testify under oath with evidence that Reddit, the company, has not only turned a blind eye to but also encouraged and intentionally enabled radicalization on their platform. It is the entire reason I am on Lemmy. It is the entire reason for my username. It is the reason I questioned my allyship with certain marginalized communities. It is the reason I tense up at the mention of turtles.

  • TropicalDingdong@lemmy.world · +12/−2 · 9 months ago

    I don’t understand how a social media company can face liability in this circumstance but a weapons manufacturer doesn’t.

    • gum_dragon@lemm.ee · +4 · 9 months ago

      Or individuals who repeatedly spread the specific hateful ideology that radicalizes people and encourages them to act on it.

  • Kalysta@lemmy.world · +13/−5 · 9 months ago

    Love Reddit’s lies about them taking down hateful content when they’re 100% behind Israel’s genocide of the Palestinians and will ban you if you say anything remotely negative about Israel’s government. And the amount of transphobia on the site is disgusting. Let alone the misogyny.

    • captainlezbian@lemmy.world · +3/−3 · 9 months ago

      Lol, yeah I moderated major trans subreddits for years. It was entirely hit-and-miss whether we’d get support from the admins.