Very, Very Few People Are Falling Down the YouTube Rabbit Hole | The site’s crackdown on radicalization seems to have worked. But the world will never know what was happening before that.

  • TropicalDingdong@lemmy.world · 1 year ago · +107/−2

    Bro, people were eating Tide Pods and we saw a resurgence of Nazism and white nationalism.

    I think we at least know the effects of what was happening before.

      • CALIGVLA@lemmy.dbzer0.com · 1 year ago · +14/−2

        Just get 4chan to convince the imbeciles that it’s a white supremacist symbol like they did with the okay sign.

        • Duamerthrax@lemmy.world · 1 year ago · +10/−1

          You got that turned around. 4chan convinced politicians/pundits the ok symbol was white supremacist. Honestly, it worked, but they should have picked the shocker. Would have actually been funny.

          • CALIGVLA@lemmy.dbzer0.com · 1 year ago · +4/−2

            4chan convinced politicians/pundits the ok symbol was white supremacist.

            Exactly, the imbeciles. Which is even more funny considering the original intention was to convince liberals and other lefties that the ok symbol was racist just for the hell of it, but it backfired and only braindead nazis and fascists ended up believing it.

            • Duamerthrax@lemmy.world · 1 year ago · +1/−1

              Maybe it was a response to “Trap” suddenly becoming transphobic? That’s not a hill that’s worth dying on, but it seems like overnight that term became problematic.

              Anyway, neo-nazis love hidden symbols. Even if they aren’t hidden at all.

        • Cornelius_Wangenheim@lemmy.world · 1 year ago · +7/−1

          If a bunch of “ironic” racists start using a symbol as a “joke” and one of them flashes it after murdering 50 people because of their religion, then it’s officially a hate symbol.

      • tmsqhazdzp@lemmy.world · 1 year ago · +7

        …something something… making your whites whiter… they’ll get the message they’re after

  • Duamerthrax@lemmy.world · 1 year ago · +79

    Weird. YouTube keeps recommending right wing videos even though I’ve purged them from my watch history and always selected Not Interested. It got to the point that I installed a 3rd party channel blocker.

    I don’t even watch too many left leaning political videos and even those are just tangentially political.

    • nutsack@lemmy.world · 1 year ago (edited) · +25/−1

      i think if you like economics or fast cars you will also get radical right wing talk videos. if you like guns it’s even worse.

      • Edgelord_Of_Tomorrow@lemmy.world · 1 year ago (edited) · +3

        Oh, you like WW2 documentaries about how liberal democracy crushed fascism strategically, industrially, scientifically and morally?

        Well you might enjoy these videos made by actual Nazis complaining about gender neutral bathrooms!

      • ShittyRedditWasBetter@lemmy.world · 1 year ago · +8/−5

        Nah. Cars or money has nothing to do with it. I’ve never once gotten any political bullshit and those two topics are 60% of what I watch.

      • MrScottyTay@sh.itjust.works · 1 year ago · +1

        I started getting into motorsport recently. I just get the ID video essays on racing and videos similar to Top Gear, like Overdrive. I don’t get any right wing stuff or guns. But I’m also in the UK, so it probably factors that in too. For Americans maybe it’s like “ah, other Americans that like fast cars also like guns, here you go”.

    • Kuya@lemmy.world · 1 year ago · +16/−1

      I’ve been watching tutorials on jump rope and kickboxing. I do watch YouTube Shorts, but lately I’m being shown Andrew Tate stuff. I didn’t skip it quickly enough, and now 10% of the things I see are right leaning bot-created content. Slowly, gun related, self defense, and Minecraft videos are taking over my YouTube Shorts.

      • Ech@lemm.ee · 1 year ago · +1

        I like a few Minecraft channels, but I only watch it in private tabs because I know yt will flood my account with it if I’m not careful. There is no middle ground with The Algorithm.

        • MrScottyTay@sh.itjust.works · 1 year ago · +2

          Yeah, it’s skewed too much by recent viewing. Even if you’re subscribed to X amount of channels about topic Y but you just watched one video on topic Z, then say goodbye to Y, you only like Z now.
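A minimal toy model of that recency skew (purely hypothetical weighting, invented for illustration; nothing here reflects YouTube’s real scoring) shows how one fresh video on topic Z can outweigh twenty older videos on topic Y when the decay is steep:

```python
# Toy recency-weighted topic scoring. Each view contributes
# 0.5 ** (age / half_life) to its topic, so old views decay fast.
def topic_scores(watch_history, half_life=1.0):
    """watch_history: list of (topic, age_in_days); newer views have smaller age."""
    scores = {}
    for topic, age in watch_history:
        scores[topic] = scores.get(topic, 0.0) + 0.5 ** (age / half_life)
    return scores

# 20 ten-day-old views of Y vs. a single view of Z from just now.
history = [("Y", 10.0)] * 20 + [("Z", 0.0)]
scores = topic_scores(history, half_life=1.0)

# Each Y view is worth 0.5**10 ≈ 0.001, so the one fresh Z view dominates.
assert scores["Z"] > scores["Y"]
```

With a longer half-life the old subscriptions would win again, which is exactly the tuning knob the complaint is about.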

          • Asymptote@lemmy.dbzer0.com · 1 year ago · +1/−1

            I’m hopefully wrong because it might have GDPR implications, but on the other hand they’re probably using an LLM, and once they’re on the back of that tiger they can’t let go of the tail.

      • djmarcone@lemm.ee · 1 year ago (edited) · +1/−1

        Yeah I didn’t even know who he was until a few months ago. Yet he is the top channel.

        YouTube/Google knows who you are and has a profile of you and your interests no matter what you do.

        You have to truly obfuscate your identity to escape it.

        Even getting butthurt from seeing objectionable content is still interaction, and the algorithm links it to you. Your likes and dislikes are both part of your identity, and they know that and use it.

        Because interaction with the website is all that matters, happy or angry they don’t care.

        In fact, they probably prefer you to be butthurt because you are more engaged.

    • spacebirb@lemmy.world · 1 year ago · +6/−2

      I know everyone likes to get conspiratorial about this, but it’s really just trying to get your attention any way possible. There are more popular right wing political videos, so the algorithm is more likely to suggest them. These videos also get lots of views, so again, more likely to be suggested.

      Just ignore them and watch what you like

      • Duamerthrax@lemmy.world · 1 year ago · +4/−1

        I’ve already said I installed a channel blocker to deal with the problem, but it’s still annoying that a computer has me in their database as liking right wing shit. If it was limited to just youtube recommendations, it would be nothing, but we’re on a slow burn to a dystopian hell. Google has no reason not to use their personality profile of me elsewhere.

        I made this comment elsewhere, but I have a very liberal friend who’s German, likes German food, and is into wwii era history. Facebook was suggesting neo-nazi groups to him.

        I watch a little Flashgitz and now I’m being recommended FreedomToons. I get that some people who like Flashgitz are going to be terrible, but I shouldn’t have to click Not Interested more than once.

    • doggle@lemmy.dbzer0.com · 1 year ago · +2

      I’m sure YouTube hangs on to that data even if you delete the history. I would guess that since you don’t watch left wing videos much their algorithm still thinks you are politically right of center? Although I would have expected it to just give up recommending political channels altogether at some point. I hardly ever get recommendations for political stuff, and right wing content is the minority of that

      • Duamerthrax@lemmy.world · 1 year ago · +3

        I watch some left wing stuff, but I prefer my politics in text form. Too much dramatic music and manipulative editing, even in things I agree with. The algorithm should see me as center left if anything, but because I watch some redneck engineering videos (that I ditch if they do get political), it seems to think I should also like transphobic videos.

    • Brokkr@lemmy.world · 1 year ago · +4/−5

      Indicating “not interested” shows engagement on your part. Therefore the algorithm provides you with more content like that so that you will engage more.

      You can try blocking the channel, which has mixed results for the same reason, or closing youtube and staying away from it for a few hours on that account.
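The claim above can be illustrated with a toy engagement-counting recommender (entirely hypothetical; all names are made up and this is not YouTube’s actual algorithm), where a “Not Interested” click counts as engagement just like a view:

```python
from collections import Counter

def score_topics(interactions):
    """Count every interaction with a topic as engagement,
    ignoring whether it was a view, a like, or a 'not interested' click."""
    scores = Counter()
    for topic, action in interactions:
        scores[topic] += 1  # the action type is discarded entirely
    return scores

def recommend(interactions):
    """Recommend the most-interacted-with topic."""
    return score_topics(interactions).most_common(1)[0][0]

history = [
    ("cooking", "view"),
    ("politics", "not_interested"),
    ("politics", "not_interested"),
    ("politics", "not_interested"),
]
# Three rejections outweigh one view, so the rejected topic "wins".
assert recommend(history) == "politics"
```

Under this model the only way to stop seeing a topic is to stop interacting with it at all, which matches the advice in the comment.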

      • Duamerthrax@lemmy.world · 1 year ago (edited) · +5

        If clicking Not Interested increases the likelihood of getting more of the same, then all the more reason to run ad blockers.

        The Channel Blocker is a 3rd party tool. It just hides the channel from view. Google shouldn’t know I’m doing it.

      • jrburkh@lemmy.world · 1 year ago · +3

        I don’t know if this is accurate or not, but it’s the most nonsensical thing I’ve heard in a while. If engaging with something to say, “I don’t want to see this,” results in more of that content - the user will eventually leave the platform. I’m having this concern right now with my Google feed. I keep clicking not interested, yet continue getting similar content. Consequently, I’m increasingly leaning toward disabling the functionality because I’m tired of fucking seeing shit I don’t care to see. Getting angry just thinking about it.

        • Brokkr@lemmy.world · 1 year ago (edited) · +1

          I can only offer my own experience as evidence, but this is what I was advised to do (stop engaging by not selecting anything) and it worked. Prior to that I kept getting tons of stuff that I didn’t want to see, but it stopped within a few days once I stopped engaging with it. And I agree, it is infuriating.

          Because I got this advice from someone else, I guess it has worked for others too.

    • megalodon@lemmy.world · 1 year ago · +9

      I’m bombarded by Joe Rogan stuff. I keep blocking the channels but there is an endless stream of them

      • Wolpertinger@sh.itjust.works · 1 year ago · +2

        I always downvote, then block the channel whenever I get those. However, I think the mere act of going to the button to block the channel instead of just scrolling on immediately is telling the algorithm that I want more of that kind of video.

        Watching a bit of BreadTube stuff, I feel like the algorithm can’t tell which videos are against that stuff and which are for it, so I get recommended videos for whatever I don’t like instead of against it.

        • invisinak@lemmy.dbzer0.com · 1 year ago · +2

          that might actually be the issue. YouTube doesn’t differentiate between upvotes and downvotes really. to the algorithm you’re engaging in the content either way so it’s serving you more to keep you engaged.

        • 5BC2E7@lemmy.world · 1 year ago · +1

          I am also suspecting that downvoting or blocking is somehow interpreted as “engaged with the content so lets shove more of it”

  • dylanTheDeveloper@lemmy.world · 1 year ago (edited) · +19

    I keep getting ‘rescue’ animal videos which involve people purposely putting puppies and kittens in distressing situations so they can ‘save’ them. It’s sick, and no matter how often I block and report those videos they reappear next month. I also get a lot of ‘police shooting people’ videos, which I also try to block.

    • regbin_@lemmy.world · 1 year ago · +2

      I think it’s just a matter of fine tuning your preferences. I’ve never had an irrelevant video recommended to me within the last few years. All the recommendations have been great: retro hardware reviews, video game gameplay guides, science videos, and other informational/engineering stuff.

  • qwamqwamqwam@sh.itjust.works · 1 year ago (edited) · +14/−5

    Wait what? Maybe I’m misunderstanding, but this is what I got out of the article:

    “We had anecdotes and preliminary evidence of a phenomenon. A robust scientific study showed no evidence of said phenomenon. Therefore, the phenomenon was previously real but has now stopped.”

    That seems like really, really bad science. Or at least, really really bad science reporting. Like, if anecdotes are all it takes, here’s one from just a few weeks ago.

    I left some Andrew Tate-esque stuff running overnight by accident and ended up having to delete my watch history to get my homepage back to how it was before.

    • TimewornTraveler@lemm.ee · 1 year ago · +2/−1

      From the quoted bit it sounds like there was credible science that found nothing. That doesn’t mean there is nothing, but just that they found nothing.

    • doggle@lemmy.dbzer0.com · 1 year ago · +2

      There are many instances that have defederated them that you could join. Or if you’re really serious you could host your own.

  • scarabic@lemmy.world · 1 year ago · +6

    If it’s true that they have closed the radicalization rabbit hole then that is a huge achievement and very very good news.

    • Edgelord_Of_Tomorrow@lemmy.world · 1 year ago (edited) · +2

      Now that they’ve entrenched an entire alternative universe in an election-winning proportion of the population, they don’t need it anymore.

      Unless YouTube is going to be deliberately directing people to deprogramming content it’s too late.

      • scarabic@lemmy.world · 1 year ago · +1

        A lot of damage is done, certainly, but I think any success they have will depend on keeping up this bullshit. New voters are growing up all the time. The less chance for them to fall down the QAnon rabbit hole when they just wanted to find some video game guide content, the better.

  • inspxtr@lemmy.world · 1 year ago (edited) · +3

    I was aware of this study when they presented it virtually (can’t remember where), and while I don’t have an issue with their approach and results, I’m more concerned about the implications of these numbers. The few percent that were exposed to extremist content may seem small, but scaled up to the population level, that is personally worrisome to me. The impact of the few very, very bad apples can still be catastrophic.


  • AsunasPersonalAsst@lemmy.world · 1 year ago · +1/−5

    Never had this problem; never used YT while logged in. Just incognito and careful search keywords. If the algo recommends something sus, immediately close your incognito session and open a new one.

  • intensely_human@lemm.ee · 1 year ago · +3/−15

    Why do we need to know what happened before? A record of the past is just material radicals can use to radicalize others.

  • Cam@lemmy.world · 1 year ago · +2/−24

    What a load of dog dung that article is. Justifying censorship, labelling everything that some political “expert” doesn’t like as far-right extremism.

    • Duamerthrax@lemmy.world · 1 year ago · +6

      When an algorithm is involved, things change. These aren’t static websites that only get passed around by real people. This is some bizarre pseudo intelligence that thinks if you like WWII history and bratwurst, that you’d also like neo-nazi content. That’s not an exaggeration. One of my very left leaning friends started getting neo-nazi videos suggested to him and I suspect it was for those reasons.

      Also, youtube isn’t a free speech platform. It’s an advertisement platform. Fediverse is a free speech platform, although it’s free speech for the person paying the hosting bills.

    • vector_zero@lemmy.world · 1 year ago · +3/−15

      Censorship of online content is good, but simultaneously the censorship of sexually explicit books in elementary schools is evil.

      Neat.