I assume there must be a reason why sign language is superior, but I genuinely don’t know why.

  • Hildegarde@lemmy.world · 11 months ago

    American Sign Language is not a gesture-based form of English. It is an entire language in its own right, with its own distinct grammar and vocabulary.

    To someone deaf from birth, sign language is their native language. And it is much more comfortable to quickly read your native language than a second language.

    • SkyNTP@lemmy.ml · 11 months ago

      This raises more questions than it answers, like how do people who are deaf from birth function in society at all if they struggle with languages other than sign language? How do they get a job, go to school, learn new skills, read the news, text people? What do they do in their leisure time if not watching subtitled movies or reading books? Many non-English speakers end up learning English anyway because of just how pervasive it is.

      • sparr@lemmy.world · 11 months ago

        The same way anyone else for whom English is a second or third language functions in society.

        • Gabu@lemmy.ml · 11 months ago

          I’m ESL and use English subtitles when watching a programme in a language I can’t speak…

          • captainlezbian@lemmy.world · 11 months ago

            You’re on the internet. Most Deaf people these days read English fluently. It’s just that Deaf 70-year-olds were often able to get away without becoming actively fluent in English and may not have felt the need to. Closed captions are younger than most people think, and they fucking sucked until fairly recently. I grew up watching the news with captions and it was distracting if you didn’t need it: big black boxes with the words said a few seconds ago rapidly appearing on them as they covered stuff. And often captions on prerecorded content weren’t much better. It was an accessibility feature and treated as such. Technology Connections has a great video on closed captions that was almost nostalgic lol.

            Then there’s also familiarity. If you grew up with TV that had captions, you’re used to it. But before captions we had terps (interpreters). At live events we have them. At a government press release they already needed one, because they can’t just show the teleprompter to the Deaf people in the audience, so they just show the terp where we expect to see them on the screen. Like I can’t think of an event on TV that has interpreters that doesn’t need one in person.

      • detalferous@lemm.ee · 11 months ago

        Think about written English: it’s phonetic.

        How do you learn to RECOGNIZE A WRITTEN WORD when you don’t know what it sounds like, let alone what the letters mean? It becomes a matter of a hundred thousand different symbols, each recognized as a unit, removed from the auditory context.

        I can’t imagine how any deaf person learns to read, to be honest. It’s an astounding feat.

        • fidodo@lemmy.world · 11 months ago

          Don’t you just recognize the sequences? There are plenty of non-phonetic languages; you just recognize patterns instead of sounds.

      • givesomefucks@lemmy.world · 11 months ago

        Which is why we give deaf students extra attention in schools now…

        The issue is the deaf community was forced to be insular for most of American history. And part of that included the stereotype of “deaf and dumb” where if a person was deaf, they were assumed to be stupid.

        And some older members of that community see the next generation being treated more inclusively as a negative, because that means their community will shrink if people aren’t forced to only interact with other deaf people. They don’t want integration into the larger community, and they want to force future generations to be segregated as well.

        And they’re kind of right. Most of the people with that line of thought aren’t people you’d want to voluntarily associate with. Wanting to hobble the next generation so you don’t feel lonely is pretty low.

        • captainlezbian@lemmy.world · 11 months ago

          I want to give the flip side here. They used to separate us. There was active division based on how bad your hearing was. If you couldn’t hear even with effort and hearing aids, you were shunted away as a lost cause. But if you could, they would tell your parents not to teach you sign language because you’d prefer it. That’s how my sister, my mom, my grandma, and I were all denied our right to a native language that would’ve been easier for us. They didn’t care that by 60 we’d be deaf as a post, because our hearing loss was genetic and degenerative. All English all the time, and no acknowledgment that it took effort to hear.

          The Deaf community can be insular assholes, but I understand it. Our culture is denied to us. Our language is denied to us. And maybe, just maybe, part of why we’re pissed off is because we have some points that nobody wants to acknowledge. Like the fact that cochlear implants aren’t some miracle: they’re great, and my grandparents love theirs, but they’re fucking exhausting to use. Hell, I’m a healthy 28-year-old and I have to take my hearing aids out after work because they tire me. And for children born too deaf to use auditory communication with hearing parents, it’s disturbing how few of those parents learn sign language. But every CODA (child of deaf adults) is taught spoken language (and they tend to maintain lifelong ties to Deaf culture).

          I’m still mad I wasn’t taught sign as a kid. I’m glad I was given hearing aids but I deserved access to community like me. And if I’d reproduced I would’ve made damn sure my kid was a native signer so that way they’d never grow up in fear of inevitable silence or awkwardly fail to communicate with people who share their disability.

      • captainlezbian@lemmy.world · 11 months ago

        You’ve hit on a problem that the Deaf community faces. In many places there’s an entire Deaf society: Deaf jobs, Deaf schools (including universities), Deaf media… They do read English, but it’s harder and it’s not their primary language (though I’ve heard the internet is helping a lot there).

        But yeah, there are Deaf universities, including prestigious ones like Gallaudet. Nobody teaches medicine or engineering in sign language from what I can see. I did check and I was pleasantly surprised that Gallaudet offered shit like math, biology, and IT with even grad programs in stuff that isn’t explicitly about deafness.

  • Droggelbecher@lemmy.world · 11 months ago

    If sign language is your first language, any written language is like a foreign language that you might’ve learned but aren’t a native speaker in.

    • lagomorphlecture@lemm.ee · 11 months ago

      ASL (or whichever sign language) is NOT a direct visual translation of English or French or Mandarin or whatever. It’s a totally different language, and the written language is a second language. People might be highly proficient at reading and writing English in an English-speaking country, but it’s still a different language.

      • givesomefucks@lemmy.world · 11 months ago

        And incredibly regional as well.

        Any isolated language with a small local population is going to diverge quickly, and while the internet is bringing everyone together and making written language more consistent, it’s not like deaf people send each other videos online; they just use written English because it’s insanely easier and faster for everyone.

  • juliebean@lemm.ee · 11 months ago

    I am not Deaf, but I imagine it is easier having stuff presented in your native language.

  • captainlezbian@lemmy.world · 11 months ago

    Because it’s some people’s native language, and for those people English (or whatever the spoken/written language of the land is) is their second language. Sign languages aren’t just a way of using hands to communicate in the surrounding spoken language. Systems like that do exist, such as ESL (English as a Signed Language), but the Deaf in America and England don’t use ESL; we use ASL (American Sign Language) and BSL (British Sign Language) respectively. These are very different languages from each other and from ESL. They don’t even share fingerspelling alphabets.

    Captions are amazing for the hard of hearing and late deafened, especially since many children such as myself who grew up hard of hearing were denied sign. But it’s my language by right, and I was denied it as a native language. It’s natural for face-to-face communication in a way writing isn’t, and it’s also a cultural language. A Deaf five year old can understand the news broadcast in sign language just as well as a hearing one can understand the spoken one.

  • hglman@lemmy.ml · 11 months ago

    It’s also simpler, faster, and more accurate to have a live interpreter than to have someone type.

  • Microw@lemm.ee · 11 months ago

    Besides being more natural to follow for native Sign “speakers” (do you say signers? No idea), at live broadcasts it is way more efficient than live subtitling.

  • KestrelAlex@lemmy.world · 11 months ago

    ASL has a very different structure from spoken/written English, so not everybody who signs is going to comprehend English grammar as fluently/easily, or the nuance of all the words that don’t have a sign equivalent.

    Additionally, ASL communicates who is talking and the tone of their words, even when the speaker is off screen, which just can’t be captured by captioning. Closed captioning has only just caught on to using slightly different colors to indicate the speaker, so you know who’s talking offscreen. I’ve only seen this in British panel shows so far, but it’s helpful.
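
    For anyone curious how captions can carry that speaker information at all, here is a minimal sketch of one way it is done, assuming a player that supports WebVTT voice tags and cue styling. This is only an illustration, not anything from the thread: the speaker names, timestamps, and file name below are made up.

        # Sketch: write a WebVTT caption file whose cues name their speakers.
        # WebVTT "voice" spans (<v Name>) let a player identify who is talking,
        # and a stylesheet can then give each voice its own color.
        # All names, times, and the output file name here are illustrative.

        cues = [
            ("00:00:01.000", "00:00:03.500", "Host", "Welcome back to the panel."),
            ("00:00:03.600", "00:00:06.000", "Guest", "Glad to be here."),  # off-screen speaker
        ]

        lines = ["WEBVTT", ""]
        for start, end, speaker, text in cues:
            lines.append(f"{start} --> {end}")
            lines.append(f"<v {speaker}>{text}</v>")
            lines.append("")

        with open("panel_show.vtt", "w", encoding="utf-8") as f:
            f.write("\n".join(lines))

        # A player's stylesheet could then color the voices differently, e.g.
        #   video::cue(v[voice="Host"])  { color: yellow; }
        #   video::cue(v[voice="Guest"]) { color: cyan; }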