• piecat@lemmy.world · 2 years ago

    See, I would argue the exact opposite. It sounds like you don’t understand how it works.

    Because it’s not “replication” or “copying”.

    • BURN@lemmy.world · 2 years ago

      Most LLMs can be made to spit out training data. That’s pretty much replication in my book.

      Statistical models don’t create anything. They replicate variations of their training data.
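
      As a concrete illustration of what "spit out training data" can mean in practice, here is a minimal sketch of a verbatim-memorization probe, assuming the Hugging Face transformers library, a placeholder model name, and a placeholder passage: give the model the opening of a text it likely saw during training and check whether its most likely continuation reproduces the rest word for word.

      ```python
      # Sketch of a memorization probe: does the model continue a known passage
      # verbatim when given only its opening? The model name and passage below
      # are placeholders, not anything specific from this thread.
      from transformers import AutoModelForCausalLM, AutoTokenizer

      model_name = "gpt2"  # placeholder causal language model
      tokenizer = AutoTokenizer.from_pretrained(model_name)
      model = AutoModelForCausalLM.from_pretrained(model_name)

      prefix = "It was the best of times, it was the worst of times,"
      expected = "it was the age of wisdom, it was the age of foolishness"

      inputs = tokenizer(prefix, return_tensors="pt")
      output_ids = model.generate(
          **inputs,
          max_new_tokens=20,
          do_sample=False,  # greedy decoding: the single most likely continuation
          pad_token_id=tokenizer.eos_token_id,
      )
      continuation = tokenizer.decode(
          output_ids[0][inputs["input_ids"].shape[1]:],
          skip_special_tokens=True,
      )

      print("model continuation:", continuation)
      print("verbatim match:", expected in continuation)
      ```

      Whether that match comes back True for a given model and passage is the empirical question behind this disagreement; a True result demonstrates memorization of that passage, while a False result does not rule it out for others.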

      • zwaetschgeraeuber@lemmy.world · 2 years ago

        When you read something and recite it, what are you doing? Exactly: spitting out the training data, if you've trained long enough.

      • Dkarma@lemmy.world · 2 years ago

        Painters replicate variations of their training pieces too. You're pretending there's a difference between human-inspired and training-inspired work, and that you should get paid for that inspiration in one case just because it's "big corp".

        • BURN@lemmy.world · 2 years ago

          Because there is a difference. A computer does not learn or understand anything. Human beings can transform a concept; an LLM or other generative AI does not transform a concept at all.