• MotoAsh@lemmy.world
      9 months ago

No, because LLMs are just a mathematical blender with ONE goal: construct a plausible-sounding sentence. They have no thoughts and no self-correction; they just spit out sentences.
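To make the "blender" point concrete, this is roughly what the generation loop looks like under the hood: predict the next token, append it, repeat. A minimal sketch, assuming the Hugging Face `transformers` library and GPT-2 as a stand-in model purely for illustration:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

ids = tokenizer("The meaning of life is", return_tensors="pt").input_ids
for _ in range(20):                      # fixed budget; nothing decides it's "done"
    logits = model(ids).logits[0, -1]    # scores for every possible next token
    next_id = torch.argmax(logits)       # greedily take the single likeliest one
    ids = torch.cat([ids, next_id.view(1, 1)], dim=1)

print(tokenizer.decode(ids[0]))
```

Note there's no goal state, no checking the output against reality, no revision pass: just repeated next-token prediction.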

You MIGHT pass a Turing test with enough feedback loops bolted on, but at that point the "consciousness" is coming specifically from the complexity of the overall system, and still very much not from the LLM itself.