• grindemup@lemmy.world

    AlphaFold’s success seems to be largely linked to its use of an attention-based architecture, similar to GPT, i.e. the architecture used by LLMs. Beyond that, both build on the same body of work in machine learning and statistics, so I don’t think they are nearly as independent as you are making out.
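
    (For anyone curious, the shared idea is scaled dot-product attention: each token, or residue pair in AlphaFold’s case, weights every other one by query–key similarity. Below is a toy numpy sketch of just that core operation — illustrative only, not AlphaFold’s or GPT’s actual code; both wrap it in multi-head attention, learned projections, and a lot more.)

        import numpy as np

        def scaled_dot_product_attention(Q, K, V):
            # Q, K, V: (seq_len, d_k) arrays; output is a similarity-weighted mix of V
            d_k = Q.shape[-1]
            scores = Q @ K.T / np.sqrt(d_k)                        # pairwise query-key scores
            weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
            weights /= weights.sum(axis=-1, keepdims=True)         # softmax over the keys
            return weights @ V                                     # weighted sum of values

        # toy example: 3 tokens with 4-dimensional embeddings
        rng = np.random.default_rng(0)
        Q = rng.normal(size=(3, 4))
        K = rng.normal(size=(3, 4))
        V = rng.normal(size=(3, 4))
        print(scaled_dot_product_attention(Q, K, V).shape)         # (3, 4)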

    • boonhet@sopuli.xyz

      Yeah, but LLM innovation now is not in more clever architectures, but rather in larger and larger models with more training data.

      I don’t hate the existence of LLMs, but rather how they’re being shoehorned in everywhere and how much power is being spent for only marginally better results.