AlphaFold’s success seems to be largely linked to its use of an attention-based architecture, similar to GPT, i.e. the architecture used by LLMs. Beyond that, they both build on the same body of work in machine learning and statistics, so I don’t think they are nearly as independent as you’re making them out to be.
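For what it's worth, the shared core both GPT-style LLMs and AlphaFold's Evoformer build on is scaled dot-product attention. Here's a minimal NumPy sketch of that one operation — toy shapes, random matrices, purely illustrative, not either model's actual implementation:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    # Similarity of each query to each key, scaled to keep softmax well-behaved
    scores = Q @ K.T / np.sqrt(d_k)
    # Numerically stable softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted average of the value rows
    return weights @ V

# Toy example: 4 tokens (or residues), 8-dimensional embeddings
rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (4, 8)
```

In an LLM the rows are token embeddings; in AlphaFold's attention blocks they are (roughly) residue or MSA representations — same primitive, different inputs.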
Yeah, but LLM innovation now is not in more clever architectures, but rather larger and larger models with more training data.
I don’t hate that LLMs exist; what bothers me is how they’re being shoehorned into everything, and how much power is being spent for only marginally better results.