They are so desperate for AI to work out.
“Oh we just need more compute and all the problems will go away.” There are almost certainly diminishing returns on additional compute for these LLMs and stable diffusion engines, but they won’t hear it.
Instead it’s just “We just need more compute, just give us more compute, fire up every decommissioned powerplant in the country and steal everyone’s drinking water, just a little more compute and AI will finally work!”
It’s definitely starting to look like we’re well into diminishing-returns territory with the current LLM approach. I think the GPT architecture is a useful piece of the puzzle, but clearly some fresh ideas are needed to move forward in a meaningful way. I’m really hoping the hype will start to die down so that more effort goes into exploring other techniques, instead of all the air in the room being sucked up by LLMs.
Colorado is home to 56 data centers, all located along the Front Range. The vast majority of the facilities are in metro Denver, including a massive 177-megawatt hyperscale facility under construction by QTS Realty Trust in Aurora - https://www.govtech.com/policy/proposed-data-center-tax-break-sparks-debate-in-denver
My water bill is increasing 30% this year, and that’s despite meeting our snowpack targets last winter. This city’s population can only grow by around 50k more people before we hit a hard limit on our water infrastructure, and every booming town near those data centers (home to 80% of the state’s population) has to compete for that water along with the large landowners. Those rivers are tributaries of the Mississippi and Colorado Rivers, supplying some of the most productive farmland along the way.
All for bazinga machines. That classic apocalypse scenario of the paper clip machine turning everything into paper clips at least results in all the matter in the universe being converted to something useful, but we’re doing this so that eugenicist vampires can make shitty chat bots and dog-money-themed pyramid schemes. We can’t declare jihad on thinking machines soon enough.