RandomlyRight@sh.itjust.works to Lemmy Shitpost@lemmy.world · 1 year ago
I'm sorry, little one [image post]
mergingapples@lemmy.world · 1 year ago
Because those specific cards are fuckloads more expensive.
d00ery@lemmy.world · 1 year ago
What are you recommending? I'd be interested in something that's similar in price to a 3090.
Diabolo96@lemmy.dbzer0.com · 1 year ago
It's for inference, not training.
GBU_28@lemm.ee · 1 year ago
Huh? Stuff like llama.cpp really wants a GPU; a 3090 is a great place to start.
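As an aside on the llama.cpp point: it can offload transformer layers to the GPU at inference time via the `-ngl` / `--n-gpu-layers` flag. A minimal sketch, assuming a CUDA build and a local GGUF model (the model path and layer count here are illustrative, not from the thread; flag and target names may differ between llama.cpp versions, so check the project README):

```shell
# Build llama.cpp with CUDA support (build flag per recent llama.cpp versions)
cmake -B build -DGGML_CUDA=ON
cmake --build build --config Release

# Run inference, offloading 35 layers to the GPU (illustrative values;
# a 3090's 24 GB VRAM comfortably fits a quantized 7B-13B model)
./build/bin/llama-cli -m models/model.Q4_K_M.gguf -ngl 35 -p "Hello"
```

On a 24 GB card like the 3090, fully offloading a quantized mid-size model is what makes local inference fast, which is why the comment calls it a great place to start.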
deleted by creator
But can they run Crysis?