cm0002@lemmy.world to memes@lemmy.world · 1 day ago
Is 8GB a lot? Depends on the context. (lemmy.ml)
cross-posted to: memes@lemmy.ml
Anivia@feddit.org · 16 hours ago

> Afaik for consumers only the 5090 has 32GB VRAM

Only if you don’t count Apple Silicon with its shared RAM/VRAM. Ironically, a Mac Mini / Studio is currently the cheapest way to get a GPU with lots of VRAM for AI.