bluemellophone@lemmy.world to Technology@lemmy.world • Intel's $249 Arc B580 is the GPU we've begged for since the pandemic | PCWorld • 17 hours ago
There are some smaller Llama 3.2 models on Ollama that would fit in 12 GB of VRAM. I've run some of the smaller Llama 3.1 models in under 10 GB on NVIDIA GPUs.
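
A minimal sketch of what that looks like with the Ollama Python client, assuming the `llama3.2:3b` tag (a small quantized build that should sit well within 12 GB of VRAM); the model tag and prompt here are just examples:

```python
# Minimal sketch using the Ollama Python client (pip install ollama).
# Assumes the Ollama server is running locally and a small model tag
# such as llama3.2:3b has already been pulled (`ollama pull llama3.2:3b`).
import ollama

response = ollama.chat(
    model="llama3.2:3b",
    messages=[{"role": "user", "content": "Why does VRAM size matter for local LLMs?"}],
)

# Print the model's reply text.
print(response["message"]["content"])
```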