I'm really sorry if this is a super basic question, but I'm totally lost trying to figure out which NVIDIA card to buy for a machine learning project I have to do. I keep hearing that VRAM is the most important thing for running models, but the more I look at the specs the more my head hurts. My logic was just to go for the biggest number, but then I saw some cards have 24GB while others that are way more expensive have 48 or even 80GB? I was looking at the RTX 4090 because everyone says it's fast, but then someone mentioned something called an A6000 and the price tag literally made me jump lol.
I'm trying to set up a small station for my sister's marketing business here in San Francisco, and she wants to run some local image generation stuff by next Friday, so I'm on a really tight deadline. I have about $2200 to spend total, but if a card with more memory is actually way better, I might have to ask her for more cash. Should I just stick with the gaming cards, or is there a specific one that has the absolute most VRAM for the money? I really don't want to buy the wrong thing and waste all that money because I don't know what I'm doing...
I would suggest being careful with your spending, tbh. A used NVIDIA GeForce RTX 3090 is worth a look since it's safer for your budget and still gives you 24GB of VRAM.
I remember being so happy when I finally got my NVIDIA GeForce RTX 3060 12GB.
Works great for me
Man I wish I found this thread sooner. Would have saved me so much hassle.
I've been building these rigs for a long time, and the VRAM debate never really changes; it just gets more expensive. In my experience, you really have to decide whether you want the raw speed of the consumer cards or the massive memory pools of the workstation lines.
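One thing that helps before spending anything: do a back-of-envelope estimate of how much VRAM the model you actually want to run will need. A rough rule is parameter count times bytes per parameter, plus some headroom for activations and buffers. Here's a minimal sketch; the parameter count and the 20% overhead factor are illustrative guesses, not measurements for any specific model:

```python
def vram_estimate_gb(params_billion, bytes_per_param, overhead=1.2):
    """Rough VRAM (GB) to hold model weights at a given precision.
    overhead ~1.2 is a guessed fudge factor for activations/buffers."""
    return params_billion * 1e9 * bytes_per_param * overhead / 1024**3

# Illustrative: a ~3.5B-parameter image model at different precisions.
# 4 bytes = fp32, 2 = fp16/bf16, 1 = int8 quantized.
for name, bpp in [("fp32", 4), ("fp16", 2), ("int8", 1)]:
    print(f"{name}: ~{vram_estimate_gb(3.5, bpp):.1f} GB")
```

Run that for whatever model your sister wants, and you'll quickly see whether a 24GB card is comfortable or whether you're genuinely in workstation territory. In my experience most local image generation fits in 24GB at fp16 with room to spare.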