Hey everyone! I’m finally looking to build a dedicated workstation for my deep learning projects, but I’m a bit overwhelmed by the current GPU market. My current setup is constantly hitting out-of-memory errors when I try to train larger Transformers or CNNs, so I know I need something with significant VRAM—ideally 16GB or more. I’ve been eyeing the RTX 3090 or the newer 4090, but I’m also wondering if it’s worth looking at the professional A-series cards instead. Budget is around $1,500-$2,000, and I really want to prioritize CUDA support and future-proofing. If you were building a rig today, which specific GPU would you choose for the best performance-to-price ratio?
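For context, here's the rough back-of-envelope estimate I've been using to see why I keep hitting those memory errors (just a sketch; it assumes fp32 weights and an Adam-style optimizer, and ignores activations, which can easily dominate):

```python
def training_vram_gb(n_params: float, bytes_per_param: int = 4,
                     optimizer_states: int = 2) -> float:
    """Rough lower bound on training VRAM in GB.

    Counts weights + gradients + optimizer states (Adam keeps two
    extra buffers per parameter); activations come on top of this.
    """
    per_param_bytes = bytes_per_param * (2 + optimizer_states)
    return n_params * per_param_bytes / 1e9

# A ~350M-parameter Transformer in fp32 with Adam needs roughly:
print(f"{training_vram_gb(350e6):.1f} GB before activations")  # 5.6 GB before activations
```

Even this lower bound shows why a 16GB card fills up fast once activations and batch size get added on top, which is why I'm leaning toward 24GB.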
Honestly, if I were you, I'd go with the NVIDIA GeForce RTX 4090 24GB because it's excellent for Transformer workloads. But be careful with the power draw, it's scary! Compared to the NVIDIA RTX A5000 24GB, the 4090 is much faster for the price, though the A-series is the safer bet for 24/7 runs. Not gonna lie, I'm a bit nervous about those 4090 transient power spikes, but the speed is just fantastic.
Seconding the recommendation above! The NVIDIA GeForce RTX 4090 24GB is the dream, but it'll eat your whole budget. In my experience, a used NVIDIA GeForce RTX 3090 24GB for ~$800 is a steal. Just please don't skimp on the power supply. Those cards have massive transient spikes, so I'd only run one with a Corsair RM1000x 1000W 80 PLUS Gold to stay safe and avoid frying your motherboard!
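For PSU sizing I do a quick back-of-envelope like this (only a sketch; the spike factor and rest-of-system numbers are my own rule-of-thumb guesses, not official specs):

```python
def recommended_psu_watts(gpu_tdp_watts: int, rest_of_system_watts: int = 250,
                          spike_factor: float = 1.5, headroom: float = 1.1) -> int:
    """Rule-of-thumb PSU wattage: budget for GPU transient spikes
    above rated TDP, plus the rest of the system, plus headroom."""
    peak_watts = gpu_tdp_watts * spike_factor + rest_of_system_watts
    return round(peak_watts * headroom)

# A 450W-class card lands right around a 1000W unit:
print(recommended_psu_watts(450))
```

That's how I ended up at the 1000W recommendation; a 750W unit that looks fine on paper can still trip its overcurrent protection on spikes.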
> If you were building a rig today, which specific GPU would you choose for the best performance-to-price ratio?

Stumbled on this thread today and had a thought about long-term reliability. Before you pull the trigger, what kind of case and airflow setup are you looking at? And are you planning to stick with one card, or maybe add a second one in a year or two? Knowing the motherboard spacing and clearance is huge, because those high-end cards are incredibly thick now. Quick tips:
So, first off: honestly, avoid the professional A-series cards. They're crazy expensive, and you're basically paying for enterprise support you won't ever use. It'll blow your whole budget just for the name on the box, which is kind of annoying.
Here's what I recommend after trying a few different setups over the years:
1. Get the latest consumer flagship card with 24GB of VRAM. I think it's the best performance-to-price ratio right now for a home rig.
2. Transformers are VRAM hogs, so that 24GB buffer is a lifesaver. I used to get so frustrated with memory errors, but now it just works for most of my tasks.
3. Stick with the green team, because CUDA support is just so much easier to deal with than anything else.
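By the way, if you want a quick sanity check that CUDA is actually wired up after you build, I run something like this on new installs (just a sketch; it assumes PyTorch and falls back cleanly if it's missing):

```python
def cuda_status() -> str:
    """Report the first CUDA device and its VRAM, or why CUDA is unusable."""
    try:
        import torch
    except ImportError:
        return "PyTorch not installed"
    if not torch.cuda.is_available():
        return "CUDA not available on this machine"
    props = torch.cuda.get_device_properties(0)
    return f"{props.name}: {props.total_memory / 1e9:.1f} GB VRAM"

print(cuda_status())
```

On a working rig this prints the card name and its VRAM, so you know the driver and toolkit are talking to each other before you kick off a long training run.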
By the way, what power supply are you planning to use? Those high-end cards pull a ton of power. Anyway, good luck!
Bump - same question here
Saw this earlier... Curious about one thing: what's your PSU wattage? Before you decide:

* NVIDIA GeForce RTX 4090 24GB: honestly the best for VRAM, but seriously be careful with power!
Solid advice 👍
Saving this whole thread. So much good info here you guys are awesome.