
Is 24GB VRAM enough for stable diffusion and AI training?

Topic starter

Is 24GB of VRAM actually enough to get serious with Stable Diffusion and training, or am I going to hit a wall faster than I think? I'm currently stuck between two paths and honestly can't decide which way to go. I can pick up a used 3090 for a decent price right now, maybe $750, which seems like a steal for the VRAM, but then I see 4090 prices and wonder if the speed boost is worth the big jump in cost. My budget is firm at $1600 for the whole upgrade, and I'm in Austin so I can usually find stuff locally, but the 4090 is basically my entire budget.

I'm trying to get a workflow going for an indie game I'm working on, and I need to train a bunch of custom LoRAs for consistent characters. My main concern is whether 24GB is going to be obsolete soon, with stuff like Flux and massive new models coming out seemingly every week. Is it better to buy the cheaper 3090 now and upgrade later, or go all in on the 4090 and hope it lasts a couple of years? Or should I wait for the 50 series to drop and hope prices on 24GB cards fall even more? I need to have this set up by the end of next month so I can start production.


6 Answers
11

^ This. Also, I'm super satisfied with 24GB right now. My MSI GeForce RTX 3090 Suprim X 24G works well and has handled everything I've thrown at it. Quick tip though: watch your power delivery. These cards have massive transient spikes, so pairing the card with a Seasonic PRIME TX-1000 1000W 80+ Titanium is basically mandatory for reliability. High-quality rails keep your LoRA training runs from failing when the card hits peak load.
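If you want to sanity-check what the card is actually pulling during a run, here's a minimal Python sketch using pynvml (the nvidia-ml-py package). To be clear, this is just my own quick hack: polling like this only shows sustained draw, since the millisecond transient spikes happen far too fast for software sampling to catch.

```python
# Minimal GPU power/VRAM monitor. Assumes nvidia-ml-py is installed:
#   pip install nvidia-ml-py
# Sampling at 1 Hz shows sustained load only; millisecond-scale transient
# spikes are invisible to software polling.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000  # mW -> W
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"power: {power_w:6.1f} W | vram: {mem.used / 2**30:5.1f} / {mem.total / 2**30:.1f} GiB")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

Run it in a second terminal while training; if sustained draw is already near the card's rated limit, the transients are what will trip a marginal PSU.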


11

Just caught this thread today. Honestly, the market for high-end cards is pretty disappointing right now if you're on a budget. 24GB is definitely the minimum you need for Flux, but blowing $1600 on a single card is a mistake. I tried squeezing a 4090 into a budget build once and it just wasn't as good as expected; you end up cutting corners on the cooling or the PSU. Unfortunately, local prices in Austin aren't even that great lately.

If you're doing serious indie production, maybe don't go full DIY on the hardware. I picked up a used Gigabyte GeForce RTX 3090 Gaming OC 24GB and it works, but training speeds are kinda mid for the price. You're probably better off getting that 3090 and a reliable Corsair RM1000e 1000W 80+ Gold, then putting the leftover $800 toward cloud rendering credits. DIY is fun until you realize a professional service can finish your LoRA in twenty minutes while you sleep.


4

Regarding what #4 said about 24GB: my ZOTAC Gaming GeForce RTX 3090 Trinity OC 24GB handles Flux fine, but the 4090's speed is a decent jump if your production deadline is that tight.


3

Re: "Honestly, 24gb is the sweet spot right now..."

  • I actually hit a wall super fast!
  • My setup lagged.
  • Training failed twice. This tech is amazing but 24gb felt tight ngl!


2

Just catching up on this thread, and I love the energy here! Training LoRAs is honestly a game changer for indie dev work; it's amazing once you get it dialed in. I remember when I first started, I was trying to squeeze everything onto a tiny card, and it was a nightmare until I finally upgraded to the EVGA GeForce RTX 3090 FTW3 Ultra Gaming 24GB. It felt like I finally had room to breathe!

Quick question, though: what resolutions are you aiming for? Small sprites, or high-res 1024px assets for Flux? If your $1600 budget also has to cover a new PSU or RAM, the NVIDIA GeForce RTX 4090 Founders Edition 24GB is going to blow the whole bank. I found that getting a used 3090 and spending the leftover cash on a beefy Corsair RM1000x 1000W 80 PLUS Gold was much safer for my setup.
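For reference, if you train with the Hugging Face diffusers/peft stack, a character LoRA config usually looks something like the sketch below. The rank, alpha, and target modules are common community defaults I'm assuming here, not anything specific to your project, so treat them as starting points to tune:

```python
# Hypothetical starting config for a character LoRA with Hugging Face peft.
# r, lora_alpha, and target_modules are assumed defaults, not project values.
from peft import LoraConfig

lora_config = LoraConfig(
    r=16,                 # rank: higher = more capacity but more VRAM
    lora_alpha=16,        # scaling factor, often set equal to r
    init_lora_weights="gaussian",
    target_modules=["to_k", "to_q", "to_v", "to_out.0"],  # attention projections
)

# In recent diffusers versions the adapter is attached directly to the model,
# e.g. unet.add_adapter(lora_config), before the training loop starts.
```

Higher ranks capture more character detail but eat into the 24GB headroom, and that matters a lot more at 1024px than it does for small sprites.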


1

Honestly, 24GB is the sweet spot right now, and you won't hit a wall anytime soon. I've been using an NVIDIA GeForce RTX 3090 24GB for a while now and I'm super satisfied with how it handles training; I run Flux and SDXL models all day without memory errors. Since your budget is tight, grabbing that used 3090 for $750 is the smartest move. It leaves you plenty of cash for a decent CPU and enough RAM to support the build. Sure, the NVIDIA GeForce RTX 4090 24GB is a beast and cuts generation times, but for LoRA training? The 3090 works perfectly well and gets the job done. If you need to be up and running by next month, don't wait for the 50 series. Grab the 3090 locally and get to work on that game. You'll be happy with the performance, trust me.
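If you want to verify the headroom on your own workload before buying, a quick peak-VRAM measurement is easy. This is just my own sketch assuming PyTorch plus the diffusers SDXL pipeline; the model ID and prompt are placeholders, swap in whatever you actually use:

```python
# Quick peak-VRAM check for one SDXL generation. Assumes torch and diffusers
# are installed; model ID and prompt are placeholders for illustration.
import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
).to("cuda")

torch.cuda.reset_peak_memory_stats()
image = pipe("character turnaround sheet, fantasy knight",
             num_inference_steps=30).images[0]

peak_gib = torch.cuda.max_memory_allocated() / 2**30
print(f"peak VRAM for this generation: {peak_gib:.1f} GiB of 24")
```

Training peaks much higher than inference because of gradients and optimizer state, but the same max_memory_allocated() readout works inside a training loop too.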

