what's the best setup to actually work on 3090 with 24GB?
#25 by benayat · opened
From what I've seen, any 24GB VRAM GPU is definitely not enough, at least not without quantization.
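To make that concrete, here is a rough back-of-the-envelope estimate of weight memory at different precisions (an assumption-laden sketch: it counts weights only and ignores activations, KV cache, and framework overhead, which add several more GB in practice):

```python
def model_vram_gb(n_params_billion: float, bytes_per_param: float) -> float:
    """Approximate VRAM needed for model weights alone, in GiB."""
    return n_params_billion * 1e9 * bytes_per_param / 1024**3

# fp16 = 2 bytes/param; 4-bit quantized ~= 0.5 bytes/param
for params in (7, 13, 70):
    fp16 = model_vram_gb(params, 2.0)
    q4 = model_vram_gb(params, 0.5)
    print(f"{params}B params: fp16 ~ {fp16:.1f} GiB, 4-bit ~ {q4:.1f} GiB")
```

By this estimate a 7B model in fp16 fits a 3090 (~13 GiB), a 13B in fp16 already overshoots 24 GiB, and anything larger needs 4-bit quantization (or offloading) to have a chance.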