LLaMA 3.2 90B VRAM: How Much Memory Does Fine-tuning Need?
Jan 31, 2025 · 7 min read

Key Highlights: Fine-tuning the LLaMA 3.2 90B model requires at least 180 GB of VRAM, putting it out of reach of most local setups. Parameter-efficient fine-tuning (PEFT) methods like LoRA a...
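The 180 GB floor corresponds to the model weights alone in 16-bit precision (90B parameters × 2 bytes). Full fine-tuning also needs gradients and optimizer state. A back-of-envelope sketch, assuming bf16 weights and gradients plus Adam's two fp32 moment buffers (the function name and byte counts are illustrative assumptions, not a measured profile, and activation memory is ignored):

```python
def estimate_finetune_vram_gb(params_billions: float,
                              weight_bytes: int = 2,      # bf16 weights
                              grad_bytes: int = 2,        # bf16 gradients
                              optimizer_bytes: int = 8):  # Adam: fp32 m + v
    """Rough lower-bound VRAM estimate for full fine-tuning.

    Ignores activations, KV caches, and framework overhead, so real
    usage is higher. Returns gigabytes (1 GB = 1e9 bytes here).
    """
    params = params_billions * 1e9
    total_bytes = params * (weight_bytes + grad_bytes + optimizer_bytes)
    return total_bytes / 1e9

# Weights alone for a 90B model in bf16:
print(estimate_finetune_vram_gb(90, grad_bytes=0, optimizer_bytes=0))  # 180.0
# Weights + gradients + Adam state:
print(estimate_finetune_vram_gb(90))  # 1080.0
```

By this rough accounting, the 180 GB figure is only the starting point; full fine-tuning with Adam can push the total several times higher, which is why PEFT methods that freeze the base weights and train only small adapter matrices are attractive here.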
