Serving AI From The Basement: 192GB of VRAM Setup
AI from the Basement: My latest side project is a dedicated LLM server powered by 8x RTX 3090 graphics cards, for a total of 192GB of VRAM. I built it with running Meta's Llama 3.1 405B in mind.
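As a quick sanity check on the numbers above, here is a sketch of the VRAM arithmetic. The 24 GB per RTX 3090 is the card's spec; the bytes-per-parameter figures are rough, standard approximations for each precision, and they count model weights only (no KV cache or activations):

```python
NUM_GPUS = 8
VRAM_PER_GPU_GB = 24
total_vram = NUM_GPUS * VRAM_PER_GPU_GB  # 8 x 24 GB = 192 GB

PARAMS_B = 405  # Llama 3.1 405B, in billions of parameters

def weights_gb(params_b: float, bytes_per_param: float) -> float:
    """Approximate memory for the model weights alone."""
    return params_b * bytes_per_param

for name, bpp in [("fp16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
    need = weights_gb(PARAMS_B, bpp)
    verdict = "fits" if need <= total_vram else "exceeds"
    print(f"{name}: ~{need:.0f} GB of weights -> {verdict} {total_vram} GB")
```

Even at 4-bit quantization, 405B parameters come to roughly 200 GB of weights, slightly more than 192 GB, which is part of what makes this model such a demanding target for a home build.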