I am desperately trying to rationalize getting a new computer capable of local hosting. What specs are we talking about to do this the way you describe in this article? The main use case is writing a novel. I'm not sure exactly what approach I would take (or the correct terminology for the techniques), but one thing I would like to do is train or fine-tune it on expert resources about literature (e.g., examples of literature and guidance on the art of writing).
- Dual 3090 + SLI, 48GB VRAM, ~$3,000 (7B-34B inference, maybe tiny finetuning)
- Quad MI100 + Infinity Fabric, 128GB VRAM, ~$6,000 (good for finetuning and inference)
- M3 Max 128GB, ~$6,000 (bad for finetuning, great for single-user inference)
Eric Hartford Thanks for taking the time to provide that information. Very useful.
Eric Hartford Gosh, is this what it costs?
Totally new and just subscribed, and forgive me if this is a daft question, but can one run something like this in the cloud and provide access to others?
Amazing, makes me appreciate open source more. We recently did a survey of favorite open source projects within the team just for fun: langchain was #1, autogen #2, ollama #3, memgpt #7, litellm #14! So many open source playgrounds for us to play with different models :)
Installing ollama following this article.