Eric Hartford

14 likes · 16.5K reads · 6 comments

Greg
Feb 7, 2024

I am desperately trying to rationalize getting a new computer capable of local hosting. What specs are we talking about for being able to do this the way you describe in this article? The main use case is writing a novel. I'm not sure exactly what approach I would take (or the correct terminology for the techniques), but one thing I would like to do is have it trained or fine-tuned with expert resource information on literature (e.g., examples of literature and guidance on the art of writing).

3 replies
Eric Hartford
Author
Feb 7, 2024
  • Dual RTX 3090 + SLI: 48 GB VRAM, ~$3,000 (7B–34B inference, maybe tiny finetuning)
  • Quad MI100 + Infinity Fabric: 128 GB VRAM, ~$6,000 (good for finetuning and inference)
  • M3 Max with 128 GB: bad for finetuning, great for single-user inference, ~$6,000
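A rough way to sanity-check these VRAM figures against a given model size is to compute the memory the weights alone occupy at a given precision, plus some headroom for the KV cache and activations. This is a back-of-the-envelope sketch, not a guarantee; the 20% overhead multiplier is an assumption, and real usage varies with context length and runtime.

```python
def vram_gb(params_billion: float, bits_per_weight: int, overhead: float = 1.2) -> float:
    """Rough VRAM estimate (GB) for serving a model's weights.

    params_billion: parameter count in billions (e.g. 7 for a 7B model).
    bits_per_weight: 16 for fp16, 8 for int8, 4 for 4-bit quantization.
    overhead: assumed multiplier (~20%) for KV cache and activations.
    """
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# A 7B model at fp16 needs roughly 17 GB; a 34B model quantized to
# 4 bits fits in roughly 20 GB, which is why 48 GB of VRAM covers
# the 7B-34B inference range mentioned above.
print(f"7B @ fp16:  {vram_gb(7, 16):.1f} GB")   # ~16.8 GB
print(f"34B @ 4bit: {vram_gb(34, 4):.1f} GB")   # ~20.4 GB
```

Finetuning needs far more than this (optimizer state and gradients roughly triple to quadruple the footprint), which is why the 128 GB configurations above are recommended for it.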
Greg
Feb 7, 2024

Eric Hartford Thanks for taking the time to provide that information. Very useful.

Nelene Soldaat
May 26, 2024

Eric Hartford gosh is this what it costs?

Totally new here and just subscribed, so forgive me if this is a daft question, but can one run something like this in the cloud and provide access to others?

Jiwon Park
Dec 5, 2023

Amazing, this makes me appreciate open source more. We recently did a survey of favorite open source projects within the team just for fun: langchain was #1, autogen #2, ollama #3, memgpt #7, litellm #14! So many open source playgrounds for us to play with different models :)

birdme l
Dec 16, 2023

Installing Ollama following this article...
