Run a Local LLM on a Mac and Access It from an iPhone (Securely)
Feb 6 · 4 min read

Why I Wanted to Do This

I wanted to see if I could:

- Run a large language model locally
- Use it outside my Mac, especially from my iPhone
- Keep everything private and under my control
- Avoid exposing any service to the public internet
- Still keep the...