How to Protect Sensitive Data by Running LLMs Locally with Ollama
When engineers build AI-powered applications, protecting sensitive data is a top priority. You don't want to send users' data to an external API that you don't control.
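Running the model locally means your requests go to a loopback endpoint on your own machine instead of a third-party API. Here is a minimal sketch, assuming Ollama is installed and serving its default local REST endpoint (`http://localhost:11434/api/generate`); the model name and prompt are placeholders for illustration:

```python
import json

# Hypothetical request payload for Ollama's local generate endpoint.
# "llama3" stands in for any model you have pulled locally.
payload = {
    "model": "llama3",
    "prompt": "Summarize this document.",  # sensitive text never leaves your machine
    "stream": False,
}

# The actual call targets localhost only (uncomment with Ollama running):
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:11434/api/generate",
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
# result = json.load(urllib.request.urlopen(req))

print(json.dumps(payload))
```

Because the endpoint is bound to localhost, nothing in the prompt or response crosses the network boundary.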
For me, this happ
freecodecamp.org · 11 min read