AI Safety, Hallucinations & Trust in Production
When someone says an AI “lies”, that does not mean the model has human intent to deceive. In most production systems, this is what we call a hallucination: a confident-sounding output that is not grounded in fact.
bittublog.hashnode.dev · 8 min read