Model Theft: How Attackers Steal Your Fine-Tuned AI Models Through API Extraction
TL;DR
Fine-tuned AI models can be stolen by repeatedly querying them through their API and recording the outputs. An attacker then approximates your model by training a mimic model on the recorded input-output pairs, effectively reconstructing its behavior without ever touching the weights directly. Cost: $500-5,000. Time: about one week. Real examples: Meta's L...
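The query-and-mimic loop above can be sketched in a toy form. This is a deliberately simplified illustration, not a real attack: the "victim" here is a hypothetical secret linear scorer behind a black-box query function, whereas real extraction attacks train a neural mimic on thousands of recorded API responses. All names (`query_victim_api`, `extract_linear_model`, `_SECRET_WEIGHTS`) are invented for the example.

```python
# Toy sketch of API-based model extraction (assumed linear victim).
# The attacker can call the API and observe outputs, but never sees
# the weights. With a linear model, one query per feature suffices.

_SECRET_WEIGHTS = [0.7, -1.3, 2.1, 0.4]  # hidden from the attacker

def query_victim_api(features):
    """Black-box endpoint: returns the model's score for an input."""
    return sum(w * x for w, x in zip(_SECRET_WEIGHTS, features))

def extract_linear_model(query, n_features):
    """Recover the linear victim from API responses alone.

    Querying each unit basis vector reads off exactly one weight
    per API call -- the extraction analogue of 'record outputs,
    then fit a mimic to them'.
    """
    weights = []
    for i in range(n_features):
        basis = [1.0 if j == i else 0.0 for j in range(n_features)]
        weights.append(query(basis))
    return weights

stolen = extract_linear_model(query_victim_api, 4)
print(stolen)  # → [0.7, -1.3, 2.1, 0.4]
```

For a real fine-tuned network the attacker cannot read weights off this directly; instead they sample realistic inputs, log the API's outputs (labels or logits), and distill a mimic model on that dataset, which is why the attack scales with query budget rather than model access.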
tiamat-ai.hashnode.dev · 12 min read