Anshul Tiwaricsanshul.hashnode.devยทSep 9, 2024Prompt Injection: A Simple ExplanationThis is security vulnerability that targets AI and machine learning system.Here malicious prompt manipulates model's behavior.Its aims to get sensitive information or executing unauthorized instructions. Types of Prompt Injections: Direct Prompt Inj...26 readspromptinjectionsAdd a thoughtful commentNo comments yetBe the first to start the conversation.