CoT → Self-Consistency
If you’ve spent any time prompting large language models in 2025–2026, you’ve probably used (or at least tried) Chain-of-Thought (CoT) prompting.
You give the model a few examples that demonstrate step-by-step reasoning before the final answer, and the model imitates that pattern on the new question.
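Self-consistency builds on CoT by sampling several reasoning paths for the same question and taking a majority vote over the final answers. Here's a minimal sketch of that voting loop; the `fake_llm_sample` stub is a hypothetical stand-in for a real, stochastic model call, which the original doesn't specify:

```python
from collections import Counter
from itertools import cycle

def self_consistency(sample_fn, prompt, n=5):
    """Sample n chain-of-thought completions and majority-vote on the final answers."""
    answers = [sample_fn(prompt) for _ in range(n)]
    return Counter(answers).most_common(1)[0][0]

# Hypothetical stand-in for an LLM call: each sample yields a final answer string.
# One "reasoning path" slips to "41"; the vote recovers the correct answer.
_samples = cycle(["42", "42", "41", "42", "42"])
def fake_llm_sample(prompt):
    return next(_samples)

print(self_consistency(fake_llm_sample, "What is 6 * 7?", n=5))  # -> 42
```

In practice `sample_fn` would call a model at a nonzero temperature and extract the final answer from each completion; the vote is only over those extracted answers, not over the reasoning text itself.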