This resonates deeply. We hit the same wall building automation workflows — a single monolithic prompt trying to handle every edge case becomes brittle fast. The shift to composable skills is essentially the same principle as microservices vs monoliths, but for LLM orchestration.

One pattern that's worked well: treating each skill as a self-contained unit with its own context window budget, clear input/output contracts, and fallback behavior. Instead of one 8k-token mega-prompt, you get five 1k-token focused skills that the orchestrator chains based on intent classification.

The debugging story also improves dramatically — when something breaks, you know exactly which skill misfired instead of hunting through a wall of instructions.
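To make the pattern concrete, here's a minimal sketch of what a skill-as-unit design could look like. All names here (`Skill`, `Orchestrator`, `classify_intent`) are hypothetical, and the intent classifier is a keyword stub standing in for an LLM call — this is an illustration of the structure, not a real framework.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Skill:
    """A self-contained unit: its own budget, contract, and fallback."""
    name: str
    token_budget: int                 # per-skill context window budget
    run: Callable[[str], str]         # input/output contract: str -> str
    fallback: Callable[[str], str]    # behavior when the skill can't run

    def invoke(self, payload: str) -> str:
        # Rough token estimate (~4 chars per token); enforce the budget,
        # and degrade to the fallback instead of failing the whole flow.
        if len(payload) // 4 > self.token_budget:
            return self.fallback(payload)
        try:
            return self.run(payload)
        except Exception:
            return self.fallback(payload)

def classify_intent(request: str) -> str:
    # Stand-in for an LLM-based intent classifier.
    return "summarize" if "summarize" in request.lower() else "extract"

class Orchestrator:
    """Routes each request to one focused skill by classified intent."""
    def __init__(self, skills: dict[str, Skill]):
        self.skills = skills

    def handle(self, request: str) -> str:
        return self.skills[classify_intent(request)].invoke(request)

# Five 1k-token skills would register the same way; two shown here.
skills = {
    "summarize": Skill("summarize", 1000,
                       lambda t: f"summary of: {t}",
                       lambda t: "summary unavailable"),
    "extract": Skill("extract", 1000,
                     lambda t: f"entities in: {t}",
                     lambda t: "extraction unavailable"),
}
orchestrator = Orchestrator(skills)
```

The debugging upside falls out of the structure: a failure surfaces inside one named skill's `invoke`, not somewhere inside a monolithic prompt.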
BridgeXAPI
This really clicked for me. It reminds me of how backend systems moved away from one big service toward smaller pieces that each do one thing well. The idea that context is something you have to actively manage, rather than just keep adding to, also changes how you approach the whole thing. Feels like the hard part isn't prompting anymore — it's how you structure everything around it.