On the new larger context windows
The latest LLMs now offer context windows of up to ~1 million tokens. There are many situations where this larger window is useful:
Context engineering: injecting rich system/user context without juggling state
Long documents: reading...
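As a minimal sketch of what "fits in a ~1M-token window" means in practice, assuming a rough characters-per-token heuristic (a stand-in; real tokenizers such as tiktoken give exact counts, and ratios vary by language and content):

```python
def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token for English prose.
    # This is an assumption, not a real tokenizer.
    return max(1, len(text) // 4)

def fits_in_context(text: str, context_window: int = 1_000_000) -> bool:
    # Leave ~10% headroom for the system prompt and the model's reply.
    return estimate_tokens(text) <= int(context_window * 0.9)

# A ~2.5M-character document estimates to ~625k tokens: comfortably inside.
doc = "word " * 500_000
print(fits_in_context(doc))  # True
```

The headroom fraction is a hypothetical budget choice; the point is that "fits" should account for everything sharing the window, not just the document itself.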
engineering.fractional.ai · 2 min read