Is Attention Obsolete?
Dec 31, 2025 · 6 min read

Grassmann Flows are a new “attention‑free” way to build models like Transformers, replacing the usual self‑attention matrix with geometry while still handling long sequences efficiently and (potentially) more interpretably. Self‑attention sits at ...