HUGE 🔥 LLaMA 2 with 32K Context Length
Jul 30, 2023 · 8 min read · A LLaMA 2 model with a 32k context window is here. Yes, you read that right: 32,000 tokens can be provided as input or generated as output. This is a huge achievement for the open source world because now the sho...