Decoding AI Jargons with Nikhil
Apr 8, 2025 · 3 min read

**Tokenization:** Tokenization is the process of turning text into a sequence of tokens, which can be words, subwords, or characters. When an LLM appears to understand Hindi, English, or Hinglish, what actually happens is that every AI model has its own tokenization mechanism: the input text is split according to the model's learned vocabulary and mapped to token IDs before the model ever sees it.
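To make this concrete, here is a toy greedy longest-match subword tokenizer. The vocabulary below is entirely made up for illustration; real models learn their vocabularies with algorithms like BPE, and their merges and IDs will differ. The point is just to show how one mechanism can handle Hindi, English, and Hinglish text alike, falling back to smaller pieces for words it has never seen whole.

```python
# Toy subword tokenizer: greedy longest-match against a tiny,
# hand-made vocabulary (hypothetical -- real vocabularies are learned).
VOCAB = {"नम", "स्ते", "hello", "ya", "ar", " "}

def tokenize(text):
    tokens = []
    i = 0
    while i < len(text):
        # Try the longest slice starting at i that exists in the vocabulary.
        for j in range(len(text), i, -1):
            if text[i:j] in VOCAB:
                tokens.append(text[i:j])
                i = j
                break
        else:
            # Unknown character: emit it as a single-character token.
            tokens.append(text[i])
            i += 1
    return tokens

print(tokenize("hello yaar"))  # → ['hello', ' ', 'ya', 'ar']
print(tokenize("नमस्ते"))       # → ['नम', 'स्ते']
```

Notice that "yaar" is not in the vocabulary, so it gets split into the subwords "ya" and "ar"; the Hindi greeting splits into its own learned pieces. A real model then looks up an ID and an embedding for each token, which is why the same network can process any script its tokenizer covers.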



