Decoding AI Jargons with Nikhil
Tokenization :-
Tokenization is the process of turning text into a sequence of tokens, which can be words, subwords, or characters. An LLM seems to understand Hindi, English, and Hinglish, but what actually happens is that every AI model has its own tokenization mechanism that converts the input text into tokens before processing it.
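To make the idea concrete, here is a minimal sketch of a word-level tokenizer with a toy vocabulary (the `vocab` dictionary and `<unk>` token are illustrative assumptions; real LLMs use learned subword schemes such as BPE with vocabularies of tens of thousands of tokens):

```python
# Illustrative word-level tokenizer (toy example, not what production LLMs use).
def tokenize(text, vocab):
    """Split text on whitespace and map each word to a token id.
    Words not in the vocabulary map to the <unk> id."""
    return [vocab.get(word, vocab["<unk>"]) for word in text.lower().split()]

# Toy vocabulary: a real model's vocabulary is learned from data and far larger.
vocab = {"<unk>": 0, "hello": 1, "world": 2, "namaste": 3}

print(tokenize("Hello world", vocab))     # [1, 2]
print(tokenize("Namaste duniya", vocab))  # [3, 0] -- "duniya" is out of vocabulary
```

Because the vocabulary is learned from training data, a model trained mostly on English may split a Hindi or Hinglish word into many small subword pieces, which is why each model handles mixed-language text differently.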