Tokenization
Tokenization, also known as lexical analysis, is the process of converting a sequence of characters into tokens: strings with an assigned, identifying meaning. The program that performs lexical analysis is called a lexical analyzer, a tokenizer, or a scanner, although "scanner" is also used more narrowly for just the first stage of a lexical analyzer. Lexical analyzers are usually combined with parsers, which analyze the syntax of programming languages, web pages, and so on.
Tokenization divides an input string into sections, classifies them, and then passes the resulting tokens on to some other form of processing; it can be considered a subtask of parsing the input.