Tokenization may refer to:
- Tokenization (lexical analysis) in language processing
- Tokenization (data security) in the field of data security
- Word segmentation
- Tokenism of minorities
This disambiguation page lists articles associated with the title Tokenization.
If an internal link led you here, you may wish to change the link to point directly to the intended article.