Tokenization
From Wikipedia, the free encyclopedia
Tokenization may refer to:
- Tokenization (lexical analysis), in language processing
- Tokenization (data security), in data security
- Word segmentation
This disambiguation page lists articles associated with the title Tokenization. If an internal link led you here, you may wish to change the link to point directly to the intended article.