Byte-Level BPE
Byte-Level BPE: Unicode-agnostic tokenization. Handles any character and out-of-vocabulary words. Balances efficiency and representation.
Jun 30, 2024 · 6 min read


Subword Regularization with BPE
Stochastic tokenization that improves model robustness. Applicable in BPE pre-training and fine-tuning. Balances consistency and variability.
Jun 30, 2024 · 4 min read


Dynamic BPE
Dynamic BPE: Adaptive tokenization for pre-training and fine-tuning. Balances flexibility and consistency.
Jun 30, 2024 · 4 min read


BPE Dropout
BPE Dropout: Stochastic subword segmentation. Applies dropout to merges during tokenization. Improves model robustness and generalization.
Jun 30, 2024 · 4 min read
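The core idea of BPE dropout can be sketched in a few lines: during segmentation, each applicable merge is skipped with probability p, yielding different subword splits of the same word across training steps. This is a minimal illustration, not the article's implementation; the merge table and function names here are hypothetical.

```python
import random

def segment_with_dropout(word, merges, p=0.1, rng=None):
    """Greedily apply ranked BPE merges, randomly skipping each candidate
    merge with probability p (BPE dropout). With p=0 this is plain BPE."""
    rng = rng or random.Random(0)
    symbols = list(word)
    for pair in merges:  # merges are assumed to be in learned priority order
        i, out = 0, []
        while i < len(symbols):
            if (i + 1 < len(symbols)
                    and (symbols[i], symbols[i + 1]) == pair
                    and rng.random() >= p):  # drop this merge with prob. p
                out.append(symbols[i] + symbols[i + 1])
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        symbols = out
    return symbols

# Hypothetical merge table, highest priority first.
merges = [("l", "o"), ("lo", "w"), ("low", "e")]
print(segment_with_dropout("lowest", merges, p=0.0))  # ['lowe', 's', 't']
```

Setting p=1.0 disables all merges and falls back to character-level segmentation, which is why dropout exposes the model to many segmentations of the same word.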


WordPiece Tokenization: A BPE Variant
WordPiece Tokenization: Subword segmentation for NLP. Builds a vocabulary from frequent subwords and handles rare words.
Jun 28, 2024 · 11 min read


Byte Pair Encoding: Cracking the Subword Code
Byte Pair Tokenization: Efficient subword segmentation. Merges frequent character pairs, handles unseen words, scales to sentences.
Jun 28, 2024 · 7 min read
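The merge loop at the heart of BPE can be sketched briefly: count adjacent symbol pairs over the corpus, fuse the most frequent pair into a new symbol, and repeat. This is a toy sketch on a tiny corpus, not the article's code; function names and the corpus are illustrative.

```python
from collections import Counter

def most_frequent_pair(corpus):
    """Count adjacent symbol pairs across all words; return the most common."""
    pairs = Counter()
    for word in corpus:
        for a, b in zip(word, word[1:]):
            pairs[(a, b)] += 1
    return pairs.most_common(1)[0][0] if pairs else None

def merge_pair(corpus, pair):
    """Replace every occurrence of `pair` with the fused symbol."""
    merged = []
    for word in corpus:
        out, i = [], 0
        while i < len(word):
            if i + 1 < len(word) and (word[i], word[i + 1]) == pair:
                out.append(word[i] + word[i + 1])
                i += 2
            else:
                out.append(word[i])
                i += 1
        merged.append(out)
    return merged

# Toy corpus: each word starts as a list of single-character symbols.
corpus = [list("lower"), list("lowest"), list("low")]
for _ in range(3):  # three merge steps
    corpus = merge_pair(corpus, most_frequent_pair(corpus))
print(corpus)  # [['lowe', 'r'], ['lowe', 's', 't'], ['low']]
```

Each learned merge becomes a vocabulary entry, so unseen words at inference time decompose into known subwords rather than a single out-of-vocabulary token.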