nb-tokenizers-spm / wikipedia /de_32000_bpe.100extra.sp.model
Commit: Adding tokenizers models and vocabs (0ed8b17)
This file is stored with Xet. It is too large to display, but it can still be downloaded.

Large File Pointer Details (raw pointer file)

SHA256: 304c9ca7bb2c7581373f6220b5141544ac04119ad155332a7830f5dacb894004
Pointer size: 131 bytes
Size of remote file: 734 kB
Xet hash: 7b9e77a621ea9d4ca502f01e6d9fe29badfad749f4e646676816cf78c69ec204
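After downloading the remote file, its integrity can be checked against the SHA256 listed above. The following is a minimal sketch of such a check; the local filename is an assumption taken from the repository path, not something the pointer file specifies.

```python
import hashlib

# Expected SHA256 of the full model file, as listed in the pointer details above.
EXPECTED_SHA256 = "304c9ca7bb2c7581373f6220b5141544ac04119ad155332a7830f5dacb894004"


def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream the file through SHA-256 in chunks so large files are not read fully into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


# Hypothetical local path after download:
# assert sha256_of("de_32000_bpe.100extra.sp.model") == EXPECTED_SHA256
```

The streaming read matters for files stored this way: the pointer itself is only 131 bytes, while the remote content (here 734 kB, but often gigabytes) should be hashed incrementally.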

Xet efficiently stores large files inside Git by splitting them into unique chunks, which deduplicates storage and accelerates uploads and downloads.