nb-tokenizers-spm / wikipedia / da_32000_bpe.100extra.sp.model
Commit 0ed8b17 by versae: Adding tokenizers models and vocabs
This file is stored with Xet. It is too large to display in the browser, but it can still be downloaded.

Large File Pointer Details (raw pointer file)

SHA256: cc6d859329900539434a3a060eefb6576239830ce2f574e3b497872827ea1137
Pointer size: 131 bytes
Size of remote file: 758 kB
Xet hash: de4804d98c294d1802263fa623407fd2b6a52150346237e36c1f7f1ed7b8e760

Xet stores large files efficiently inside Git by splitting them into unique chunks, which accelerates uploads and downloads.
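
Going by the path and filename, the file is a SentencePiece BPE model trained on Danish Wikipedia with a 32,000-token vocabulary. The snippet below is a minimal sketch of loading it with the `sentencepiece` Python package, assuming a local copy of the file; the sample sentence is purely illustrative.

```python
# Load the SentencePiece model and tokenize a Danish sentence.
# Requires: pip install sentencepiece
import sentencepiece as spm

sp = spm.SentencePieceProcessor(model_file="da_32000_bpe.100extra.sp.model")

text = "Danmark er et land i Skandinavien."  # illustrative sample sentence
pieces = sp.encode(text, out_type=str)   # subword pieces
ids = sp.encode(text, out_type=int)      # vocabulary ids

print(pieces)
print(ids)
print(sp.decode(ids))  # decodes back to the original text
```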