nb-tokenizers-spm / wikipedia / de_64000_bpe.sp.model
versae · Adding tokenizers models and vocabs (commit 0ed8b17)
This file is stored with Xet. It is too big to display, but you can still download it.

Large File Pointer Details (raw pointer file)

SHA256: cadd45d3847e43fc33f3bf1fd9f694e5fba65ccdf46644f1ce4ee947cef0caa4
Pointer size: 132 bytes
Size of remote file: 1.31 MB
Xet hash: 697e4acbd718f25222ad8642c7e9fbedaef57ee1f9dc0e9dc32a5c8154de645b
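
The SHA256 above identifies the resolved file contents, so a download can be checked against it. Below is a minimal sketch; the repository namespace "versae/nb-tokenizers-spm" is an assumption (the page only shows the repo name and the uploader), and the repo may need repo_type="dataset" depending on how it is hosted.

```python
import hashlib

from huggingface_hub import hf_hub_download

# Hypothetical repo id -- only "nb-tokenizers-spm" and the uploader "versae"
# are visible on this page; adjust the namespace (and repo_type) as needed.
REPO_ID = "versae/nb-tokenizers-spm"
EXPECTED_SHA256 = "cadd45d3847e43fc33f3bf1fd9f694e5fba65ccdf46644f1ce4ee947cef0caa4"

# Resolve the pointer and download the actual ~1.31 MB model file.
path = hf_hub_download(
    repo_id=REPO_ID,
    filename="de_64000_bpe.sp.model",
    subfolder="wikipedia",
)

# Verify the downloaded bytes against the SHA256 listed in the pointer details.
with open(path, "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()
assert digest == EXPECTED_SHA256, f"checksum mismatch: {digest}"
print("SHA256 verified:", path)
```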

Xet efficiently stores large files inside Git by splitting them into unique chunks, which accelerates uploads and downloads.
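
Judging by its name, de_64000_bpe.sp.model appears to be a SentencePiece BPE model with a 64,000-piece vocabulary trained on German Wikipedia text. A sketch of loading it with the sentencepiece library, assuming the file has already been downloaded to the local path below:

```python
import sentencepiece as spm

# Assumed local path to the downloaded file (see the download sketch above).
model_path = "de_64000_bpe.sp.model"

# Load the SentencePiece model.
sp = spm.SentencePieceProcessor(model_file=model_path)

# The filename suggests a 64k BPE vocabulary; confirm the piece count.
print("vocab size:", sp.get_piece_size())

# Tokenize a short German sentence into subword pieces and ids.
print(sp.encode("Berlin ist die Hauptstadt Deutschlands.", out_type=str))
print(sp.encode("Berlin ist die Hauptstadt Deutschlands.", out_type=int))
```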