Update README.md
README.md
license_name: deepseek
license_link: >-
  https://huggingface.co/deepseek-ai/deepseek-coder-33b-instruct/blob/main/LICENSE
---

This is a llamafile for [deepseek-coder-33b-instruct](https://huggingface.co/deepseek-ai/deepseek-coder-33b-instruct).
The quantized GGUF was downloaded straight from [TheBloke](https://huggingface.co/TheBloke/deepseek-coder-33B-instruct-GGUF), and then zipped into a llamafile using [Mozilla's awesome project](https://github.com/Mozilla-Ocho/llamafile).
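For the curious, here's a rough sketch of that packing step, pieced together from memory of the llamafile project's docs; the tool invocation, flags, quant choice, and file names are all assumptions, so check the upstream README before copying it.

```sh
# Assumed packing workflow (verify against the llamafile README):
# start from the prebuilt llamafile binary, then embed the GGUF weights
# plus a .args file holding the default command-line arguments.
cp llamafile deepseek-coder-33b-instruct.llamafile

# .args: one argument per line; -m points at the embedded weights.
printf '%s\n' -m deepseek-coder-33b-instruct.Q4_K_M.gguf > .args

# zipalign is built from the llamafile repo; -j0 stores the weights
# uncompressed so they can be memory-mapped straight out of the archive.
zipalign -j0 deepseek-coder-33b-instruct.llamafile \
  deepseek-coder-33b-instruct.Q4_K_M.gguf .args
```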
It's over 4 GB, so if you want to use it on Windows you'll have to run it from WSL (Windows can't execute files larger than 4 GB).
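From a WSL shell that usually just looks like this (the file name here is an assumption; use whatever you downloaded):

```sh
chmod +x deepseek-coder-33b-instruct.llamafile   # llamafiles are plain executables
./deepseek-coder-33b-instruct.llamafile          # starts the local server
# then open the URL it prints (typically http://localhost:8080) in your browser
```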
WSL note: if you get the error about APE and the recommended command

`sudo sh -c 'echo -1 > /proc/sys/fs/binfmt_misc/WSLInterop'`

doesn't work, the file might be named something else. I had success with

`sudo sh -c 'echo -1 > /proc/sys/fs/binfmt_misc/WSLInterop-late'`

If that fails too, just look in `/proc/sys/fs/binfmt_misc` and see which entries look like `WSLInterop`, then echo a -1 to whatever it's actually called by changing that part of the recommended command.
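In other words, the fallback boils down to this (substitute whatever entry name you actually find; `WSLInterop-late` is just one example):

```sh
# See which WSL interop handlers are registered with binfmt_misc.
ls /proc/sys/fs/binfmt_misc | grep -i wslinterop

# Writing -1 to an entry unregisters it; use the name found above.
sudo sh -c 'echo -1 > /proc/sys/fs/binfmt_misc/WSLInterop-late'
```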
Llamafiles are standalone executables that run an LLM server locally on a variety of operating systems. You just run one, open the chat interface in a browser, and interact. Options can be passed in to expose the API, etc.
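For example, something like the following should bind the server to all interfaces and let you query it over HTTP. The flags and endpoint come from the llama.cpp server that llamafile embeds, so treat them as assumptions and check `--help` on your build.

```sh
# Expose the built-in server on the LAN instead of just localhost.
./deepseek-coder-33b-instruct.llamafile --host 0.0.0.0 --port 8080

# Then hit the completion endpoint from another terminal or machine.
curl http://localhost:8080/completion \
  -H 'Content-Type: application/json' \
  -d '{"prompt": "Write a Python function that reverses a string.", "n_predict": 128}'
```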