---
license: mit
datasets:
- EleutherAI/pile
language:
- en
---

These sparse autoencoders (SAEs) were trained on the output of each of the MLPs in [EleutherAI/pythia-160m](https://huggingface.co/EleutherAI/pythia-160m). We used 8.2 billion tokens from the Pile training set at a context length of 2049. Each SAE has 32,768 latents.
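
For orientation, here is a minimal sketch of the kind of model being described: a sparse autoencoder that maps MLP output activations (hidden size 768 for pythia-160m) into 32,768 latents and linearly decodes them back. The `SparseAutoencoder` class and the ReLU activation below are illustrative assumptions, not the released training code or the exact architecture of these SAEs.

```python
import torch
import torch.nn as nn


class SparseAutoencoder(nn.Module):
    """Encode activations into a wide, sparse latent space,
    then linearly decode them back to the original dimension."""

    def __init__(self, d_model: int = 768, n_latents: int = 32_768):
        super().__init__()
        self.encoder = nn.Linear(d_model, n_latents)
        self.decoder = nn.Linear(n_latents, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        latents = torch.relu(self.encoder(x))  # sparse, non-negative codes
        return self.decoder(latents)


# pythia-160m has a hidden size of 768; 32,768 latents as stated above
sae = SparseAutoencoder()
mlp_out = torch.randn(4, 768)  # stand-in for a batch of MLP output activations
reconstruction = sae(mlp_out)
print(reconstruction.shape)    # torch.Size([4, 768])
```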