Update README.md

README.md (CHANGED)

@@ -1,190 +1,49 @@

**Removed:**
---
license: apache-2.0
---
> **! Important Note !**
>
> The following is the HF transformers implementation of the RWKV-5 Eagle 7B model. **It is meant to be used with the Hugging Face transformers library.**
>
> For the standalone model weights, for use with other RWKV libraries, refer to [here](https://huggingface.co/RWKV/v5-Eagle-7B)
>
> This is not an instruct-tuned model! (soon...)

- [HF Demo](https://huggingface.co/spaces/BlinkDL/RWKV-Gradio-2)
- [Our wiki](https://wiki.rwkv.com)
- [pth model weights](https://huggingface.co/RWKV/v5-Eagle-7B)

#### Running on CPU via HF transformers
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

def generate_prompt(instruction, input=""):
    instruction = instruction.strip().replace('\r\n','\n').replace('\n\n','\n')
    input = input.strip().replace('\r\n','\n').replace('\n\n','\n')
    if input:
        return f"""Instruction: {instruction}

Input: {input}

Response:"""
    else:
        return f"""User: hi

Assistant: Hi. I am your assistant and I will provide expert full response in full details. Please feel free to ask any question and I will always answer it.

User: {instruction}

Assistant:"""

# Load in float32 for CPU inference.
model = AutoModelForCausalLM.from_pretrained("RWKV/HF_v5-Eagle-7B", trust_remote_code=True).to(torch.float32)
tokenizer = AutoTokenizer.from_pretrained("RWKV/HF_v5-Eagle-7B", trust_remote_code=True)

text = "请介绍北京的旅游景点"
prompt = generate_prompt(text)

inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(inputs["input_ids"], max_new_tokens=333, do_sample=True, temperature=1.0, top_p=0.3, top_k=0)
print(tokenizer.decode(output[0].tolist(), skip_special_tokens=True))
```
output:

```shell
User: hi

Assistant: Hi. I am your assistant and I will provide expert full response in full details. Please feel free to ask any question and I will always answer it.

User: 请介绍北京的旅游景点

Assistant: 北京是中国的首都,拥有众多的旅游景点,以下是其中一些著名的景点:
1. 故宫:位于北京市中心,是明清两代的皇宫,内有大量的文物和艺术品。
2. 天安门广场:是中国最著名的广场之一,是中国人民政治协商会议的旧址,也是中国人民政治协商会议的中心。
3. 颐和园:是中国古代皇家园林之一,有着悠久的历史和丰富的文化内涵。
4. 长城:是中国古代的一道长城,全长约万里,是中国最著名的旅游景点之一。
5. 北京大学:是中国著名的高等教育机构之一,有着悠久的历史和丰富的文化内涵。
6. 北京动物园:是中国最大的动物园之一,有着丰富的动物资源和丰富的文化内涵。
7. 故宫博物院:是中国最著名的博物馆之一,收藏了大量的文物和艺术品,是中国最重要的文化遗产之一。
8. 天坛:是中国古代皇家
```

#### Running on GPU via HF transformers

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

def generate_prompt(instruction, input=""):
    instruction = instruction.strip().replace('\r\n','\n').replace('\n\n','\n')
    input = input.strip().replace('\r\n','\n').replace('\n\n','\n')
    if input:
        return f"""Instruction: {instruction}

Input: {input}

Response:"""
    else:
        return f"""User: hi

Assistant: Hi. I am your assistant and I will provide expert full response in full details. Please feel free to ask any question and I will always answer it.

User: {instruction}

Assistant:"""

# Load in float16 and move to the first GPU.
model = AutoModelForCausalLM.from_pretrained("RWKV/HF_v5-Eagle-7B", trust_remote_code=True, torch_dtype=torch.float16).to(0)
tokenizer = AutoTokenizer.from_pretrained("RWKV/HF_v5-Eagle-7B", trust_remote_code=True)

text = "介绍一下大熊猫"
prompt = generate_prompt(text)

inputs = tokenizer(prompt, return_tensors="pt").to(0)
output = model.generate(inputs["input_ids"], max_new_tokens=128, do_sample=True, temperature=1.0, top_p=0.3, top_k=0)
print(tokenizer.decode(output[0].tolist(), skip_special_tokens=True))
```
output:

```shell
User: hi

Assistant: Hi. I am your assistant and I will provide expert full response in full details. Please feel free to ask any question and I will always answer it.

User: 介绍一下大熊猫

Assistant: 大熊猫是一种中国特有的哺乳动物,也是中国的国宝之一。它们的外貌特征是圆形的黑白相间的身体,有着黑色的毛发和白色的耳朵。大熊猫的食物主要是竹子,它们会在竹林中寻找竹子,并且会将竹子放在竹笼中进行储存。大熊猫的寿命约为20至30年,但由于栖息地的丧失和人类活动的
```

#### Batch Inference

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

def generate_prompt(instruction, input=""):
    instruction = instruction.strip().replace('\r\n', '\n').replace('\n\n', '\n')
    input = input.strip().replace('\r\n', '\n').replace('\n\n', '\n')
    if input:
        return f"""Instruction: {instruction}

Input: {input}

Response:"""
    else:
        return f"""User: hi

Assistant: Hi. I am your assistant and I will provide expert full response in full details. Please feel free to ask any question and I will always answer it.

User: {instruction}

Assistant:"""

model = AutoModelForCausalLM.from_pretrained("RWKV/HF_v5-Eagle-7B", trust_remote_code=True).to(torch.float32)
tokenizer = AutoTokenizer.from_pretrained("RWKV/HF_v5-Eagle-7B", trust_remote_code=True)

texts = ["请介绍北京的旅游景点", "介绍一下大熊猫", "乌兰察布"]
prompts = [generate_prompt(text) for text in texts]

# Pad the prompts to a common length so they can be generated as one batch.
inputs = tokenizer(prompts, return_tensors="pt", padding=True)
outputs = model.generate(inputs["input_ids"], max_new_tokens=128, do_sample=True, temperature=1.0, top_p=0.3, top_k=0)

for output in outputs:
    print(tokenizer.decode(output.tolist(), skip_special_tokens=True))
```

output:

```shell
User: hi

Assistant: Hi. I am your assistant and I will provide expert full response in full details. Please feel free to ask any question and I will always answer it.

User: 请介绍北京的旅游景点

Assistant: 北京是中国的首都,拥有丰富的旅游资源和历史文化遗产。以下是一些北京的旅游景点:
1. 故宫:位于北京市中心,是明清两代的皇宫,是中国最大的古代宫殿建筑群之一。
2. 天安门广场:位于北京市中心,是中国最著名的城市广场之一,也是中国最大的城市广场。
3. 颐和
User: hi

Assistant: Hi. I am your assistant and I will provide expert full response in full details. Please feel free to ask any question and I will always answer it.

User: 介绍一下大熊猫

Assistant: 大熊猫是一种生活在中国中部地区的哺乳动物,也是中国的国宝之一。它们的外貌特征是圆形的黑白相间的身体,有着黑色的毛发和圆圆的眼睛。大熊猫是一种濒危物种,目前只有在野外的几个保护区才能看到它们的身影。大熊猫的食物主要是竹子,它们会在竹子上寻找食物,并且可以通
User: hi

Assistant: Hi. I am your assistant and I will provide expert full response in full details. Please feel free to ask any question and I will always answer it.

User: 乌兰察布

Assistant: 乌兰察布是中国新疆维吾尔自治区的一个县级市,位于新疆维吾尔自治区中部,是新疆的第二大城市。乌兰察布市是新疆的第一大城市,也是新疆的重要城市之一。乌兰察布市是新疆的经济中心,也是新疆的重要交通枢纽之一。乌兰察布市的人口约为2.5万人,其中汉族占绝大多数。乌
```
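
The transcripts above are the full decoded sequences, prompt included, because the example decodes each output in its entirety. If only the completions are wanted, the prompt tokens can be sliced off before decoding. A minimal sketch, assuming the same `inputs`/`outputs` variables as the batch example above and the default `generate` behavior of returning the prompt followed by the newly generated tokens:

```python
# generate() returns each padded prompt followed by its new tokens, so
# everything past the shared (padded) prompt length is the completion.
prompt_len = inputs["input_ids"].shape[1]
for output in outputs:
    print(tokenizer.decode(output[prompt_len:].tolist(), skip_special_tokens=True))
```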

**Added:**

---
license: apache-2.0
datasets:
- teknium/OpenHermes-2.5
language:
- en
---

This is a 3B parameter RWKV-v5 ("Eagle") RNN language model, pretrained on multilingual data and fine-tuned on Nous Research's OpenHermes 2.5 chat dataset.

The chat format is identical to the default HuggingFace format, **EXCEPT** that we replace the `<|im_start|>` and `<|im_end|>` special tokens with the unused ASCII codes `"\x16"` and `"\x17"` respectively. This was done to ensure complete compatibility with the standard RWKV World tokenizer.

An example interaction:
```
\x16user
Write a poem about the beauty of mathematics in the style of Emily Dickinson.\x17
\x16assistant
In the heart of Numbers, so precise,
A Beauty dwells - both cool and nice.
An elegance that sings in sums,
And dances to algebraic drums.

Addition's grace, subtraction's might,
Multiplication's sheer delight,
Division's firm yet gentle hand,
All part of this enchanted land.

Infinity, a lover bold,
To whom the stars their secrets told,
While Zero, cloaked in mystery,
Unlocks each mathematical decree.

Fractals bloom like flowers wild,
Geometry, Nature's favored child,
Calculus, with steady gaze,
Maps out life's intricate maze.

Pi, an endless serenade,
In circles' arms forever laid,
And prime numbers, standing lone,
Make the universe their own.

Each equation, theorem scribed,
In parchment old or computer imbibed,
Is but a whisper from the skies,
A cosmic truth in disguise.\x17
```

In the above, newlines are printed unescaped, while the ASCII control codes (`\x16`, `\x17`) are shown escaped for readability.
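
To make the format concrete, here is a minimal sketch of building a prompt in this format and sampling a reply with Hugging Face `transformers`. The repository id and the generation settings are placeholders (assumptions, not taken from this model card); only the `\x16`/`\x17` turn template follows the description above:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "path/to/this-model"  # placeholder -- substitute this repo's id

def chat_prompt(user_message: str) -> str:
    # ChatML-style turns, with \x16 / \x17 standing in for
    # <|im_start|> / <|im_end|> as described above.
    return f"\x16user\n{user_message}\x17\n\x16assistant\n"

model = AutoModelForCausalLM.from_pretrained(MODEL_ID, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)

inputs = tokenizer(chat_prompt("Write a haiku about rivers."), return_tensors="pt")
output = model.generate(inputs["input_ids"], max_new_tokens=128,
                        do_sample=True, temperature=1.0, top_p=0.3)

text = tokenizer.decode(output[0].tolist(), skip_special_tokens=True)
# Cut at the next end-of-turn code; this assumes the control codes survive
# decoding, since they are ordinary tokens rather than special ones.
reply = text.split("\x16assistant\n", 1)[-1].split("\x17", 1)[0]
print(reply)
```

Because `0x16` and `0x17` are otherwise-unused ASCII control codes, the turn delimiters tokenize cleanly under the standard RWKV World tokenizer without adding any special tokens, which is the compatibility property described above.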