Active filters: sea
(repo id missing) • Text Generation • 7B • Updated • 8.3k downloads • 68 likes
mlx-community/SeaLLM-7B-v2-4bit-mlx • Updated • 8 downloads • 3 likes
LoneStriker/SeaLLM-7B-v2-GGUF • 7B • Updated • 122 downloads • 6 likes
LoneStriker/SeaLLM-7B-v2-3.0bpw-h6-exl2 • Text Generation • Updated • 1
LoneStriker/SeaLLM-7B-v2-4.0bpw-h6-exl2 • Text Generation • Updated • 2
LoneStriker/SeaLLM-7B-v2-5.0bpw-h6-exl2 • Text Generation • Updated
LoneStriker/SeaLLM-7B-v2-6.0bpw-h6-exl2 • Text Generation • Updated • 1
LoneStriker/SeaLLM-7B-v2-8.0bpw-h8-exl2 • Text Generation • Updated • 1
LoneStriker/SeaLLM-7B-v2-AWQ • Text Generation • 7B • Updated • 1
(repo id missing) • Text Generation • 8B • Updated • 131 downloads • 28 likes
(repo id missing) • Text Generation • 4B • Updated • 80 downloads • 6 likes
(repo id missing) • Text Generation • 2B • Updated • 69 downloads • 8 likes
(repo id missing) • Text Generation • 0.6B • Updated • 107 downloads • 9 likes
(repo id missing) • Text Generation • 8B • Updated • 68 downloads • 7 likes
(repo id missing) • Text Generation • 4B • Updated • 39 downloads • 2 likes
(repo id missing) • Text Generation • 2B • Updated • 11 downloads • 5 likes
(repo id missing) • Text Generation • 0.6B • Updated • 53 downloads • 7 likes
sail/Sailor-1.8B-Chat-gguf • 2B • Updated • 268 downloads • 3 likes
sail/Sailor-0.5B-Chat-gguf • 0.6B • Updated • 289 downloads • 4 likes
(repo id missing) • 4B • Updated • 240 downloads • 3 likes
(repo id missing) • 8B • Updated • 229 downloads • 5 likes
(repo id missing) • Text Generation • 9B • Updated • 10.4k downloads • 50 likes
SeaLLMs/SeaLLM-7B-v2.5-GGUF • 9B • Updated • 45 downloads • 8 likes
SeaLLMs/SeaLLM-7B-v2.5-mlx-quantized • Text Generation • 2B • Updated • 5 downloads • 2 likes
NikolayKozloff/Sailor-7B-Q8_0-GGUF • 8B • Updated • 6 downloads • 1 like
QuantFactory/SeaLLM-7B-v2.5-GGUF • Text Generation • 9B • Updated • 85 downloads • 1 like
QuantFactory/SeaLLM-7B-v2-GGUF • Text Generation • 7B • Updated • 82 downloads • 1 like
(repo id missing) • Image-to-Text • 8B • Updated • 9 downloads • 5 likes
NghiemAbe/SeaLLM-7B-v2.5-AWQ • Text Generation • Updated
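
A listing like the one above can also be reproduced programmatically. Below is a minimal sketch, assuming the `huggingface_hub` Python client is installed, that searches the Hub for models matching the same "sea" query and prints each repo id with its task, download count, and like count; exact field availability can vary with the client version.

```python
# Minimal sketch: reproduce a "sea" model search against the Hugging Face Hub.
# Assumes `pip install huggingface_hub`; no authentication is needed for
# public listings.
from huggingface_hub import HfApi

api = HfApi()
models = api.list_models(
    search="sea",      # same free-text filter as the listing above
    sort="downloads",  # order by download count
    direction=-1,      # descending
    limit=25,          # roughly one page of results
)

for m in models:
    print(f"{m.id} • {m.pipeline_tag} • {m.downloads} downloads • {m.likes} likes")
```

The same call also accepts a `pipeline_tag` argument (e.g. `pipeline_tag="text-generation"`) to narrow the results to the Text Generation entries shown here.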