GGUF (f16/Q4_K_M) version now available on Ollama / Thanks for the model!


Hi Andrei,

Thank you very much for this excellent Relation Extraction model for Romanian. It is an impressive and much-needed contribution to the Romanian NLP space!

To make it more accessible to the community using Ollama, I have performed GGUF conversions (high-precision F16 and a Q4_K_M quantization) and pushed them here:
https://ollama.com/doitmagic/qwen3-ro-rel-extract

I hope this helps others who want to run your fine-tuned model locally with ease. Keep up the great work!

Best regards,
Razvan (doitmagic)
