The IT5 models are a family of encoder-decoder models based on the T5 architecture and pre-trained on a large Italian web corpus. They were introduced in the paper "IT5: Text-to-text Pretraining for Italian Language Understanding and Generation" by Gabriele Sarti and Malvina Nissim.

In this demo you can evaluate the full set of fine-tuned IT5 models and their multilingual counterparts on several sequence-to-sequence tasks for the Italian language (see the examples below).

📄 Paper: https://aclanthology.org/2024.lrec-main.823/

💻 Code & Data: https://github.com/gsarti/it5

🤗 Checkpoints: https://huggingface.co/collections/gsarti/it5-lrec-coling-2024-6600468041d8fee2c42021c8
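
The fine-tuned checkpoints can also be run locally with the `transformers` library. The sketch below is a minimal example, not part of the demo itself; the model ID (`gsarti/it5-base-news-summarization`) and the input text are illustrative, so swap in any checkpoint from the collection above and a prompt matching its task.

```python
# Minimal sketch: loading and running one fine-tuned IT5 checkpoint locally.
# The model ID and input text below are illustrative examples.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "gsarti/it5-base-news-summarization"  # assumed example checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Italian input text to summarize (placeholder content)
text = "Il governo ha approvato oggi una nuova legge sul clima ..."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

# Generate the output sequence and decode it back to text
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```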