---
pretty_name: Revvity-25
license: cc-by-nc-4.0
modalities:
- image
task_categories:
- image-segmentation
- object-detection
task_ids:
- instance-segmentation
annotations_creators:
- expert-generated
size_categories:
- n<1K
language:
- en
tags:
- image
- json
- computer-vision
- transformers
- instance-segmentation
- object-detection
- coco
- unet
- microscopy
- cells
- biomedical
- cell-segmentation
- biomedical-imaging
- microscopy-images
- brightfield
- cancer-cells
- semantic-segmentation
- datasets
- fiftyone
- cell
- amodal
---

# Revvity-25 (CVPRW 2025)

[![Paper](https://img.shields.io/badge/Paper-CVPRW%202025-brightgreen)](https://www.arxiv.org/abs/2508.01928)   [![GitHub](https://img.shields.io/badge/GitHub-IAUNet-black?logo=github)](https://github.com/SlavkoPrytula/IAUNet)  [![GitHub Stars](https://img.shields.io/github/stars/SlavkoPrytula/IAUNet?style=social)](https://github.com/SlavkoPrytula/IAUNet)   [![Project WebPage](https://img.shields.io/badge/Project-webpage-%23fc4d5d)](https://slavkoprytula.github.io/IAUNet/)
**Yaroslav Prytula**<sup>1,2</sup> | **Illia Tsiporenko**<sup>1</sup> | **Ali Zeynalli**<sup>1</sup> | **Dmytro Fishman**<sup>1,3</sup>

<sup>1</sup>Institute of Computer Science, University of Tartu, <sup>2</sup>Ukrainian Catholic University, <sup>3</sup>STACC OÜ, Tartu, Estonia
*Revvity-25 preview*
🔥 Paper: [https://arxiv.org/abs/2508.01928](https://arxiv.org/abs/2508.01928) \
⭐️ GitHub: [https://github.com/SlavkoPrytula/IAUNet](https://github.com/SlavkoPrytula/IAUNet) \
🌐 Project page: [https://slavkoprytula.github.io/IAUNet/](https://slavkoprytula.github.io/IAUNet/)

We present the **Revvity-25 Full Cell Segmentation Dataset**, a 2025 benchmark designed to advance cell segmentation research. A key contribution of our paper **[IAUNet: Instance-Aware U-Net](https://www.arxiv.org/abs/2508.01928)** is this novel cell instance segmentation dataset, `Revvity-25`. It includes `110` high-resolution **`1080 x 1080` brightfield images**, each containing on average `27` manually labeled and expert-validated cancer cells, for a total of `2937` annotated cells. To our knowledge, this is the first dataset with accurate and detailed annotations of cell borders and overlaps: each cell is annotated with an average of `60` polygon points, reaching up to `400` points for more complex structures. The `Revvity-25` dataset provides a unique resource that opens new possibilities for testing and benchmarking models for modal and amodal semantic and instance segmentation.
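Since the annotations are stored as COCO-style JSON, they can be inspected with nothing but the Python standard library. The sketch below is a minimal example of counting cells per image and polygon vertices per cell; the tiny in-memory sample is hypothetical and only illustrates the expected structure (in practice you would `json.load` `annotations/train.json` instead):

```python
import json

# Hypothetical COCO-style sample for illustration; real annotations live in
# annotations/train.json and annotations/valid.json.
coco = {
    "images": [{"id": 1, "file_name": "sample.png", "width": 1080, "height": 1080}],
    "annotations": [
        {
            "id": 1,
            "image_id": 1,
            "category_id": 1,
            # COCO convention: each polygon is a flat [x1, y1, x2, y2, ...] list.
            "segmentation": [[10.0, 10.0, 60.0, 10.0, 60.0, 60.0, 10.0, 60.0]],
        }
    ],
    "categories": [{"id": 1, "name": "cell"}],
}

def cells_per_image(coco_dict):
    """Count annotated instances for every image id."""
    counts = {img["id"]: 0 for img in coco_dict["images"]}
    for ann in coco_dict["annotations"]:
        counts[ann["image_id"]] += 1
    return counts

def polygon_points(ann):
    """Total number of (x, y) vertices across all polygons of one annotation."""
    return sum(len(poly) // 2 for poly in ann["segmentation"])

print(cells_per_image(coco))                    # instances per image id
print(polygon_points(coco["annotations"][0]))   # vertices in the first annotation
```

The same two helpers work unchanged on the real files once loaded with `json.load(open("annotations/train.json"))`.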
* You can also check out and download the dataset from our webpage: [Revvity-25](https://bcv.cs.ut.ee/datasets/)

## Directory structure

```
Revvity-25/
├── images/
└── annotations/
    ├── train.json
    └── valid.json
```

---

## Citing Revvity-25

If you use this work in your research, please cite:

```bibtex
@InProceedings{Prytula_2025_CVPR,
    author    = {Prytula, Yaroslav and Tsiporenko, Illia and Zeynalli, Ali and Fishman, Dmytro},
    title     = {IAUNet: Instance-Aware U-Net},
    booktitle = {Proceedings of the Computer Vision and Pattern Recognition Conference (CVPR) Workshops},
    month     = {June},
    year      = {2025},
    pages     = {4739--4748}
}
```

---

## License

[![License: CC BY-NC 4.0](https://img.shields.io/badge/License-CC%20BY--NC%204.0-lightgrey.svg)](https://creativecommons.org/licenses/by-nc/4.0/)

This project is licensed under the **Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0)** license. You are free to share and adapt the work **for non-commercial purposes** as long as you give appropriate credit. For more details, see the [LICENSE](LICENSE) file or visit [Creative Commons](https://creativecommons.org/licenses/by-nc/4.0/).

---

## Contact

📧 [s.prytula@ucu.edu.ua](mailto:s.prytula@ucu.edu.ua) or [yaroslav.prytula@ut.ee](mailto:yaroslav.prytula@ut.ee)

---

## Acknowledgements

This work was supported by [Revvity](https://www.revvity.com/) and funded by the TEM-TA101 grant "Artificial Intelligence for Smart Automation." Computational resources were provided by the High-Performance Computing Cluster at the University of Tartu 🇪🇪. We thank the [Biomedical Computer Vision Lab](https://bcv.cs.ut.ee/) for their invaluable support. We express gratitude to the Armed Forces of Ukraine 🇺🇦 and the bravery of the Ukrainian people for enabling a secure working environment, without which this work would not have been possible.