TRUST-RAD: New dataset for reliable AI assistance tools in radiology published

The TRUST-RAD project team has published a comprehensive dataset for the development of AI assistance systems for X-ray image interpretation. The RadVLM Instruction Dataset is now available to the scientific community on the PhysioNet platform.

The dataset comprises over 1.1 million image-instruction pairs from chest X-rays and covers a range of radiological tasks, from diagnosis and abnormality detection to the localization of anatomical structures. It also contains around 89,000 dialogue-based conversations that simulate realistic interactions between medical staff and AI systems.
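For illustration, an image-instruction pair and a dialogue entry in a dataset of this kind might be structured roughly as sketched below. The field names and example values are assumptions for illustration only, not the actual RadVLM schema.

```python
# Hypothetical sketch of an image-instruction pair and a dialogue entry
# in an instruction-tuning dataset for chest X-ray interpretation.
# Field names ("image", "task", "instruction", "answer", "conversation")
# are illustrative assumptions, not the actual RadVLM format.

instruction_pair = {
    "image": "chest_xray_00123.jpg",       # path to the chest X-ray image
    "task": "abnormality_detection",       # e.g. diagnosis, localization, ...
    "instruction": "List any abnormalities visible in this chest X-ray.",
    "answer": "Cardiomegaly and a small left pleural effusion.",
}

dialogue_example = {
    "image": "chest_xray_00456.jpg",
    "conversation": [
        {"role": "user", "content": "Is there evidence of pneumonia?"},
        {"role": "assistant", "content": "There is a right lower lobe opacity consistent with pneumonia."},
        {"role": "user", "content": "Where exactly is it located?"},
        {"role": "assistant", "content": "In the right lower lobe, adjacent to the diaphragm."},
    ],
}
```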

The data was compiled from several publicly available medical image archives and prepared for training AI models. Of particular relevance is that it enables the development of dialogue-capable assistance systems that can respond flexibly to a wide variety of questions.

In addition to the dataset, the RadVLM Model is now also available via PhysioNet.

Furthermore, the TRUST-RAD team has secured a compute grant of 500,000 GPU hours through the Swiss AI Initiative for the project “3D Vision Language Model For Radiology”.

TRUST-RAD is funded under the third Rapid Action Call. The project aims to develop AI tools for radiology that are not only powerful but also reliable and transparent in how they work.