
REALM on the AI Act

After a long and turbulent legislative process, the AI Act is about to be passed. On 8 December 2023, the EU institutions reached a final compromise on the text of this groundbreaking legislation. The AI Act is a risk-based regulation that distinguishes four types of AI technologies based on the level of risk they may pose to society.


Some AI applications will be prohibited because they pose unacceptable risks to societal values (e.g. social scoring, real-time facial recognition). A limited number of AI technologies will be considered high-risk: those that may have a detrimental impact on human safety and fundamental rights. They will be subject to a conformity assessment before being placed on the EU market.

Certain AI applications that pose transparency-related risks will be subject to disclosure requirements, so that users know they are dealing with a machine and not a human. The majority of AI applications will be categorised as minimal risk; their development and use will remain subject to existing legislation, and their providers will be encouraged to adopt voluntary codes of conduct governing their use in the market.

The AI Act also addresses general-purpose AI by imposing transparency obligations on its providers. In addition, general-purpose AI models trained with particularly high computing power will be subject to risk assessment and mitigation requirements because of the systemic risks they may pose.

As a comprehensive regulation, the AI Act will also apply to the healthcare sector. Medical devices using AI technologies will, in principle, be categorised as high-risk, so both the AI Act and the MDR/IVDR will apply to the medical technology sector. The AI Act takes a value-based and ethical approach to AI applications and will impose additional regulatory requirements on AI-based medical devices.

Compliance with the requirements of the AI Act will be assessed as part of the conformity assessment under the MDR. This means that a unified conformity assessment process will be implemented, integrating the requirements of both the MDR and the AI Act and consolidating the certification process.

For REALM, the adoption of the AI Act is a significant yet expected challenge. All AI Act regulatory requirements concerning safety, accuracy, reliability, transparency, and ethical AI will be taken into account in the project's outcomes and deliverables.