What legal issues should a provider of an AI system supporting the diagnosis of lung diseases address? What obligations arise for them from the GDPR, and which from the AI Act? How does the AI Act change the certification requirements for medical devices under the MDR and IVDR? These questions are addressed in this article.


Processing of Personal Data

According to the principle of data minimization, the scope of processed personal data should be limited to the necessary minimum. Therefore, where possible, anonymization or pseudonymization of data should be used for training AI models.
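To make this concrete, pseudonymization of a training dataset can be as simple as replacing direct identifiers with keyed pseudonyms before the data reaches the training pipeline. Below is a minimal Python sketch; the column names (patient_id, name, national_id) and the key handling are illustrative assumptions, not a reference implementation:

```python
import hashlib
import hmac

import pandas as pd

# Secret key kept separately from the dataset (e.g., in a key management
# service); without it, the pseudonyms cannot be traced back to patients.
SECRET_KEY = b"replace-me-load-from-a-key-management-service"

def pseudonymize(value: str) -> str:
    """Replace a direct identifier with a keyed HMAC-SHA256 pseudonym."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

# Hypothetical direct identifiers; adapt to the actual schema of the dataset.
DROP_COLUMNS = ["name", "national_id"]

def prepare_training_data(df: pd.DataFrame) -> pd.DataFrame:
    df = df.copy()
    # Keep a stable pseudonym so that records of one patient remain linkable.
    df["patient_id"] = df["patient_id"].astype(str).map(pseudonymize)
    # Data minimization: drop identifiers the model does not need at all.
    return df.drop(columns=[c for c in DROP_COLUMNS if c in df.columns])
```

It should be stressed that pseudonymized data, unlike effectively anonymized data, remains personal data under the GDPR, because re-identification is still possible for whoever holds the key.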

What if large datasets containing personal health data are necessary to train the AI model? The provider must then ensure adequate protection of the processed data, in particular as required by the GDPR. The following obligations must be kept in mind:

In the context of data processing and data governance, the AI Act (Art. 10) requires that high-risk AI systems be developed using high-quality datasets for training, validation, and testing. These datasets must be subject to appropriate data governance practices, covering, among other things, data collection processes, data preparation, an examination of possible biases, and the identification of data gaps. Datasets should be relevant, sufficiently representative and, to the best extent possible, free of errors and complete. The specific context in which the AI system will be used should also be taken into account. In certain cases, providers may process special categories of personal data in order to detect and correct biases, but only under strict conditions protecting the rights and freedoms of the individuals concerned.
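As a loose illustration of what examining a dataset "in view of possible biases" and data gaps can mean in practice, a provider might compare the composition of the training set against the intended patient population. The sketch below is a simplified, assumption-laden example; the column name, reference shares, and tolerance threshold are all hypothetical and would in practice follow from the system's intended purpose and clinical context:

```python
import pandas as pd

# Hypothetical reference distribution for the intended patient population
# (in practice derived from clinical or epidemiological statistics).
EXPECTED_SHARE = {"female": 0.50, "male": 0.50}
MAX_DEVIATION = 0.10  # flag groups over-/under-represented by more than 10 pp

def audit_representativeness(df: pd.DataFrame, column: str = "sex") -> list[str]:
    """Return a warning for each group whose share deviates from the reference."""
    observed = df[column].value_counts(normalize=True)
    warnings = []
    for group, expected in EXPECTED_SHARE.items():
        share = float(observed.get(group, 0.0))
        if abs(share - expected) > MAX_DEVIATION:
            warnings.append(
                f"Group '{group}': {share:.0%} of training data "
                f"vs. {expected:.0%} expected in the target population."
            )
    return warnings
```

Checks of this kind do not replace the documented data governance practices the AI Act requires, but they show how those requirements can be operationalized in the development pipeline.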

The AI system provider should also look at data processing from the perspective of the system's users. In this way, the provider can offer additional value by supporting users' own compliance efforts. For example, the provider may prepare a model DPIA (data protection impact assessment) that clients can use when conducting their own assessments.


Compliance with AI Act Requirements for High-Risk Systems

Firstly, AI systems are high-risk where they are medical devices (under the MDR) or in vitro diagnostic medical devices (under the IVDR) that are subject to conformity assessment by a notified body.

Secondly, specific use cases may qualify as systems listed in Annex III to the AI Act. AI systems listed in Annex III are considered high-risk unless they pose no significant risk of harm to the health, safety, or fundamental rights of individuals. A provider who takes the view that its AI system is not high-risk must document that assessment before placing the system on the market or putting it into service.

Examples: AI systems used for emergency healthcare patient triage, or to establish priority in the dispatching of emergency medical services (Annex III, point 5(d)).

High-risk AI system providers have a number of obligations, in particular to:

1. Implement a quality management system compliant with the AI Act (Art. 17).

2. Draw up technical documentation (Art. 11) and keep it for 10 years after the system is placed on the market or put into service (Art. 18).

3. Ensure transparency of the AI system, so that deployers can interpret its output and use it appropriately (Art. 13).

4. Design the high-risk AI system so that it can be effectively overseen by natural persons (Art. 14).

5. Meet the requirements regarding the accuracy, robustness, and cybersecurity of the system (Art. 15).

6. Register the AI system in the EU database for high-risk AI systems (Art. 49).

Although the AI Act provisions concerning high-risk AI systems that are medical devices subject to conformity assessment by notified bodies will apply only from 2 August 2027[1], preparing to meet them is a longer process that must begin earlier.


Certification under the AI Act and MDR/IVDR

The AI Act requires that the provider of a high-risk AI system that is also a medical device follow the appropriate conformity assessment procedure required under the MDR or IVDR. This procedure should also cover verification of the requirements for AI systems arising from the AI Act. In practice, this can be difficult, because notified bodies must be authorized to assess conformity under both the MDR/IVDR and the AI Act[2].

Many of the requirements provided by the AI Act are already covered by conformity assessment procedures under the MDR and IVDR, which are carried out by notified bodies. The methodologies of the MDR and AI Act regarding the obligations that the medical device manufacturer or high-risk AI system provider must fulfill are similar—both regulations focus on similar stages of the product life cycle and regulate these obligations in similar ways.

The MDR and the AI Act will overlap in particular in key areas such as risk management, quality management systems, technical documentation, transparency and information for users, human oversight, and post-market surveillance.


The AI Act adds another layer of requirements, on top of the GDPR, MDR, and IVDR, for medical devices (including software) that use AI. Ensuring compliance with these regulatory obligations requires comprehensive legal analysis as early as the design stage of an AI-based solution, as well as ongoing monitoring at later stages.


Contact

Michał Pietrzyk – radca prawny (Attorney-at-law) | Senior Associate in the Transactional Team, the German Desk Team, the IP/IT Team and the Competition and Consumer Protection Team.


[1] The second case described, namely AI systems listed in Annex III to the AI Act, will be subject to the provisions of the AI Act starting from 2 August 2026, when those provisions begin to apply.
[2] Article 43(3) of the AI Act: In the case of high-risk AI systems falling within the scope of Union harmonization legislation listed in Annex I, Section A, the provider shall follow the appropriate conformity assessment procedure required under those legal acts. For such high-risk AI systems, the requirements set out in Section 2 of this Chapter shall apply and form part of that assessment. The provisions of Annex VII, points 4.3, 4.4, 4.5, and the fifth subparagraph of point 4.6 shall also apply. For the purpose of that assessment, notified bodies designated in accordance with those legal acts shall be entitled to carry out conformity checks of high-risk AI systems with the requirements laid down in Section 2, provided that their compliance with the requirements laid down in Article 31(4), (5), (10), and (11) has been assessed in the context of the notification procedure provided for in those legal acts.