Medicine
Digital Competence
ID wann-ist-eine-ki-anwendung-ein-medizinprodukt
Chapter 3.5
Medical Device & AI
The classification of AI applications in healthcare as medical devices depends on their specific intended use. While generic language models like ChatGPT are not considered medical devices, specialized applications developed for medical decision-making can be classified as such. Careful regulatory consideration is necessary to ensure the safety and quality of medical care.
Written by: Frank Stratmann
Update from Jun 25, 2025
Language Models in Healthcare: When is an AI Application Considered a Medical Device?
This article is based on the assessment of whether a generator for medical reports that uses a language model like GPTx or Llama 3 should be classified as a medical device. This article does not constitute legal advice. If you need such expertise, feel free to contact me.
After developing an AI-powered tool for the automated creation of medical reports from simple handwritten notes, regulatory questions arose during implementation. The application was developed by a chief radiologist in collaboration with the IT department to optimize the documentation process and improve the quality of medical reports. The central need was clearly to overcome the various language barriers and to translate fragmented notes into coherent official German medical language.
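The documentation workflow described above can be sketched as a minimal prompt pipeline. Everything below (function names, prompt wording, the stand-in model) is a hypothetical illustration of the general approach, not the clinic's actual implementation.

```python
def build_report_prompt(notes: str, target_language: str = "German") -> str:
    """Assemble an instruction prompt asking a language model to turn
    fragmented clinical notes into a coherent medical report."""
    return (
        f"You are assisting with medical documentation. Rewrite the following "
        f"fragmented notes as a coherent medical report in formal {target_language}. "
        f"Do not add findings that are not present in the notes.\n\n"
        f"Notes:\n{notes}"
    )

def generate_report(notes: str, llm) -> str:
    """`llm` is any callable mapping a prompt string to generated text,
    e.g. a wrapper around GPTx or Llama 3."""
    return llm(build_report_prompt(notes))

# Stand-in model for demonstration; a real deployment would call an LLM API.
echo_model = lambda prompt: "[generated report based on prompt]"
print(generate_report("Pat. 54y, thorax CT, no infiltrates", echo_model))
```

The point of the sketch is that the pipeline itself is generic; it is the intended purpose (producing reports that feed clinical decisions) that triggers the regulatory questions discussed below.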
The inquiry → touches on a core regulatory issue of AI applications in healthcare: a seemingly generic technology like a language model can suddenly become a medical device in a specific application context. This distinction is not trivial and requires the differentiated analysis I elaborate below.
With this entry in our compendium, we take up the challenge of working toward the cultural acceptance of hybrid media, that is, content created in shared authorship between humans and machines, in healthcare processes.
The Legal Definition of Medical Devices
According to the EU Medical Device Regulation (MDR), software is considered a medical device if it provides information that is used for “decisions for diagnostic or therapeutic purposes” or is used for “diagnosis, prevention, monitoring, prognosis, treatment or mitigation of disease” →. The determining factors here are the intended purpose defined by the manufacturer and the actual function of the software in a medical context.
Software as a Medical Device (SaMD)
“Software as a Medical Device” (SaMD) refers to standalone software that is a medical device in its own right without being part of another medical device →. This definition is important for distinguishing between general-purpose applications and medical tools.
Why ChatGPT and Llama 3 are Not Medical Devices Themselves
Classification as a medical device requires an explicit medical intended purpose defined by the manufacturer. This is not the case with general language models, which are designed for a wide variety of applications →.
The German states' joint working group on medical devices (AGMP) has taken a clear position on this:
ChatGPT is not a medical device.
The reasoning states that a digital language model, without a specific medical purpose, merely “assembles narratives based on probabilities without comprehending them,” which does not lead to the applicability of the MDR →. This view also corresponds to the legal opinion of the European Court of Justice →.
The manufacturers of these generic AI models do not claim that their products were developed for medical purposes. →
No such statements can be found on the OpenAI website or elsewhere from the manufacturer. The only statement to be found there is that ChatGPT is designed to generate new text from a wide range of available online text sources.
Why a Medical Report Application Can Be a Medical Device
It is different with specialized applications that are built on these base models but explicitly developed for medical purposes. The medical report generator described in the inquiry meets several criteria that are relevant for classification as a medical device:
Specific Medical Purpose: The application is specifically designed for creating medical reports that contain medical information and are used for further patient treatment →.
Support for Clinical Decisions: Medical reports serve to transmit examination results, diagnoses, and therapy recommendations, which feed into diagnostic and therapeutic decisions →.
Professional Medical Context: The application is not used by patients for self-research but by medical professionals in everyday clinical practice.
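The three criteria above can be read as a simple decision checklist. The sketch below is only an illustrative encoding of that reasoning (all names and attributes are invented for this example); actual qualification as a medical device requires a formal regulatory assessment, not a boolean check.

```python
from dataclasses import dataclass

@dataclass
class AiApplication:
    # Illustrative attributes mirroring the criteria discussed in the text.
    name: str
    has_medical_intended_purpose: bool   # manufacturer defines a medical purpose
    informs_clinical_decisions: bool     # output feeds diagnosis/therapy decisions
    used_by_professionals: bool          # deployed in a clinical context

def likely_samd(app: AiApplication) -> bool:
    """Rough heuristic: all three criteria together point toward
    'software as a medical device' under the MDR."""
    return (app.has_medical_intended_purpose
            and app.informs_clinical_decisions
            and app.used_by_professionals)

chatgpt = AiApplication("generic chat model", False, False, False)
report_tool = AiApplication("medical report generator", True, True, True)
print(likely_samd(chatgpt), likely_samd(report_tool))  # False True
```

The contrast between the two instances mirrors the article's argument: the base model fails every criterion, while the specialized report generator built on top of it satisfies all three.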
From a Generic Model to a Specific Medical Device
The decisive difference lies in the context and intended purpose.
It makes a difference whether a language model supports a professional assessment, which the clinician can still critically review, or whether users consult ChatGPT themselves and pose their questions on their own responsibility.
AI systems that are used in healthcare and are relevant for medical decisions are classified as high-risk AI systems under the EU AI Act →. Medical devices with AI functionality regularly fall into this category, since devices of MDR risk class IIa and above require conformity assessment by a third party.
Legal and Practical Consequences for the Medical Report Tool
If the developed tool is classified as a medical device, specific regulatory requirements arise:
Conformity Assessment Procedure: Depending on the risk class, an appropriate procedure must be carried out.
Quality Management System: The MDR requires manufacturers to establish a risk and quality management system →.
Technical Documentation: Comprehensive documentation of the development, functionality, and risk assessment is required.
Clinical Evaluation: Proof of the product's clinical performance and safety must be provided.
Recommendations
The question of whether an AI application in the medical field constitutes a medical device critically depends on its specific intended purpose and application context.
While generic language models like GPTx or Llama 3 are not medical devices themselves, specialized applications built on them for medical decision-making processes can indeed be classified as such.
For the described medical report generator, this means that careful regulatory consideration is necessary. The developers should:
Conduct a detailed risk analysis
Obtain regulatory expertise early
Examine the system for compliance with MDR requirements
Implement an appropriate quality management system
Language models and AI offer enormous potential in healthcare, for example in diagnosis and information provision →. However, balancing innovation and regulation remains a challenge. Ultimately, ensuring patient safety and the quality of healthcare remains paramount.
Literature Cited and Consulted
Attorney Demands Classification of ChatGPT as a Medical Device
Quick Check for Medical Devices Under the AI Regulation - Osborne Clarke
Electronic Medical Report (eMedReport) | opta data - Telematics Infrastructure
ChatGPT as a Medical Device According to EU Law? - bayoocare
ChatGPT is not a Medical Device | PZ - Pharmaceutical Journal
Software as a Medical Device (SaMD): Definition and Classification
AI Act: Guide for Medical Device Manufacturers under MDR (2025)
AI-based Medical Devices: MDR vs AI Regulation | reuschlaw News
PDF Practice Information: Electronic Medical Report - Applications in the TI
Medical Device - Definition, Criteria for Classification, Risk Classes
The links embedded in the text as numbered footnotes stand on their own: the referenced information can be found directly on the page each link leads to. We have chosen this practice for the sake of the page's clarity. The numbers do not follow the usual sequence because new sources are continuously incorporated as the page is revised.