Medicine
Digital Literacy
ID BTBLGR-CMP-22
Chapter 3.5
Language Models in Healthcare: When is an AI Application a Medical Device?
Medical Devices
A doctor's letter generator that uses AI models can qualify as a medical device if it is developed specifically for a medical purpose. Generic language models such as ChatGPT, by contrast, are not medical devices, because they lack an intended medical purpose. Specialized applications therefore require a careful regulatory examination to ensure the safety and quality of medical care.
Written by: Frank Stratmann
Updated Apr 14, 2025
This article is guided by the question of whether a doctor's letter generator that uses a language model such as GPTx or Llama 3 must be classified as a medical device. It does not constitute legal advice. If you need the relevant expertise, feel free to write to me.
After an AI-based tool for automatically creating doctor's letters from simple handwritten notes had been developed, regulatory questions arose during implementation. The application was built by a chief radiologist in collaboration with the IT department to streamline the documentation process and improve the quality of doctor's letters. The central need was to overcome language barriers and to turn fragmented notes into doctor's letters written in coherent, formal German.
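To make the setup concrete, here is a minimal sketch of such a pipeline, assuming an OpenAI-style chat-completion client in Python; the model name, the prompt wording, and the draft_doctors_letter helper are illustrative assumptions, not the clinic's actual implementation.

```python
# Minimal sketch of a doctor's letter pipeline (illustrative only).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You are a clinical documentation assistant. Rewrite the physician's "
    "fragmented notes as a coherent, formal German doctor's letter "
    "(Arztbrief). Do not add findings that are not present in the notes."
)

def draft_doctors_letter(raw_notes: str) -> str:
    """Turn fragmented clinical notes into a draft letter for physician review."""
    response = client.chat.completions.create(
        model="gpt-4o",  # assumed model; any capable chat model would do
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": raw_notes},
        ],
        temperature=0.2,  # low temperature keeps the draft close to the notes
    )
    return response.choices[0].message.content

# Every draft is reviewed and signed off by the treating physician.
```

The sketch illustrates the core of the regulatory question: the base model is generic, and it is the prompt, the wrapper, and the clinical workflow around it that create the medical context.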
The inquiry touches not only on an important regulatory core issue of AI applications in healthcare: a generic technology such as a language model suddenly becomes a medical device in a specific application context. This distinction is not trivial and requires a nuanced view, which I outline below.
With this entry in our compendium, we are also exploring the cultural acceptance of media created through hybrid or shared authorship between humans and machines in healthcare.
The legal definition of medical devices
According to the EU Medical Device Regulation (MDR), software is considered a medical device if it provides information that is used for »decisions for diagnostic or therapeutic purposes« or if it serves the »diagnosis, prevention, monitoring, prediction, prognosis, treatment, or alleviation of diseases«. The decisive factors are the manufacturer's intended purpose and the actual function of the software in the medical context.
Software as a Medical Device (SaMD)
»Software as a Medical Device« (SaMD) refers to standalone software that is itself a medical device but not part of another medical device. This definition is important for distinguishing general applications from medical tools.
Why ChatGPT and Llama 3 themselves are not medical devices
Classification as a medical device requires an explicit medical intended purpose defined by the manufacturer. This is not the case for general-purpose language models, which are designed for a wide variety of applications.
The German federal states' joint Working Group for Medical Devices (AGMP) has made this clear:
ChatGPT is not a medical device.
In its explanation, the working group states that a digital language model without a specific medical intended purpose merely »assembles stories based on probabilities without understanding them content-wise«, which means the MDR does not apply. This view also aligns with the legal position of the European Court of Justice.
The manufacturers of these generic AI models do not state that their products were developed for medical purposes. No such statement can be found on the OpenAI website or elsewhere; there, one finds only the statement that ChatGPT was designed to generate new texts from the wide variety of texts available on the internet.
Why a doctor's letter application can be a medical device
It is different for specialized applications that build on these base models but are explicitly developed for medical purposes. The doctor's letter generator described in the inquiry fulfills several criteria relevant for classification as a medical device (a simplified decision sketch follows the list):
Specific medical intended purpose: The application is designed specifically for creating doctor's letters that contain medical information and are used in the further treatment of patients.
Support for clinical decisions: Doctor's letters convey examination results, diagnoses, and therapy recommendations, which feed into diagnostic and therapeutic decisions.
Professional medical context: The application is used not by patients for self-research but by medical professionals in everyday clinical practice.
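Expressed as code, the qualification question reduces to two steps: does the manufacturer state a medical intended purpose, and does the output inform clinical decisions? The following Python fragment is a deliberately simplified, illustrative sketch of that logic, not a substitute for the formal qualification guidance (e.g., MDCG 2019-11); all names are assumptions.

```python
from dataclasses import dataclass

@dataclass
class SoftwareProfile:
    """Simplified, illustrative attributes relevant to MDR qualification."""
    intended_medical_purpose: bool    # explicitly stated by the manufacturer
    informs_clinical_decisions: bool  # output feeds diagnosis or therapy

def qualifies_as_medical_device(sw: SoftwareProfile) -> bool:
    # Step 1: without a medical intended purpose there is no medical
    # device -- this is the ChatGPT / Llama 3 case.
    if not sw.intended_medical_purpose:
        return False
    # Step 2: software whose information is used for diagnostic or
    # therapeutic decisions meets the MDR's definition of software
    # as a medical device.
    return sw.informs_clinical_decisions

generic_llm = SoftwareProfile(intended_medical_purpose=False,
                              informs_clinical_decisions=False)
letter_tool = SoftwareProfile(intended_medical_purpose=True,
                              informs_clinical_decisions=True)

assert not qualifies_as_medical_device(generic_llm)
assert qualifies_as_medical_device(letter_tool)
```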
From generic model to specific medical device
The critical difference lies in the context and the intended purpose.
It makes a difference whether a language model is used to produce a professional assessment, which can in principle also be critically questioned, or whether users query ChatGPT themselves and ask their questions there on their own responsibility.
AI systems that are used in healthcare and are relevant to medical decisions are even classified as high-risk AI systems under the EU AI Act. Medical devices with AI functionality regularly fall into this category, because devices from MDR risk class IIa upward require a conformity assessment by a third party.
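As a rough rule of thumb (a deliberate simplification of the AI Act's product-safety route to high-risk status; the class names and the helper function are assumptions for illustration):

```python
# Rough mapping from MDR risk class to AI Act high-risk status: devices
# that need third-party conformity assessment (class IIa and above) are
# high-risk AI systems. Note: some class I sub-types (sterile, measuring,
# reusable surgical) also involve a notified body; omitted for simplicity.
NOTIFIED_BODY_CLASSES = {"IIa", "IIb", "III"}

def is_high_risk_ai_system(mdr_risk_class: str) -> bool:
    return mdr_risk_class.strip() in NOTIFIED_BODY_CLASSES

for risk_class in ("I", "IIa", "IIb", "III"):
    status = "high-risk" if is_high_risk_ai_system(risk_class) else "not high-risk"
    print(f"MDR class {risk_class}: {status} under the AI Act")
```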
Legal and practical consequences for the doctor's letter tool
If the developed tool is to be classified as a medical device, specific regulatory requirements arise:
Conformity assessment procedure: Depending on the risk class, an appropriate procedure must be performed.
Quality management system: The MDR obliges manufacturers to establish a risk and quality management system.
Technical documentation: Comprehensive documentation of development, functionality, and risk assessment is required.
Clinical evaluation: Proof of the clinical performance and safety of the product.
Recommendations
Whether an AI application in the medical field constitutes a medical device depends crucially on its specific intended purpose and application context.
While generic language models such as GPTx or Llama 3 are not medical devices themselves, specialized applications based on them that are designed for medical decision-making processes can indeed be classified as such.
For the described doctor's letter generator, this means that careful regulatory consideration is necessary. Developers should:
Conduct a detailed risk analysis
Seek regulatory expertise early on
Examine the system for compliance with MDR requirements
Implement an appropriate quality management system
Language models and AI offer enormous potential for healthcare, particularly in diagnostics and information delivery. Balancing innovation and regulation remains a challenge. Ultimately, what matters is that patient safety and the quality of medical care are maintained.
Literature cited and consulted
Lawyer calls for classification of ChatGPT as a medical device
Quick Check for medical devices under the AI Regulation - Osborne Clarke
Electronic doctor's letter (eArztbrief) | opta data - Telematics infrastructure
ChatGPT is not a medical device | PZ - Pharmaceutical Journal
Software as a Medical Device (SaMD): Definition and Classification
The electronic doctor's letter | All information for practices - DIP - medatixx
AI Act: Guide for Medical Device Manufacturers according to MDR (2025)
AI-based medical devices: MDR versus AI Regulation | reuschlaw News
Practice Information (PDF): Electronic doctor's letter - Applications in TI
Medical device - Definition, classification criteria, risk classes
The upcoming development of large language models in medicine
ChatGPT is a medical device - Open letter to supervisory authorities