AI & the Future of Medical Writing

Perhaps the most talked-about buzzword of 2025 so far is “artificial intelligence.” With the rise of ChatGPT, I’m often approached by non-medical writers for my opinion on how these AI tools will affect the future of medical writing as we know it. It’s a question I’ve been asked every year since becoming a medical writer almost a decade ago, and one I now hear more and more often.

Many of these inquirers are unaware that my first foray into medical writing was through a tiny, NYC-based artificial intelligence tech startup. As a result, I have never been able to consider the field of medical writing outside the lens of artificial intelligence.

Trends Across Medical Device Regulations

Medical devices follow technology trends, which means many medical devices now incorporate software. This may take the form of diagnostic, monitoring, or therapeutic software, among others. These devices must meet specialized regulatory requirements, such as MDCG 2019-11 under the EU MDR and EU IVDR, and the FDA’s guidance on Software as a Medical Device (SaMD). Some medical devices are now being 3D printed, which brings its own set of special regulations and guidelines.

Machine Learning & Drug Discovery

In the world of pharmacology, many manufacturers are looking to incorporate algorithms to identify new drug candidates more quickly, or to repurpose existing ones. We’ve also seen cases where machine learning has been used to tailor treatments to individual patients. Our prediction for 2025 is an uptick in adaptive learning systems being incorporated into drug discovery and individualized patient therapy.

Tools for Medical Writers

No, I don’t think AI will replace human writers in 2025, but I do think medical writers who refuse to incorporate AI technology into their repertoire will be outpaced by those who do. After all, I’ve used reference software like EndNote and Zotero for well over a decade, and it makes me a more efficient writer: my documentation and manuscripts are better organized, and documents that require regular updates come with a cleaner update process to hand off to the client. Ultimately, our goal is for the client to no longer need our services and to be self-sufficient in their regulatory update process after we leave.

Besides reference software, I’ve also loved incorporating Distiller into my systematic literature review process to become more efficient and stay organized, from producing more accurate PRISMA tables to organizing data extracted from peer-reviewed publications. Other generative AI chatbots, such as Copilot and ChatGPT, could be useful for automating data extraction, though more on this below.

Predicted Challenges for Incorporating AI & Medical Writing

For those at the forefront of incorporating AI into medical writing, however, many challenges lie ahead. The most glaring concern is cybersecurity, across several fronts. From a product perspective, any time a device connects to the internet, it presents a new safety concern: the network becomes penetrable, both from an individual device operability standpoint and from a data security and integrity standpoint. This is why the EU and FDA impose additional regulatory requirements whenever a device incorporates software.

Speaking of data quality: almost any time data is submitted to a generative AI platform like Copilot or ChatGPT, the input is stored. That’s because machine learning uses this information as “training data” to refine its algorithms, with the end goal of getting “smarter” over time. However, it also means that if you submit confidential data to any of these platforms, they may store those data and potentially incorporate them into answers given to other users, breaking confidentiality and even compliance. This is why ChatGPT and Copilot should never be used with confidential data without a specific enterprise license. Some companies have also built bespoke AI tools to better automate their processes while still keeping confidential information secure.

That being said, I’ve used several of these AI tools for documentation tasks such as data extraction, for example, extracting safety and performance outcomes from peer-reviewed publications or creating tables from provided performance endpoints, and I’ve found that many of these tools miss incredibly important points required for a successful evaluation, such as serious adverse events. Moreover, from a purely writing standpoint, a generative AI tool’s goal is to appear more human, but this often results in wordy, garden-path sentences that take too long to get to the point. The goal of successful documentation, whether a peer-reviewed publication, white paper, clinical evaluation, or clinical study report, is succinct language. This holds for technical audiences, such as a notified body, and for more general audiences, such as readers of a white paper or a Summary of Safety and Performance.

Based on my decade-plus experience in research, medical writing, and artificial intelligence, it’s my personal and professional opinion that many of these platforms unfortunately just aren’t there yet. So when a client does provide AI-based tools, I still incorporate them into my process, but sparingly and cautiously, with lots of checks and balances from me, the human they’re ultimately learning from. Also, many platforms using AI buzzwords are really just offering automated products, not products that incorporate actual machine learning, which can create confusion among those of us with prior tech backgrounds.

AI & Minola Scientific

Given our start in tech, where does Minola Scientific stand in the world of AI?

At Minola Scientific, we’ve already worked with medical devices and other products that incorporate software, and we’re seasoned in using both automated and AI tools within our writing processes. We hope AI products can streamline the more tedious elements of medical writing (e.g., data extraction and table building) to allow more time for the activities we love (e.g., reviewing data and exciting new literature, writing about benefit-risks). We’ll continue to use these automated and AI-based tools across our business in 2025.

Dr. Brynne DiMenichi

Dr. Brynne is the owner and CEO of Minola Scientific. She received her PhD in Neuroscience from Rutgers University in 2018 and has over a decade of experience across clinical, academic, and non-profit industries. In 2019, she “officially” became a medical writer and has been in love with the field ever since. In 2022, she decided to pursue consulting full-time in order to help teams needing support achieve their clinical research goals.
