
As artificial intelligence (AI) technology continues to evolve, it has the potential to transform the health care field—from streamlining workflows, reducing costs, and increasing efficiency to improving screening, refining diagnostic and treatment protocols, and achieving better outcomes.
“We must recognize that AI is here to stay,” explains Marc D. Succi, MD, associate chair of innovation and commercialization and co-director of the innovator growth division, both at Mass General Brigham, as well as program co-director of Harvard Medical School’s certificate program, Leading AI Innovation in Health Care. This means that health care leaders, visionaries, and those with leadership aspirations must stay abreast of technological advances in the industry so they can leverage AI’s benefits effectively.
And because the stakes of AI are higher in health care than in most other industries, he adds, leaders must also be prepared to help their organizations navigate the evolving technology landscape thoughtfully and responsibly.
The Value of an AI Framework
One of the best ways that leaders can guide their organizations is by developing a framework to direct the responsible adoption of AI innovation, according to Roger Daglius Dias, MD, PhD, MBA, associate professor of emergency medicine at Harvard Medical School, director of research and innovation for the STRATUS Center for Medical Simulation, and director of the Medical AI and Cognitive Engineering (MAICE) Lab. Dias also joins Succi as program co-director of HMS’s Leading AI Innovation in Health Care. He says that such a framework must ensure that all AI systems meet regulatory compliance as well as patient privacy and safety requirements.
“Health care organizations must consider the risks and unintended consequences of AI when it comes to patient care and patient safety. Therefore, clinicians need high-reliability models,” he says. “They also need a seamless experience when interacting with AI systems so they can integrate the tools into their clinical workflow without disruption. Model fairness [the risk of perpetuating bias when a system is trained on data that underrepresents certain ethnic and socioeconomic groups] is another concern.”
All these issues, if not properly addressed, can weaken trust in AI and hamper its adoption. Human factors (designing and optimizing how clinicians and patients interact with AI systems) must also be considered.
The Need to Invest in AI
“Most health care leaders are not trained in the latest technologies. They don’t need to be high-tech, but they do need to be ready to invest in AI now,” Succi says. “This means investing in security systems that address HIPAA concerns to prevent exposing patient data to AI and ensuring that new endeavors are being integrated with the work your organization has done so far.”
While leaders certainly don’t have to manage the technical side of things, they do need to be able to hire the right experts and speak a common language with the technologists, data scientists, consultants, and others involved in their AI systems so everyone is on the same page.
Leaders also need to involve clinicians in these conversations, since clinicians will often be the ones tasked with using the tools. To this end, leaders should educate their workforce about the benefits of AI and build organization-wide buy-in. One way to garner such widespread support is by aligning new AI offerings with current organizational goals.
“You need to be very clear from the leadership point of view and provide training and education on the role of AI and its value and pitfalls,” says Dias, “and also set up the environment where users, clinicians, and patients can get on board.”
A Role for Clinicians
Clinicians can play a pivotal part in supporting leadership’s effort and helping to drive AI adoption within their department and within the larger organization, stresses Samir Kendale, MD, FASA, who serves as medical director of anesthesia informatics at Beth Israel Lahey Health as well as faculty for HMS’s AI in Clinical Medicine live virtual CME course.
“To be most effective, clinicians should be able to work with both their institutional leadership and their IT departments. While it is important to have a basic understanding of AI literature and AI products to determine whether a particular solution is appropriate for a given problem, it is equally important to have a basic understanding of systems architecture,” he adds.
For clinicians with leadership aspirations, taking the time to learn about artificial intelligence carries an added bonus: that knowledge can help them advance along their career path.
Taking a Measured Approach to Integrate New Tools
While integrating AI systems throughout an entire organization can feel overwhelming, Succi stresses that it’s important to take a measured approach to introducing new tools in a hospital or other health care setting.
“If you have a new AI tool in your hospital, deploy it in a shadow mode first so it’s available to run in the background to collect data that allows you to assess its performance in the context of your patients,” he says. “You can also deploy the tool in focus groups in small steps with clinicians and nurses to get their feedback before pushing it across the organization.” By introducing new tools in this way, leaders can evaluate how they work before using them more broadly.
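The shadow-mode pattern Succi describes can be sketched in code. The sketch below is illustrative only: the `ShadowDeployment` class, the toy risk model, and the feature names are all hypothetical, and a real deployment would wrap a vendor’s inference endpoint and a governed data store rather than a CSV file. The key idea it demonstrates is that the tool scores cases in the background and logs its predictions for later evaluation, while returning nothing to the clinical workflow.

```python
import csv
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class ShadowLogEntry:
    """One background prediction, recorded for offline review only."""
    timestamp: str
    patient_id: str
    prediction: float


class ShadowDeployment:
    """Runs a candidate AI model in 'shadow mode': predictions are
    logged so they can later be compared with actual outcomes, but
    they are never surfaced to clinicians."""

    def __init__(self, model, log_path="shadow_predictions.csv"):
        self.model = model  # hypothetical model exposing .predict(features)
        self.log_path = log_path
        self.entries = []

    def observe(self, patient_id, features):
        """Score a case in the background; the care workflow is untouched."""
        score = self.model.predict(features)
        self.entries.append(ShadowLogEntry(
            timestamp=datetime.now(timezone.utc).isoformat(),
            patient_id=patient_id,
            prediction=score,
        ))
        return None  # deliberately returns nothing to the clinical workflow

    def export(self):
        """Write the logged predictions for later performance assessment."""
        with open(self.log_path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["timestamp", "patient_id", "prediction"])
            for e in self.entries:
                writer.writerow([e.timestamp, e.patient_id, e.prediction])


class ToyRiskModel:
    """Stand-in model for demonstration purposes only."""
    def predict(self, features):
        return min(1.0, 0.1 * features.get("lab_flags", 0))


shadow = ShadowDeployment(ToyRiskModel())
shadow.observe("patient-001", {"lab_flags": 3})
shadow.observe("patient-002", {"lab_flags": 7})
shadow.export()
```

Once enough shadow predictions accumulate, the exported log can be compared against recorded patient outcomes to assess the tool’s performance on the organization’s own population, which is the evaluation step Succi describes, before any clinician ever sees a score.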
Kendale notes that clinicians can help test artificial intelligence models to ensure they are serving their intended role by asking some key questions. “As a clinician, do your best to understand what is going on under the hood. What is the training data? Does it represent your own data? What is the output? Is this the output you want? Is the output considered accurate for your needs? How are the data handled? Who has ownership of any AI-generated patient data? Ask lots of questions! The more the clinician understands, the more they can communicate trust,” he says.
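One of Kendale’s questions, whether the training data represents your own patients, can be given a rough quantitative form. The sketch below compares demographic group proportions between a model’s reported training cohort and a local patient sample; the group names, the numbers, and the 10-percentage-point tolerance are illustrative assumptions, not a validated standard.

```python
def representation_gaps(training_props, local_props, tolerance=0.10):
    """Flag groups whose share of the local patient population differs
    from the model's reported training cohort by more than `tolerance`
    (an illustrative threshold, not a clinical standard)."""
    gaps = {}
    for group in set(training_props) | set(local_props):
        train_share = training_props.get(group, 0.0)
        local_share = local_props.get(group, 0.0)
        diff = local_share - train_share
        if abs(diff) > tolerance:
            gaps[group] = round(diff, 3)
    return gaps


# Hypothetical proportions for illustration only.
training = {"group_a": 0.70, "group_b": 0.20, "group_c": 0.10}
local = {"group_a": 0.45, "group_b": 0.25, "group_c": 0.30}

flagged = representation_gaps(training, local)
# group_a is underrepresented in the training cohort relative to local
# patients, and group_c is overrepresented locally; both exceed the
# tolerance, so both are flagged for follow-up questions to the vendor.
```

A check like this does not establish fairness on its own, but it turns an open-ended question into a concrete conversation with the vendor about how the model was trained and validated.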
Seek Lessons Learned from Others
When it comes to implementing an AI framework, the savviest leaders recognize that they don’t need to start from scratch. “Leaders can look for lessons learned from other organizations rather than reinventing the wheel. Really use your network and learn from other people’s successes and failures,” Dias says. He suggests that leaders look for high-impact use cases that can be applied to their own organization.
“For instance, at Mass General Brigham, the hospital takes a system approach for AI that is well structured, and they have established communication channels so that all of the units talk to each other,” he explains.
Preparing for the Future of AI
Organizations that make the effort to implement effective AI systems now will be well positioned as the technology continues to evolve.
“AI is going to make us the most efficient version of ourselves, in terms of delivering health care more efficiently and managing the administrative side,” Succi says. AI can take on some of the time-consuming tasks involved in providing medical care so that clinicians can focus more time on connecting with their patients.
Resources
Dias, Roger Daglius, MD, PhD, MBA, director of research and innovation, STRATUS Center for Medical Simulation; director and lead investigator, the Medical AI and Cognitive Engineering (MAICE) Lab; associate professor of emergency medicine at Harvard Medical School; program co-director, Harvard Medical School’s Leading AI Innovation in Health Care.
https://postgraduateeducation.hms.harvard.edu/faculty-staff/roger-daglius-dias

Kendale, Samir, MD, FASA, medical director of anesthesia informatics, Beth Israel Lahey Health; faculty, Harvard Medical School’s AI in Clinical Medicine; email interview Jan. 2025.
https://cmecatalog.hms.harvard.edu/faculty-staff/samir-kendale

Succi, Marc D., MD, associate chair of innovation and commercialization, Mass General Brigham; co-director of the innovator growth division, Mass General Brigham; program co-director, Harvard Medical School’s Leading AI Innovation in Health Care; Zoom interview Jan. 2025.
https://postgraduateeducation.hms.harvard.edu/faculty-staff/marc-d-succi