
CHIME23: Key Takeaways on AI and Security from Health IT Leaders


Shakeeb Akhter, senior vice president and chief digital and information officer at Children’s Hospital of Philadelphia, represented the provider perspective on the panel. He emphasized that his organization talks about “augmented intelligence” instead of artificial intelligence because it goes over better with clinicians.

“AI isn’t a replacement but a co-pilot that helps clinicians do things a little faster and easier while removing some of the nonclinical work nurses have to do, such as prior authorizations, faxing and searching,” he said. “How can we take some of that work off their plate? That’s 10 to 20 percent of a clinician’s day, every single day. Operational efficiency is No. 1.”

He noted that clinicians spend a lot of time searching for information such as details about medical procedures and protocols. Generative AI can help clinicians with document discovery, searching and synthesis in those cases.
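For illustration, here is a minimal sketch of the document-discovery piece: ranking protocol documents against a clinician's query using TF-IDF similarity. The documents, query and scikit-learn approach are assumptions for the example, not details from the panel; a real assistant would layer generative summarization on top of whatever retrieval the organization already uses.

```python
# Minimal sketch of document discovery over protocol text.
# The documents and query are illustrative placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

protocol_docs = {
    "central-line-care": "Flush the central line with saline before and after each use ...",
    "sepsis-screening": "Screen patients for sepsis using temperature, heart rate, and lactate ...",
    "prior-auth-workflow": "Submit prior authorization requests with supporting clinical notes ...",
}

vectorizer = TfidfVectorizer(stop_words="english")
doc_ids = list(protocol_docs)
doc_matrix = vectorizer.fit_transform(protocol_docs.values())

def search(query: str, top_k: int = 2) -> list[tuple[str, float]]:
    """Rank protocol documents by similarity to the clinician's query."""
    query_vec = vectorizer.transform([query])
    scores = cosine_similarity(query_vec, doc_matrix).ravel()
    ranked = sorted(zip(doc_ids, scores), key=lambda pair: pair[1], reverse=True)
    return ranked[:top_k]

print(search("how do I flush a central line"))
```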

Panelists said that they foresee generative AI having the biggest impact on customer service operations, rote documentation, hyperpersonalization and communication between providers and payers.

Children’s Hospital of Philadelphia decided not to begin its generative AI journey in the clinical space, focusing instead on operations and administrative staff.

“Don’t start in a high-risk area. We wanted to start, and most health systems want to start, with a low-risk, high-impact use case,” said Akhter, adding that the selection process should be no different than for any other technology. However, it’s likely that there will be special considerations during deployment that require experimentation.

Nole said that healthcare should stay away from using generative AI to make clinical diagnoses. Adoption of the technology is happening rapidly, and she said she has been surprised at how organizations are using governance to address the challenges. She explained that some worry automated clinical documentation could lead physicians to sign off on notes without actually reviewing them; one preventive measure is to monitor how long a clinician stays in the note before signing.
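A minimal sketch of that time-in-note check, assuming hypothetical audit-log fields for when a note was opened and signed; the threshold is illustrative, not a clinical standard:

```python
# Sketch: flag sign-offs where the clinician spent very little time in an
# AI-drafted note before signing. Field names (opened_at, signed_at) are
# hypothetical stand-ins for whatever the EHR audit log actually records.
from datetime import datetime, timedelta

MIN_REVIEW_TIME = timedelta(seconds=45)  # illustrative threshold, not a clinical standard

def flag_rubber_stamps(audit_events: list[dict]) -> list[dict]:
    """Return sign-off events where time in the note was below the threshold."""
    flagged = []
    for event in audit_events:
        opened = datetime.fromisoformat(event["opened_at"])
        signed = datetime.fromisoformat(event["signed_at"])
        if signed - opened < MIN_REVIEW_TIME:
            flagged.append(event)
    return flagged

events = [
    {"note_id": "n1", "opened_at": "2023-11-10T09:00:00", "signed_at": "2023-11-10T09:00:20"},
    {"note_id": "n2", "opened_at": "2023-11-10T09:05:00", "signed_at": "2023-11-10T09:08:30"},
]
print(flag_rubber_stamps(events))  # only n1 is flagged
```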


EXPLORE: Learn three keys to success with a generative AI platform.

To evaluate generative AI use cases, Bathina suggested asking six questions (a rough checklist version is sketched after the list):

  1. Does the use case tackle a prioritized pain for the health system?
  2. Is the use case worth it financially?
  3. What kind of training and resources are needed to drive adoption?
  4. Is the health system or payer ready to drive change management?
  5. Are regulations and compliance in place?
  6. Is it ethical, or is there bias?
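A rough checklist encoding of those six questions might look like the following; the questions come from the panel, but the data structure and pass/fail framing are assumptions for illustration:

```python
# Illustrative encoding of the six evaluation questions as a checklist.
from dataclasses import dataclass, fields

@dataclass
class UseCaseReview:
    tackles_prioritized_pain: bool
    financially_worthwhile: bool
    training_and_resources_identified: bool
    ready_for_change_management: bool
    regulations_and_compliance_in_place: bool
    ethical_and_bias_reviewed: bool

    def unresolved_questions(self) -> list[str]:
        """List the questions this use case has not yet answered affirmatively."""
        return [f.name for f in fields(self) if not getattr(self, f.name)]

review = UseCaseReview(
    tackles_prioritized_pain=True,
    financially_worthwhile=True,
    training_and_resources_identified=False,
    ready_for_change_management=True,
    regulations_and_compliance_in_place=True,
    ethical_and_bias_reviewed=False,
)
print(review.unresolved_questions())
# ['training_and_resources_identified', 'ethical_and_bias_reviewed']
```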

No matter the use case, Akhter said, organizations must consider safety and cybersecurity factors in addition to governance. An important aspect of governance is quality: the quality of algorithms degrades over time, especially if they are not maintained. Organizations must consider the maintenance cycle, check for bias and not trust outputs blindly.

“In healthcare in particular, you always need to have a human in the loop. Folks think it will be automated end-to-end, but that’s likely not the case, especially when it comes to clinical care,” Akhter said. “We need the right guardrails.”
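To illustrate the maintenance point, here is a minimal sketch of drift monitoring: compare a model's recent performance against its validation baseline and flag when the gap exceeds a tolerance. The AUC metric, figures and threshold are assumptions, not details from the panel.

```python
# Sketch: compare a model's recent performance against its validation baseline
# and flag when degradation exceeds a tolerance. Metric choice and thresholds
# are illustrative assumptions.
def check_for_drift(baseline_auc: float, recent_auc: float, tolerance: float = 0.05) -> bool:
    """Return True if recent performance has slipped more than `tolerance` below baseline."""
    return (baseline_auc - recent_auc) > tolerance

baseline = 0.87  # AUC recorded at validation/deployment time
monthly_auc = {"2023-08": 0.86, "2023-09": 0.84, "2023-10": 0.79}

for month, auc in monthly_auc.items():
    if check_for_drift(baseline, auc):
        print(f"{month}: review model, AUC {auc:.2f} vs baseline {baseline:.2f}")
```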

Training Is Needed to Support Generative AI Adoption in Healthcare

During the conference, HealthTech spoke with Dr. Zafar Chaudry, senior vice president and chief digital and information officer at Seattle Children’s, about his perspective on generative AI. He said that healthcare leaders need to be careful about how it’s rolled out and plan education and training for their staffs. It’s important that users understand how to prompt the tool to get the answers that they need. Chaudry also emphasized the importance of guarding protected health information.
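One simple precaution in the spirit of Chaudry's point about guarding PHI is to scrub obvious identifiers before a prompt leaves the organization. The regex patterns below are illustrative only and nowhere near a complete de-identification approach, which calls for vetted tooling and human review.

```python
# Sketch: strip a few obvious identifier patterns from text before it is used
# in a prompt to an external generative AI service. Patterns are illustrative.
import re

PHI_PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),             # US Social Security numbers
    (re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"), "[PHONE]"),      # phone numbers
    (re.compile(r"\bMRN[:#]?\s*\d+\b", re.IGNORECASE), "[MRN]"),  # medical record numbers
]

def scrub(text: str) -> str:
    """Replace obvious identifier patterns before building a prompt."""
    for pattern, placeholder in PHI_PATTERNS:
        text = pattern.sub(placeholder, text)
    return text

note = "Patient MRN: 4482913, callback 206-555-0142, reports improved symptoms."
print(scrub(note))
# "Patient [MRN], callback [PHONE], reports improved symptoms."
```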


The next step for an organization looking to implement generative AI is to find use cases that make sense for them. Seattle Children’s created a methodology to address AI use cases. First, AI training is mandatory for everyone. Employees must take a course once a year. The organization also implemented a new use case policy, which involved the creation of an AI policy committee made up of clinicians and nonclinicians that vets proposed use cases.

While Chaudry believes that generative AI can help create workflow efficiencies for clinicians and speed up research, he doesn’t think it will ever fully replace clinicians because “people need people.”

“Generative AI is a tool. Use it and apply your level of intelligence to it and it will speed things up for you, but taking it at face value is risky, especially if you’re doing a clinical care use case,” Chaudry said, adding that healthcare professionals have the expertise to apply context and common sense to generative AI tools for them to work properly.

One of the biggest challenges in healthcare today is the cost of delivering care, and Chaudry said that this is why healthcare organizations are looking at AI. He said that he sees a real opportunity for the technology to handle rote tasks so that clinicians and other healthcare employees can bring real value.

READ MORE: Discover important security considerations for embracing AI.

Healthcare Makes Cybersecurity Strides as New Threats Arise

Cybersecurity is a full-time initiative in healthcare. As threats increase, health IT leaders must constantly assess risks and vulnerabilities and identify secure solutions. The rise of ransomware attacks on healthcare is especially alarming: one health system had to close due in part to the financial strain of a ransomware attack.


Ransomware and other cyberthreats have become so commonplace, UNC Health CISO Dee Young said, that they are now kitchen table topics. Health IT leaders no longer need to explain the risks to healthcare stakeholders; the focus is now on mitigating those risks.

“I’m hopeful. I feel like we’re making some good progress with regulation and legislation. However, the threats don’t stop,” she said. “I think most of healthcare is still trying to handle the day-to-day and the threats that we have now.”

In the longer term, Young said, she hopes that healthcare can make strides in medical device security thanks to recent regulations such as the Protecting and Transforming Cyber Health Care Act, which was signed into law at the end of 2022 as part of the 2023 Consolidated Appropriations Act.

The act requires that vendors of medical devices that connect to the internet and could be vulnerable to cyberthreats monitor the devices for cybersecurity vulnerabilities, develop processes to keep the devices secure, make patches available, and comply with U.S. Food and Drug Administration requirements and regulations.
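As a rough illustration of the monitoring piece, the sketch below checks a device inventory for firmware that lags the latest patched release. The device records and version scheme are assumptions; real programs also track CVEs, vendor advisories and the FDA's premarket and postmarket expectations.

```python
# Sketch: flag connected medical devices whose installed firmware is older
# than the latest patched release on record. All data here is illustrative.
devices = [
    {"id": "pump-017", "model": "InfusionX", "firmware": "2.3.1"},
    {"id": "monitor-042", "model": "CardioView", "firmware": "5.0.0"},
]
latest_patched = {"InfusionX": "2.4.0", "CardioView": "5.0.0"}

def needs_patch(device: dict) -> bool:
    """Compare installed firmware against the latest patched release for the model."""
    installed = tuple(int(p) for p in device["firmware"].split("."))
    latest = tuple(int(p) for p in latest_patched[device["model"]].split("."))
    return installed < latest

for device in devices:
    if needs_patch(device):
        print(f"{device['id']} ({device['model']}) is behind: {device['firmware']}")
```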


