The complexity of cancer care has significantly increased over the years. What were once considered single diseases are now divided into numerous subtypes, each requiring distinct treatment plans based on evolving clinical guidelines.
THE CHALLENGE
This has created a growing challenge for oncologists, who must manage a wide variety of cancer types while also keeping up with rapidly changing best practices.
Another large challenge in oncology today is the sheer volume and complexity of evolving clinical guidelines. National organizations such as the National Comprehensive Cancer Network, American Society of Clinical Oncology, and American Cancer Society regularly update their recommendations, sometimes hundreds of times per year, based on new clinical trial data, emerging therapies and evolving treatment paradigms.
These guidelines are not always standardized across organizations, and individual cancer centers often add their own layers of expertise, making it even more difficult for clinicians to track and apply the latest best practices consistently.
At the same time, access to specialized oncologists is becoming more difficult, said Dr. Travis Zack, assistant professor of medicine at the University of California, San Francisco.
“Many regions are facing shortages of oncology specialists, forcing general practitioners to take on more responsibility for initial cancer workups and treatment planning,” he explained. “However, GPs often lack the time or specialized training to stay fully updated on the latest oncology guidelines, which can lead to inconsistencies in care and delays in treatment.
“There’s also the fundamental challenge of unstructured patient data and the time it takes to aggregate and review that information, in accordance with updated treatment guidelines, in order to make the best possible recommendations for the patient,” he continued.
Recognizing these challenges, the University of California, San Francisco, looked to develop AI technology that could automate the process of aggregating, structuring and applying the latest clinical guidelines for oncologists, along with all relevant patient information.
“The goal was to create a decision support system that could seamlessly integrate national guidelines and patient data with local institutional best practices, ensuring every patient received the most up-to-date, evidence-based care possible – without adding additional cognitive burden to already overworked clinicians,” Zack noted.
“This fundamental challenge – ensuring oncologists had quick, reliable access to up-to-date, evidence-based recommendations while optimizing physician time – led us to explore AI-driven systems that could make world-class oncology expertise more accessible, efficient and scalable across all care settings,” he added.
PROPOSAL
The AI system would combine a large language model, informed by all of the applicable national and local institutional guidelines, with transparent logic so clinicians could see precisely how and why the AI was making its recommendations.
The goal was to ensure every oncology consultation began with a complete, structured and up-to-date dataset, reducing information gaps and optimizing physician time to complete patient workups.
To achieve this, Zack explained the AI was designed with two core functions:
- Aggregating and structuring clinical data – The system pulls and organizes relevant patient information from electronic health records to create a comprehensive view of the patient’s condition. If critical data – such as biopsy results, molecular testing or staging scans – is missing, the AI flags it before the oncology consultation to prevent unnecessary delays.
- Integrating national and local clinical guidelines – The AI incorporates both standard guidelines (from sources like NCCN, ACS and ASCO) and institution-specific protocols, ensuring physicians are presented with the most relevant, up-to-date treatment recommendations tailored to the patient’s specific case.
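The missing-workup check described in the first function can be pictured as a simple lookup against guideline-required tests. The sketch below is purely illustrative – the condition names, test lists and function are invented here and are not UCSF's or Color's actual implementation.

```python
# Hypothetical sketch: flagging missing diagnostic workups before a consultation.
# The required-test lists are illustrative stand-ins for guideline rules.

REQUIRED_WORKUPS = {
    "suspected_lung_cancer": ["chest_ct", "biopsy", "molecular_testing", "pet_staging"],
    "suspected_breast_cancer": ["mammogram", "biopsy", "receptor_status"],
}

def find_missing_workups(condition: str, completed_tests: set) -> list:
    """Return guideline-required tests not yet found in the patient's record."""
    required = REQUIRED_WORKUPS.get(condition, [])
    return [test for test in required if test not in completed_tests]

# A lung cancer referral with only imaging and biopsy on file:
flags = find_missing_workups("suspected_lung_cancer", {"chest_ct", "biopsy"})
print(flags)  # ['molecular_testing', 'pet_staging']
```

In a real system the required lists would be derived from NCCN/ASCO guidance and institutional protocols rather than hard-coded, and the flags would be surfaced to the referring physician before the visit.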
“For example, if a patient is referred for suspected lung cancer, the system can automatically assess whether all necessary diagnostic steps have been taken,” Zack explained. “If a key test is missing, it prompts the referring physician to order it before the patient’s oncology visit. During the consultation, the AI then provides an evidence-based framework for decision making, reducing the cognitive burden on the physician while ensuring adherence to best practices.
“The overarching goal was not to replace human judgment but to enhance it – allowing oncologists to focus on personalized treatment decisions rather than spending valuable time retrieving and verifying information,” he added.
MEETING THE CHALLENGE
The AI technology was deployed in oncology workflows to support both general practitioners and oncologists, ensuring each step in the patient journey was guided by comprehensive, evidence-based insights.
For the study UCSF published, clinicians at health IT and clinical services company Color analyzed 100 de-identified patient cases provided by UCSF – 50 for breast cancer and 50 for colon cancer. Each case included two sets of records: diagnosis records, containing all available information up to and including the date of diagnosis, and treatment records, encompassing all records up to, but not including, the date treatment was initiated.
To evaluate the AI, Color clinicians processed these cases in two phases:
- Diagnosis run type: 100 patient cases (50 breast, 50 colon) using only records available up to the date of diagnosis.
- Treatment run type: 100 patient cases (50 breast, 50 colon) with records included up to, but not beyond, the treatment initiation date.
“A primary care physician at Color reviewed the AI-generated output and made adjustments where necessary,” Zack said. “The system’s performance was assessed by tracking the number of modifications made in three key areas: accuracy of extracted decision factors, relevance of recommended workups to the patient’s condition and completeness of relevant workups. Additionally, the study recorded the time required for the clinician to finalize each workup plan using the AI.
“The AI system was integrated with electronic health records and other medical databases to streamline access to and interpretation of patient information,” he continued. “Patient data was de-identified to protect confidentiality. The system also was integrated into various technical flows to understand and evaluate all of the updated clinical guidelines for breast and colon cancer types.”
So how did it work in practice? Like this:
- Data aggregation and structuring. Before an oncology consultation, the AI automatically compiled all relevant clinical information from the patient’s records and identified missing diagnostic steps.
- Guideline-based recommendations. At the point of care, the system provided tailored recommendations based on national guidelines and institution-specific policies.
- Continuous learning and updates. The AI dynamically incorporated the latest clinical research and guideline updates, ensuring physicians always worked with the most current evidence.
“By reducing time spent on administrative tasks and eliminating inconsistencies in care, the AI allowed oncologists to focus on patient interactions and treatment planning, with the aim of delivering faster and more effective cancer care,” Zack said.
RESULTS
The implementation of AI in oncology workflows has led to measurable improvements in efficiency and decision making. One of the most notable outcomes has been a significant reduction in the time oncologists spend reviewing patient records and clinical guidelines prior to making treatment decisions.
“Previously, this process could take one to two hours, particularly for complex cases requiring a review of extensive medical history and evolving guideline recommendations,” Zack explained. “With the AI system in place, this time has been reduced to approximately 10 to 15 minutes in most cases. By automating data aggregation and structuring relevant clinical information, the system enables oncologists to focus on decision making rather than manual data retrieval.
“Another key finding has been the high level of alignment between AI-generated recommendations and those made by oncologists,” he continued. “In a comparative study, there was a 95% concordance between the AI’s treatment recommendations and clinical decisions made by oncologists based on standard guidelines.”
This suggests the AI system is effectively synthesizing and applying national and institutional guidelines in a way that supports clinical decision making, he added. While human oversight remains essential, this level of agreement indicates the AI can serve as a reliable tool for reinforcing evidence-based care, he said.
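A concordance rate like the reported 95% is simply the fraction of cases where the AI's recommendation matched the oncologist's decision. The example below uses made-up case labels to show the calculation, not the study's actual data.

```python
# Minimal sketch of a concordance-rate calculation; the case labels are invented.

def concordance_rate(ai_recs, clinician_recs):
    """Fraction of cases where the AI recommendation matched the clinician's."""
    if len(ai_recs) != len(clinician_recs):
        raise ValueError("Each case needs both an AI and a clinician recommendation")
    matches = sum(a == c for a, c in zip(ai_recs, clinician_recs))
    return matches / len(ai_recs)

ai = ["chemo", "surgery", "chemo", "radiation"]
md = ["chemo", "surgery", "chemo", "chemo"]
print(concordance_rate(ai, md))  # 0.75
```

In the study's terms, 95% concordance over 100 cases would mean the AI and the oncologists agreed on 95 of the 100 treatment recommendations.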
“Additionally, the system has contributed to improvements in the timeliness of treatment initiation,” Zack reported. “Delays in ordering essential diagnostic tests – such as biopsies or genomic testing – can extend the time between diagnosis and treatment, sometimes by weeks or months.
“By identifying missing but necessary workups earlier in the process, the AI system has helped reduce these delays, ensuring that patients progress to treatment in a timelier manner,” he continued. “Given that early intervention is critical in oncology, this reduction in delays represents an important improvement in patient care.”
Overall, these results suggest AI can play a meaningful role in improving efficiency, standardization and timeliness in oncology care, particularly in settings where access to specialized expertise may be limited, he added.
ADVICE FOR OTHERS
For healthcare organizations looking to integrate AI into oncology or other specialties, a strategic and structured approach to implementation is essential, Zack advised.
“One of the primary considerations is ensuring the AI system has access to comprehensive and accurate patient data,” he said. “AI-driven decision support tools rely on a full dataset to generate clinically meaningful recommendations.
“However, interoperability challenges between electronic health records and other data sources can result in incomplete clinical pictures, which may affect the reliability of AI outputs,” he continued. “Addressing these gaps through effective data integration and standardization should be a priority before implementation.”
Another important factor is the balance between AI-driven recommendations and clinical judgment, he noted.
“AI should be viewed as a tool to support, rather than replace, oncologists and other healthcare providers,” he stressed. “Organizations should ensure clinicians remain actively engaged in interpreting AI-generated insights and are able to override or modify recommendations when necessary.
“To facilitate this, AI systems should provide transparent and explainable decision pathways, allowing users to understand how recommendations were generated,” he concluded. “Clear visibility into the underlying logic builds trust in AI-assisted decision making and promotes adoption among clinicians.”
Follow Bill’s HIT coverage on LinkedIn: Bill Siwicki
Email him: bsiwicki@himss.org
Healthcare IT News is a HIMSS Media publication.