Imaging AI in Practice: A Demonstration of Future Workflow Using Integration Standards
Abstract
Artificial intelligence (AI) tools are rapidly being developed for radiology and other clinical areas. These tools have the potential to dramatically change clinical practice; however, for these tools to be usable and function as intended, they must be integrated into existing radiology systems. In a collaborative effort between the Radiological Society of North America, radiologists, and imaging-focused vendors, the Imaging AI in Practice (IAIP) demonstrations were developed to show how AI tools can generate, consume, and present results throughout the radiology workflow in a simulated clinical environment. The IAIP demonstrations highlight the critical importance of semantic and interoperability standards, as well as orchestration profiles for successful clinical integration of radiology AI tools.
Keywords: Computer Applications-General (Informatics), Technology Assessment
© RSNA, 2021
Summary
A demonstration successfully embedded artificial intelligence (AI) tools at different points throughout the radiology workflow in a simulated clinical environment using semantic and interoperability standards. These standards should be required in all requests for proposals and contracts with radiology systems and AI application vendors.
Key Points
■ A collaborative effort between the Radiological Society of North America, radiologists, and imaging-focused vendors resulted in the successful demonstration of integrating artificial intelligence (AI) tools into the radiology workflow in a simulated clinical environment.
■ Interoperability standards and orchestration profiles, including Digital Imaging and Communications in Medicine, Health Level Seven version 2, Fast Healthcare Interoperability Resources, Integrating the Healthcare Enterprise, Standardized Operational Log of Events, and common data elements, are necessary for successful integration of AI tools into the radiology workflow.
■ Interoperability standards should be required in all requests for proposals and contracts with vendors for radiology systems and AI tools.
Motivation for Demonstration
Artificial intelligence (AI) holds the promise of improving efficiency and quality of care in diagnostic radiology (1). AI tools are being developed to aid diagnosis and enhance processes at multiple points in the radiology workflow (2). To function in the clinical arena, AI applications must integrate with existing radiology systems (3). Standards-based application programming interfaces (APIs) between systems are required to enable seamless, in-context communication of data and tasks. Defining these APIs is best accomplished as a collaborative, iterative process. Modern development methods, which are based on defining use cases and rapidly deploying and testing prototype applications, can enhance and accelerate the development of API standards specifications.
The Radiological Society of North America (RSNA) Imaging AI in Practice (IAIP) demonstration was designed to foster collaboration among developers of radiology systems and AI applications. RSNA coordinated radiologists, who defined the use cases to be addressed, and vendors of radiology systems and AI tools, who showcased their ability to generate, consume, and present AI results effectively within the clinical workflow in a simulated clinical environment. This demonstration provided a deadline and a target outcome to drive the needed multidisciplinary effort. In addition to collaborative engineering, adoption of AI in radiology will require appropriate dissemination of knowledge to radiologists, other clinical professionals, and patients, all of whom will need to learn about the benefits and potential pitfalls of AI technology, as well as the ways in which it will impact clinical practice.
The purpose of the IAIP demonstration was not to validate the efficacy of the AI tools presented, but to show how such tools can be integrated into the radiology workflow in a simulated clinical environment. These demonstrations serve as an educational instrument to raise awareness in the radiology community of emerging uses of AI throughout the imaging chain. The demonstrations highlight the integration standards required to embed applications into clinical systems, encourage industry collaboration and the use of interface standards, and give a glimpse into the future of radiology.
Clinical Scenario
An example clinical scenario was designed to highlight the ways in which AI can impact multiple steps along the imaging life cycle. In this scenario, a patient with symptoms of acute stroke presents to the emergency department, initiating a chain of events demonstrating points of integration for AI: (a) protocoling the ordered head and neurovascular CT imaging, (b) clinical decision support for detection of urgent findings, (c) worklist priority adjustment via AI results, and (d) reducing turnaround time through worklist prioritization and semiautomated structured reporting. At CT, pulmonary and thyroid nodules are detected, highlighting AI detection of incidental findings with insertion of appropriate recommendations within reports, facilitating appropriate follow-up and patient tracking. The story goes beyond acute clinical care, demonstrating how AI can assist in the consent process and in capturing data for future research. As imaging had to be stitched together from multiple patients to create this scenario, we named our patient “Frank N. Study.”
Demonstration Team
The RSNA Radiology Informatics Committee (RIC) formulated the IAIP demonstration premise in October 2018. A project manager (T.M.S.S.), who oversaw the development and execution of the demonstration, was selected in January 2020. This individual recruited 16 vendors, each of which paid a participation fee. Vendors committed to attending meetings and to preparing and integrating products for the final demonstration. Vendors were split into three teams. In August 2020, eight attending radiologists and radiology trainees were named as volunteer clinical champions.
The interplay among the RSNA RIC, the project manager, vendors, and clinical champions was truly synergistic. Vendors shared their domain knowledge about particular products and applications, including many not yet available on the market, and benefited from the expertise and insight of the clinical champions, whose feedback motivated product enhancements and improved implementation design. All became well versed in the standards necessary for clinical integration.
Demonstrated Integration Standards
To maximize clinical utility and reduce cost and implementation time, the IAIP goals included the use of both semantic and interoperability standards. The specific standards used are listed in Tables 1 and 2 (3–6). Workflow diagrams in Figures 1 and 2 walk through the standards used in our clinical scenario. A requisition created in the electronic health record (EHR) for acute stroke imaging is passed to the radiology information system via Fast Healthcare Interoperability Resources (FHIR). The order is then processed by the Integrating the Healthcare Enterprise (IHE) Order Filler profile and passed to the imaging modality via the Digital Imaging and Communications in Medicine (DICOM) Modality Worklist. Images are acquired and transmitted with associated metadata via DICOM to the clinical image viewer. A report is generated in the reporting system and then passed back to the EHR via FHIR. AI results are integrated into the image viewer as images and textual annotations via DICOM, and into the structured radiology report as structured text data in the form of common data elements (CDEs). Thus, the integration standards demonstrated operate on a variety of pixel and nonpixel data, including structured data from the EHR and DICOM metadata.
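As a concrete illustration of the first link in this chain, the minimal sketch below posts an imaging order as a FHIR R4 ServiceRequest using Python and the requests library. This is not code from the demonstration; the FHIR endpoint, patient identifier, and procedure coding are hypothetical placeholders.

```python
# Minimal sketch of placing an imaging order as a FHIR R4 ServiceRequest.
# The server URL, patient reference, and procedure coding are hypothetical.
import requests

FHIR_BASE = "https://fhir.example.org/r4"  # hypothetical EHR/RIS FHIR endpoint

service_request = {
    "resourceType": "ServiceRequest",
    "status": "active",
    "intent": "order",
    "priority": "stat",                                 # acute stroke workup
    "subject": {"reference": "Patient/frank-n-study"},  # hypothetical patient ID
    "code": {
        "coding": [{
            "system": "http://example.org/radiology-procedures",  # placeholder code system
            "code": "CT-HEAD-NECK-CTA",
            "display": "CT head without contrast and CT angiography head and neck",
        }]
    },
    "reasonCode": [{"text": "Acute stroke symptoms"}],
}

resp = requests.post(
    f"{FHIR_BASE}/ServiceRequest",
    json=service_request,
    headers={"Content-Type": "application/fhir+json"},
    timeout=10,
)
resp.raise_for_status()
print("Created order with id:", resp.json().get("id"))
```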

Figure 1: Image acquisition workflow diagram. 1, An order is placed in the electronic health record (EHR) and transmitted via Health Level Seven version 2 or Fast Healthcare Interoperability Resources to the radiology information system (RIS) using the Integrating the Healthcare Enterprise Order Filler profile. 2, The exact procedure may be refined or protocolled in the RIS prior to 3, being communicated to the modality via Digital Imaging and Communications in Medicine (DICOM) Modality Worklist. 4, The images are acquired by the scanner and sent with associated metadata in DICOM format to the image server, which may then 5, forward the DICOM images to an AI orchestrator. 6, The AI orchestrator then de-identifies the images if necessary and sends them to the appropriate AI algorithm(s). 7, The AI algorithms process the images and return the AI results to the AI orchestrator, which then 8, associates the AI results with the appropriate imaging study and sends the results back to the image server. 9, 10, The image server will forward the DICOM studies and AI results to the image viewer, where 11, the studies and associated AI results are reviewed, and a report is generated. AI = artificial intelligence, PACS = picture archiving and communication system, RSNA = Radiological Society of North America, VNA = vendor-neutral archive.
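As an illustration of step 3, the following sketch performs a DICOM Modality Worklist query using the pynetdicom library. The RIS host, port, and application entity titles are hypothetical, and the query and return keys are simplified relative to a production worklist client.

```python
# Minimal sketch of a DICOM Modality Worklist C-FIND query (hypothetical RIS and AE titles).
from pydicom.dataset import Dataset
from pynetdicom import AE
from pynetdicom.sop_class import ModalityWorklistInformationFind

# Query identifier: matching keys plus empty return keys we want back.
query = Dataset()
query.PatientName = ""
query.PatientID = ""
query.AccessionNumber = ""
query.ScheduledProcedureStepSequence = [Dataset()]
step = query.ScheduledProcedureStepSequence[0]
step.Modality = "CT"
step.ScheduledStationAETitle = "CT_SCANNER_1"  # hypothetical scanner AE title

ae = AE(ae_title="CT_SCANNER_1")
ae.add_requested_context(ModalityWorklistInformationFind)

# Hypothetical RIS host and port.
assoc = ae.associate("ris.example.org", 104)
if assoc.is_established:
    for status, identifier in assoc.send_c_find(query, ModalityWorklistInformationFind):
        # 0xFF00 / 0xFF01 are "pending" statuses that carry a matching worklist entry.
        if status and status.Status in (0xFF00, 0xFF01) and identifier is not None:
            print(identifier.get("PatientName", ""), identifier.get("AccessionNumber", ""))
    assoc.release()
```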

Figure 2: Postacquisition workflow. 12, 13, The image server and image viewer may log activities via the Integrating the Healthcare Enterprise Standardized Operational Log of Events profile for monitoring and business analytics. 14, Consent for future research may be obtained and stored in a Fast Healthcare Interoperability Resources (FHIR) database. 15, A cohort identification tool can then query the FHIR database to establish a research cohort and confirm consent, then 16, query the image server for related Digital Imaging and Communications in Medicine (DICOM) studies, which may subsequently 17, be de-identified and sent to an image research server with graphics processing unit (GPU)–accelerated computing capabilities for training future AI algorithms. PACS = picture archiving and communication system, VNA = vendor-neutral archive.
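As an illustration of step 15, the sketch below checks a FHIR server for an active Consent resource before a study is selected for research export. The server URL and patient identifier are hypothetical, and the consent logic is deliberately simplified.

```python
# Minimal sketch: confirm research consent via a FHIR Consent search before export.
# The FHIR endpoint and patient identifier are hypothetical.
import requests

FHIR_BASE = "https://fhir.example.org/r4"

def has_research_consent(patient_id: str) -> bool:
    """Return True if the patient has at least one active Consent resource on file."""
    resp = requests.get(
        f"{FHIR_BASE}/Consent",
        params={"patient": f"Patient/{patient_id}", "status": "active"},
        headers={"Accept": "application/fhir+json"},
        timeout=10,
    )
    resp.raise_for_status()
    bundle = resp.json()
    return bool(bundle.get("entry"))

if has_research_consent("frank-n-study"):
    print("Consent on file; study eligible for de-identification and research export")
```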
Adapting to the COVID-19 Pandemic
Planning for the demonstration began in January 2020 and continued remotely over the summer with distributed planning and Internet-based testing among vendors. The 16 vendors were to convene in Chicago, Illinois, in September 2020 for system testing, with systems to be shipped to McCormick Place for setup on the Friday before the RSNA 2020 Annual Meeting. In May 2020, it was announced that RSNA 2020 would be held virtually due to the COVID-19 pandemic. IAIP participants pivoted to create demonstration videos for online consumption while continuing to implement and test AI tool integration. To address Internet security concerns, one company volunteered to host several application virtual machines to support integrations in a simulated, virtual environment. Although some companies chose to provide video clips of their tool integration, most chose to proceed with testing over the Internet to experience the benefit of integrating within the clinical workflow.
Radiologists and trainees from several institutions volunteered to serve as clinical champions and to develop narrative scripts for each demonstration. An introductory video was created to set the scene. Clinical champions provided invaluable insight into defining data flow and determining when and where CDEs should be added; they suggested changes to user interfaces and provided guidance on unbiased interpretation, as well as pragmatic modifications of AI results. Clinical relevance was emphasized throughout the demonstration. In September 2020, three team recording sessions were held, generating approximately 30 hours of video, edited to approximately 15 minutes per video. The four IAIP 2020 videos depicting the integrated systems are freely available for viewing on the RSNA website (7), as well as on RSNA's YouTube channel (https://www.youtube.com/user/RSNAtube) (8).
Showcase Results Dissemination
Although the demonstration was presented to RSNA attendees as a set of online videos, integration of the 26 products involved was achieved through iterative remote testing. These systems were able to perform the functions needed to support a clinical scenario that included imaging examination ordering, protocoling, acquisition, display, interpretation, reporting, and follow-up recommendations. During the online-only RSNA 2020 Annual Meeting, the demonstration videos were made accessible to meeting registrants through the AI Showcase component of the online meeting. The IAIP demonstration videos were viewed by 850 distinct attendees during the meeting week, on par with informatics demonstrations at previous in-person RSNA meetings. The video format allowed for persistent access and broader exposure. The four demonstration videos were made available as enduring materials through RSNA's YouTube channel (8), receiving a total of 4780 views as of June 1, 2021.
Lessons Learned
Perhaps the most important lesson learned in development of the IAIP demonstrations was the need for consistent and coordinated interoperability standards and orchestration profiles. Whereas many imaging-focused AI vendors expected to work exclusively with DICOM objects, many of the reporting-focused vendors expected to work with FHIR resources. At the time of the demonstration, the IHE profiles for AI Workflow for Imaging (AIW-I) and AI Results (AIR) were in development but not yet published, so there was no consensus for conversion between DICOM objects and FHIR resources in this context. Given time constraints, only initial steps toward integration of AI results into reporting systems were achieved. Continued development of these integrations is a goal for future work in the 2021 RSNA IAIP demonstrations.
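For illustration, one possible mapping from a simplified AI measurement (of the kind carried in a DICOM structured report) to a FHIR Observation is sketched below. This is not the mapping defined by the IHE profiles; the input field names, algorithm name, and identifiers are hypothetical.

```python
# Illustrative only: one possible DICOM-style AI measurement mapped to a FHIR Observation.
# The input dictionary stands in for a parsed SR content item; all values are placeholders.
from datetime import datetime, timezone

ai_measurement = {
    "study_instance_uid": "1.2.840.99999.1.2.3.4",        # placeholder UID
    "concept": "Largest pulmonary nodule diameter",
    "value": 7.2,
    "unit": "mm",
    "algorithm": "NoduleDetect v1.3",                      # hypothetical algorithm name
}

observation = {
    "resourceType": "Observation",
    "status": "preliminary",                               # pending radiologist review
    "code": {"text": ai_measurement["concept"]},
    "valueQuantity": {
        "value": ai_measurement["value"],
        "unit": ai_measurement["unit"],
        "system": "http://unitsofmeasure.org",
        "code": "mm",
    },
    "effectiveDateTime": datetime.now(timezone.utc).isoformat(),
    "device": {"display": ai_measurement["algorithm"]},
    "derivedFrom": [{
        "identifier": {
            "system": "urn:dicom:uid",
            "value": f"urn:oid:{ai_measurement['study_instance_uid']}",
        }
    }],
}
```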
In addition to consensus for conversion between DICOM and FHIR resources, use of the IHE AIW-I and AIR profiles will afford automation of data transfers, increasing the efficiency and scalability of workflows to greater numbers of studies and more complex examinations. Although none of the AI use cases included in these demonstrations were designed to compare results across multiple studies obtained at multiple points in time, the use of metadata for workflow orchestration in the AIW-I profile should facilitate comparison with prior imaging and prior AI results. Given the metadata inputs required for these profiles to function, standardized naming of study types and acquired series will facilitate these workflows and the identification of prior studies for comparison.
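As a sketch of the kind of study-name normalization described above, the following hypothetical mapping collapses local study descriptions into canonical names so that prior studies can be matched for comparison. The mapping table and descriptions are illustrative only; a production system would map to a shared procedure lexicon rather than a hand-written dictionary.

```python
# Hedged sketch: normalize locally named study descriptions so orchestration logic
# can match a current study to its priors. Mapping entries are hypothetical.
LOCAL_TO_CANONICAL = {
    "CT HEAD WO": "CT Head without contrast",
    "CT BRAIN W/O CONTRAST": "CT Head without contrast",
    "CTA HEAD NECK": "CT Angiography Head and Neck",
}

def canonical_study_name(local_description: str) -> str:
    """Return a canonical study name, falling back to the raw description."""
    return LOCAL_TO_CANONICAL.get(local_description.strip().upper(), local_description)

def find_priors(current_description: str, prior_studies: list[dict]) -> list[dict]:
    """Select prior studies whose canonical name matches the current study."""
    target = canonical_study_name(current_description)
    return [s for s in prior_studies if canonical_study_name(s["description"]) == target]

# Example: both local descriptions resolve to the same canonical study type.
priors = [{"description": "CT BRAIN W/O CONTRAST", "date": "2019-05-01"}]
print(find_priors("CT HEAD WO", priors))
```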
Another key lesson learned was the need for radiologist interaction with AI results. Even when AI results are not optimal, the mechanism of integration into a structured report can still provide efficiency gains if the results can be adjusted before generating structured data. A corollary to this lesson was the realization that some CDEs may be generated at different points in the clinical workflow and by different AI tools. For example, intracranial hemorrhage may be excluded at non–contrast-enhanced head CT, while presence and location of large vessel occlusion may not be determined until CT angiography is performed. This presents a challenge for integration in identifying the most accurate results for each CDE across multiple AI models and/or radiology reports, so implementation with radiologist input is essential.
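The review step described above can be illustrated with a minimal sketch in which an AI-proposed CDE value remains provisional until a radiologist accepts or corrects it. The CDE identifier, data structure, and values are hypothetical and do not follow any specific registry format.

```python
# Hedged sketch: an AI-proposed CDE value is finalized only after radiologist review.
from dataclasses import dataclass
from typing import Optional

@dataclass
class CDEValue:
    cde_id: str          # registry-style identifier (hypothetical here)
    name: str
    value: str
    source: str          # "AI" or "radiologist"
    status: str = "proposed"

def radiologist_review(proposed: CDEValue, accepted: bool,
                       corrected_value: Optional[str] = None) -> CDEValue:
    """Keep the AI value if accepted; otherwise record the radiologist's correction."""
    if accepted:
        return CDEValue(proposed.cde_id, proposed.name, proposed.value, "AI", "final")
    return CDEValue(proposed.cde_id, proposed.name, corrected_value or proposed.value,
                    "radiologist", "final")

# Example: the CT angiography AI proposes a large vessel occlusion location,
# which the radiologist corrects before the CDE enters the structured report.
proposed = CDEValue("CDE-LVO-LOCATION", "Large vessel occlusion location", "Left M1", "AI")
final = radiologist_review(proposed, accepted=False, corrected_value="Left M2")
print(final)
```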
Furthermore, the capture of the interaction between the radiologist and the AI results is important information for feedback on AI tool performance and future model development. This point is emphasized in the postacquisition workflow (Fig 2) portion of the demonstrations, where these data are fed back into the research server.
Collaboration of radiologists and vendors proved essential in the fast-paced development environment used for the IAIP showcase, resulting in clinically relevant tools that were also technically feasible. Multiple tasks throughout the radiology workflow can be augmented by AI tools, but for such applications to be adopted clinically, they must be embedded seamlessly into the workflow. Integration standards and orchestration frameworks proved key to the demonstration's success; it is recommended that they be included in any request for proposals or contract with imaging system vendors.
Acknowledgments
The authors would like to thank the members of the RSNA IAIP Demonstration Team: Adam Flanders, MD, Charles Kahn, MD, Marta Heilbrun, MD, Tarik Alkasab, MD, PhD, George Shih, MD, Krishna Juluru, MD, Ali Dhanaliwala, MD, PhD, Madhavi Duvvuri, MD, Shaan Sadiq, MD, Tessa Cook, MD, PhD, Matt Morgan, MD, Ken Wang, MD, and Jamie Dulkowski. We would also like to thank the vendors who contributed to the demonstrations: 3M M-Modal, Ambra Health, Annalise.ai, Blackford Analysis, Fovia AI, GE Healthcare, Lunit, Medo AI, Nuance, NVIDIA, Philips Healthcare, Rad AI, RADLogics, Siemens Healthineers, Smart Reporting, and Visage Imaging.
Author Contributions
Author contributions: Guarantors of integrity of entire study, W.F.W., T.M.S.S., K.P.A.; study concepts/study design or data acquisition or data analysis/interpretation, all authors; manuscript drafting or manuscript revision for important intellectual content, all authors; approval of final version of submitted manuscript, all authors; agrees to ensure any questions related to the work are appropriately resolved, all authors; literature research, W.F.W., K.M., T.M.S.S., S.D.O., M.D.K., K.P.A.; clinical studies, S.D.O.; experimental studies, T.M.S.S., S.D.O., K.P.A.; statistical analysis, C.D.C.; and manuscript editing, all authors.
* W.F.W. and K.M. contributed equally to this work.
References
- 1. The present and future of deep learning in radiology. Eur J Radiol 2019;114:14–24.
- 2. A roadmap for foundational research on artificial intelligence in medical imaging: from the 2018 NIH/RSNA/ACR/The Academy Workshop. Radiology 2019;291(3):781–791.
- 3. Bending the artificial intelligence curve for radiology: informatics tools from ACR and RSNA. J Am Coll Radiol 2019;16(10):1464–1470.
- 4. Use of radiology procedure codes in health care: the need for standardization and structure. RadioGraphics 2017;37(4):1099–1110.
- 5. IHE Radiology Technical Framework Supplement: AI Workflow for Imaging (AIW-I). Integrating the Healthcare Enterprise. https://www.ihe.net/uploadedFiles/Documents/Radiology/IHE_RAD_Suppl_AIW-I.pdf. Updated August 6, 2020. Accessed June 14, 2021.
- 6. IHE Radiology Technical Framework Supplement: AI Results (AIR). Integrating the Healthcare Enterprise. https://www.ihe.net/uploadedFiles/Documents/Radiology/IHE_RAD_Suppl_AIR.pdf. Updated August 6, 2020. Accessed June 14, 2021.
- 7. Imaging AI in Practice Demonstration. Radiological Society of North America. https://www.rsna.org/education/ai-resources-and-training/ai-imaging-in-practice. Published 2020. Accessed June 14, 2021.
- 8. RSNA. Imaging AI in Practice Demonstrations. YouTube. https://www.youtube.com/playlist?list=PLEUiLXWVNND1TGgjsnjZybcOkGhotFy2U. Published 2020. Accessed June 14, 2021.
Article History
Received: June 14 2021
Revision requested: July 27 2021
Revision received: Sept 14 2021
Accepted: Oct 12 2021
Published online: Oct 27 2021