Summary: This blog is an excerpt from the presentation Dr Indrajeet Das, Consultant Cardiothoracic Radiologist at University Hospitals of Leicester NHS Trust, gave at UKIO 2022. The presentation is based on an ongoing study evaluating Qure.ai's qXR solution to determine the role of artificial intelligence in lung cancer pathways. It focuses on the role of an AI-enabled chest X-ray solution in augmenting lung cancer care and diagnosis pathways.
Lung cancer is the leading cause of cancer mortality and the third most common cancer in the UK. It is estimated that approximately 47,000 people in the UK are diagnosed with lung cancer each year. What is striking is that more than a third of those patients are diagnosed after presenting as an emergency. Think about that: the first time you may receive a cancer diagnosis is when you come in as an emergency admission through the Accident and Emergency department. It is pretty frightening, actually, when you think about it. Even more unfortunate is that almost nine in ten patients who present as an emergency already have late-stage cancer, typically stage three or four. In fact, outcomes for emergency presentations are so poor that a study done at our hospital found that patients who presented as an emergency were five times more likely to die within a year than those referred through primary care.
Many of these patients would have undergone a chest X-ray at some point in their diagnostic pathway. The primary reason lung cancer patients present at a late stage is that it is challenging to arrive at a clinical diagnosis early on, as most patients do not present with the typical red flags. Instead, they may have quite nonspecific symptoms, such as a chronic cough. The catch is that the best-proven interventions improve outcomes only when the cancer is diagnosed early.
The National Optimal Lung Cancer Pathway was drafted in 2017 to overcome these challenges and the high mortality rate. The aim was to radically accelerate the diagnosis and treatment of lung cancer so that both happened at a much earlier stage.
Before the National Optimal Lung Cancer Pathway came into being, the interval from referral to treatment typically ran to 62 days. With the new pathway in place, clinicians and radiographers were urged to report any suspicious chest X-ray, be it routine or urgent, ideally within 24 hours. The pathway was further optimised so that, where possible, a CT is done before the patient leaves the hospital and reported within 72 hours of the original X-ray. On the other side of the pathway, hospital referrals have a CT within 24 hours and are then referred to the cancer team. We are more familiar with this pathway, which is already happening in most centres around the country. The primary reason I emphasise the terms 'routine' and 'urgent' is that sometimes you don't know whether something is urgent until you've seen the X-ray. And I think this is where the value of AI potentially lies.
Now, all this talk of rapid pathways, chest X-ray to CT to clinic within 24 hours, and direct biopsy leading to quick turnaround times is great for the patient. It is also incredibly aspirational, because we face an extreme shortage of radiologists in this country: we have one of the lowest numbers of radiologists per 100,000 people in the European Union, and a lot of the work we do has to be outsourced. For instance, 60% of the X-rays at our trust are outsourced. Ideally, you want to keep the abnormal X-rays in-house because you need to identify and manage them early. So, if we are to pick a group to outsource, we would ideally want to outsource the ones most likely to be normal, to prevent any delays in care. The obvious thought, then, is to introduce an AI tool that could help us triage the abnormal X-rays from the normal ones, so that even if we didn't have the resources to report every X-ray in-house, at least we would pick the 40% or 50% that are the priority. However, it is not good enough simply to have an imaging AI that flags abnormal X-rays. Unless the CTs are performed and reported with equal momentum, and the patients are reviewed and assessed quickly, it will not make a difference. In some ways it makes things worse: you have a patient with an abnormal X-ray but no pathway through which to triage them quickly. They must continue waiting, as there is no robust structure to adhere to. It is essentially a car without wheels. Without a pathway to serve it, there is no point in having a rapid diagnostic tool like AI.
The earlier pathway sent an abnormal chest X-ray back to the referring GP, who would then refer the patient to the two-week-wait pathway; the patient would be reviewed by a respiratory physician or have a chest CT, and eventually have an outcome. To transform this, many trusts around the country, including ours, started work on something called the straight-to-CT pathway. In this pathway, chest X-rays with suspected lung cancer are given a code, which goes to a group of coordinators who contact the patient for an early triage CT. But we went one step further, very much inspired by the Manchester rapid model: we decided it was not good enough to have the CT performed on a typical list, so we set up early-morning slots, between eight and nine, Monday to Friday, to prioritise these patients. This ensured that patients get the CT done, see a triage nurse, get assessed quickly, and have their CTs hot-reported. This prioritisation has reduced turnaround time and gets patients reviewed and evaluated in case they need elective admission or elective review. On the basis of this modified model, we established a new pathway in 2018 and audited it in 2019. Since launching the pathway, there has been a reduction in emergency presentations of about 10%, despite an increase in lung cancers being diagnosed. In relative terms, we saw a 25% reduction in emergency admissions. This works out not only as a cost saving on emergency admissions but is incredibly important for the patient pathway, where new cases are caught early rather than presenting as acute emergencies. We see these examples in real practice, where patients whose chest X-ray shows a bronchus that is blocked or about to be blocked are caught early enough for a surgeon to place a stent, or are assessed early on. We know that if they had waited a few more days, they could have presented as an acute emergency.
The straight-to-CT pathway enabled us to identify these patients early on, and from an AI point of view, it gave us some excellent data. We now had a code that could be easily audited. We had CTs performed within a few days of the chest X-ray, which could serve as ground truth to validate an AI software that flags abnormal X-rays. We also had clearly defined metrics: time to report, time to scan, time to diagnosis. All of this was enough ground to test an AI algorithm and see whether we could finally report routine or urgent chest X-rays with suspected lung cancer within 24 hours, a target we were still not reaching.
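To illustrate the kind of audit these metrics enable, here is a minimal sketch, assuming hypothetical field names rather than the trust's actual audit records, of computing time-to-report and time-to-scan from event timestamps and checking them against the 24-hour target:

```python
from datetime import datetime

# Hypothetical audit records for coded X-ray episodes; the field names
# are illustrative, not a real RIS export schema.
episodes = [
    {
        "id": "XR-001",
        "xray_done": datetime(2019, 3, 4, 9, 15),
        "xray_reported": datetime(2019, 3, 4, 16, 40),
        "ct_done": datetime(2019, 3, 6, 8, 30),
    },
]

for ep in episodes:
    time_to_report = ep["xray_reported"] - ep["xray_done"]
    time_to_scan = ep["ct_done"] - ep["xray_done"]
    # The pathway target: the suspicious X-ray reported within 24 hours.
    met_target = time_to_report.total_seconds() <= 24 * 3600
    print(ep["id"], time_to_report, time_to_scan, "24h target met:", met_target)
```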
The next thing we did was create a validation set. We picked two weeks of consecutive GP X-rays that were referred through our pathway for an urgent CT. We applied the same code to determine whether we could rely on an AI to surface the cancers first, and whether that would save time to report. We worked with another commercial vendor and got some excellent results, which we presented at the British Thoracic Oncology Group meeting just before the pandemic started in early 2020. But we felt we needed further work, because we had only looked at a small set: it was a small outcome study, looking only at the abnormal X-rays we had coded, and more work was needed to gain trust in the software. Then COVID struck, and the project was on hiatus for a while. After that, we felt that to take this to the next level we needed a real-world prospective validation trial with a randomisation arm and clearly defined, measurable outcomes that were difficult to argue against in terms of improvement.
And then we met Qure. I met them through RSNA, and we realised they were a commercial company with a growing reputation. The important thing was that they were willing to participate in a research study, even though they are a commercial company. They were willing to take part in an externally funded study, through NHS England and SBRI, which reduced the risk of getting involved with a commercial vendor. Moreover, Qure had already contacted some crucial national lung cancer leads and stakeholders. So, for us, it was an easy decision to partner with them. Qure's footprint was also very admirable: they have many validation studies across several countries and several products, and the qXR product in particular has been very well regarded. From what I have read and heard, it provides a comprehensive solution, not just for cancer: it covers a range of diagnoses, triaging and diagnosing X-rays with high accuracy.
We did a validation with Qure's product using the same two weeks we had used previously, but this time we decided to look at everything. We divided every single X-ray into normal and abnormal, with set definitions for each; abnormal could include, for example, a rib fracture or a pneumothorax. I got an ST5 and an ST2, and we spent months collating the data, looking at the report issued by the radiologist and comparing it with the Qure DICOM read. They would cross-check for correlation, and anything that needed arbitration came to me, and I arbitrated it. From that, we got true positives and false positives. The results we got for abnormality measures were pretty impressive. The critical thing was that it was a 100% retrospective analysis, and the AI solution proved to be a good enough prioritisation tool. As I mentioned earlier, even if about a third of chest X-rays are abnormal and the AI overcalls by 10%, we would be happy to pick the right 40% with its help.
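For readers unfamiliar with how such a comparison is scored, here is a minimal sketch, with invented labels rather than the study's actual data, of deriving true/false positives and the usual triage metrics from paired radiologist and AI reads:

```python
# Score AI reads against the radiologist report, treated as ground truth.
# The labels below are illustrative only.
def score(ground_truth, ai_reads):
    tp = fp = tn = fn = 0
    for truth, ai in zip(ground_truth, ai_reads):
        if ai == "abnormal":
            tp += truth == "abnormal"  # AI flagged a genuinely abnormal film
            fp += truth == "normal"    # AI overcalled a normal film
        else:
            fn += truth == "abnormal"  # AI missed an abnormal film
            tn += truth == "normal"    # AI correctly left a normal film alone
    sensitivity = tp / (tp + fn)  # share of abnormal X-rays the AI caught
    specificity = tn / (tn + fp)  # share of normal X-rays correctly cleared
    return {"tp": tp, "fp": fp, "tn": tn, "fn": fn,
            "sensitivity": sensitivity, "specificity": specificity}

truth = ["abnormal", "normal", "abnormal", "normal", "normal"]
ai    = ["abnormal", "normal", "normal", "abnormal", "normal"]
print(score(truth, ai))
```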
Next, we looked specifically at lung cancer. We had 10 lung cancers in that group of 1,448, of which nine were histologically confirmed and one was radiologically confirmed with no biopsy. qXR correctly identified eight of them, a sensitivity of 80% on this small sample, and missed two, which were very subtle. Barring some report delays, most were reported on the same day.
On-the-ground feedback from radiologists on the value of AI
We surveyed some of our radiology colleagues who are mainly involved in cardiac and chest imaging. What is interesting is that, despite being dedicated chest radiologists, only about half of us have any time in our job plan dedicated explicitly to reporting chest X-rays. Of those who do, only a few have as much as two hours of reporting time a week dedicated to X-rays. So even though we are specialists in the field, the cross-sectional work, MDTs, and acute hub biopsies take up so much of our time in practice that a lot of the reports we do are mainly checks of the registrars' work.
There is an opportunity here. In the one or two hours we get, we are overwhelmed by the sheer number of X-rays left for reporting. Do I look at the GP X-rays? Do I look at the CDU, A&E, and so on? Clarity does affect productivity: when there are a lot of unreported X-rays, it affects your focus. By the time you allocate an X-ray to the team, there is a chance you have missed the boat, because allocation is a manual process. We need a system that flags abnormal X-rays based on where they come through; it could be colour-coded. Our acute work already happens through a different RIS workflow: we report our acute CT scans through an auto-populated window linked to our primary RIS, which lists every CT being done acutely in the trust, inpatient or otherwise, and we report them on an ad hoc basis. We could dramatically improve our radiologists' workflow if we had a similar system, with an AI tool prioritising things in some of these areas, as sketched below. I believe that, just through better pathways, education, and good use of our reporting radiographer team, we can demonstrate increased productivity within the limits of existing resources.
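As a loose illustration of the prioritised worklist described above, here is a sketch under assumed data structures, not the trust's actual RIS, showing how an AI abnormality flag and referral source could drive reporting order:

```python
# Illustrative sketch: order an unreported-X-ray worklist so AI-flagged
# abnormal studies from high-priority sources surface first. Field names
# and priority weights are assumptions, not a real RIS schema.
SOURCE_PRIORITY = {"A&E": 0, "CDU": 1, "GP": 2, "inpatient": 3}

worklist = [
    {"id": "XR-101", "source": "GP", "ai_flag": "abnormal"},
    {"id": "XR-102", "source": "A&E", "ai_flag": "normal"},
    {"id": "XR-103", "source": "GP", "ai_flag": "normal"},
    {"id": "XR-104", "source": "A&E", "ai_flag": "abnormal"},
]

# Sort abnormal-first, then by referral source priority.
worklist.sort(key=lambda s: (s["ai_flag"] != "abnormal",
                             SOURCE_PRIORITY.get(s["source"], 99)))
for study in worklist:
    print(study["id"], study["source"], study["ai_flag"])
```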
Once we had validated Qure.ai's solution, partnering with them as part of the SBRI grant seemed an excellent opportunity. As the grant was multicentre, many top leaders in the field were involved. The purpose of the partnership was to look at that 24-hour pathway: getting a chest X-ray, routine or urgent, read within 24 hours with the help of AI, and as a result, hopefully, reducing the time to diagnose lung cancer.
We were fortunate to partner with Qure.ai, winners of a significant SBRI grant, to help establish this. We hope the benefits of this pathway will go beyond the clinical and extend to cost-effectiveness and the patient experience. We hope the software will significantly improve not just the turnaround time for CTs but also the quality and safety of reports, with a reduction in misses, and help change the workflow of radiology departments. We hope AI can bring in a more productive workflow management system and significantly improve early detection and treatment for patients.
To summarise, lung cancer has a poor outcome primarily because it is diagnosed late compared with other cancers, but AI has the potential to impact its detection and diagnosis. Streamlining back-end pathways, such as the straight-to-CT pathway and the daily triage pathway, is essential to getting the maximum impact from a given AI solution; I cannot emphasise that enough. Early diagnosis can lead to earlier treatment and earlier pickup of cancer, but as we demonstrated in Leicester, a reduction in emergency presentations is equally important.