Health AI Register product review: RBfracture
We’re excited to share this product review of RBfracture™, produced by Health AI Register.
In this video, Ben Madden, Lead Reporting Radiographer at University Hospitals of Northamptonshire, shares his experience as a long-term user of RBfracture. He discusses how RBfracture was implemented, how it integrates into daily workflows, and the impact it has had in clinical practice.
Ben also provides a practical demonstration of how RBfracture is used in a real clinical setting.
Technical setup:
- RBfracture™ v2.3
- GE PACS
- Hybrid deployment: local gateway and cloud analysis
Ben Madden
Lead Reporting Radiographer, Kettering General Hospital Foundation Trust
We’ve looked at an awful lot of metrics, but the key take-home metrics are that RBfracture is 94% to 95% accurate, and it’s reduced ED missed fracture rates by 94%, which is quite a staggering amount.
Transcript of the RBfracture product review
Ben Madden
Lead Reporting Radiographer, Kettering General Hospital Foundation Trust
…helps you, and you can see how quick it is to bring up the thumbnail. You can see the thumbnails. The main success is that there has been an approximate 95% reduction in missed fractures. And we’re looking at the impact in Orthopedics.
Stephan Romeyn
Health AI Register
Welcome everyone. I am Stephan Romeyn from Health AI Register, the most comprehensive overview of all the commercially available AI algorithms in radiology.
And we understood from our users that they would like to see more practical examples of how these applications are being used in clinical practice. That’s why we’re trying something new: Product review videos where we interview end users on how they are using these applications in their clinical workflow.
In this edition, we focus on RBfracture, a fracture detection tool from Danish company Radiobotics.
This is an AI application and computer-aided detection software aiming to assist healthcare professionals in reviewing trauma skeletal X-rays. The intended users for this application are healthcare professionals who review these cases in different clinical settings, including emergency care doctors, radiologists, and reporting radiographers.
The intended patient population for this application is patients from the age of two for fracture detection and effusion detection, and patients from the age of 15 for dislocations and lipohemarthrosis.
Today we welcome Ben Madden from University Hospitals of Northamptonshire. Ben has been using this product in his clinical practice for several years. In this video, together with Ben, we will go over the adoption process, his experiences, a small demo of the product, and the impact he sees in clinical practice.
So, thanks a lot for being here. Ben, can you give us a brief introduction about yourself and your role within the department?
Ben Madden
Lead Reporting Radiographer, Kettering General Hospital Foundation Trust
Hi, my name is Ben Madden. I’m the lead reporting radiographer at Kettering General Hospital, and I manage a small team of six reporting radiographers. And I deal with the onboarding of certain projects as well.
Stephan Romeyn
Health AI Register
Perfect, thank you very much. Looking forward to hearing your insights.
So to get started, why did you actually start exploring AI applications for the fracture detection use case?
Ben Madden
Lead Reporting Radiographer, Kettering General Hospital Foundation Trust
So we wanted to improve ED X-ray trauma turnaround times and we also wanted to look at reducing ED missed fracture rates.
We did look at hot reporting, but it wasn’t financially viable. So this is when we started looking at AI products that could help us out.
So it was both for the turnaround times and reducing the missed findings.
Although technically it wouldn’t reduce turnaround times, because it wouldn’t be seen as an official report, it was at least something ED could look at and start managing from, if we deemed it to be accurate enough.
Stephan Romeyn
Health AI Register
Currently we already see, I think, more than five or six applications for fracture detection on Health AI Register.
How did you select RBfracture? What were the key factors in that decision?
Ben Madden
Lead Reporting Radiographer, Kettering General Hospital Foundation Trust
There were a few trusts that already had RB on board, and we got good feedback from that. Also, when you’re googling and trying to find all these different systems, RB was always top of the list.
In discussions with RB and a few others, it was the feeling they gave us, the turnaround on our questions, and the speed at which they got back to us.
The feeling was that there was no pressure. It was very much a conversation about how we could implement this, and they were very, very good about running a trial with us to have a look and prove its accuracy.
Stephan Romeyn
Health AI Register
And I’d also like to hear about your experiences really using this product in clinical practice. So maybe you can share your screen and show us an example of how you’re using this product in your clinical workflow?
Ben Madden
Lead Reporting Radiographer, Kettering General Hospital Foundation Trust
Yeah, absolutely.
So here we see a lumbar spine X-ray, an AP projection and a lateral projection. When RB analyzes these images, it will bring back a Summary Report which we can see here.
If there’s a positive finding on the Summary Report, it will then give you a secondary capture of the image in question. And it will draw a bounding box around the area it wants you to look at in more detail.
When it’s a dashed box, it’s less sure; when it’s a solid box, it’s more confident of that finding.
So this then draws your attention to the L3 transverse process, where we can see there’s a fracture. We bring it up on the AP, and we can see there’s a fracture through that transverse process. It has missed the fracture of the transverse process of L2, but when you start looking at L3, it’s very easy then to see the L2 fracture as well. So this helps you, and you can see how quick it is to bring up the thumbnail.
You can see the thumbnails there, and you can see it’s got a big red dot on it, which draws your attention to it that you need to look harder.
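To make the dashed-versus-solid convention concrete, here is a minimal sketch of how such an overlay could be rendered from a list of findings. The field names, coordinates, confidence values, and the 0.75 threshold are all assumptions for illustration; RBfracture’s actual output format and confidence handling are not described in this interview.

```python
import matplotlib.pyplot as plt
import matplotlib.patches as patches

# Hypothetical findings: the real RBfracture output schema and thresholds are
# not public; these values are made up purely to illustrate the dashed/solid idea.
findings = [
    {"label": "L3 transverse process", "box": (120, 210, 60, 45), "confidence": 0.91},
    {"label": "possible fracture",     "box": (115, 160, 55, 40), "confidence": 0.55},
]

CONFIDENCE_THRESHOLD = 0.75  # assumed cut-off between "less sure" and "confident"

fig, ax = plt.subplots()
ax.set_title("Illustration: solid box = confident, dashed box = less sure")

for finding in findings:
    x, y, w, h = finding["box"]
    style = "-" if finding["confidence"] >= CONFIDENCE_THRESHOLD else "--"
    rect = patches.Rectangle((x, y), w, h, linewidth=2, edgecolor="red",
                             facecolor="none", linestyle=style)
    ax.add_patch(rect)
    ax.annotate(finding["label"], (x, y - 5), color="red", fontsize=8)

ax.set_xlim(0, 400)
ax.set_ylim(400, 0)  # image-style coordinates, origin at top-left
plt.show()
```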
Stephan Romeyn
Health AI Register
So in this case it really helps you in the review. How do you do that? Do you first look at the image itself and then at the AI output, or how should we see that?
Ben Madden
Lead Reporting Radiographer, Kettering General Hospital Foundation Trust
Yeah, I mean it depends how you report. But me personally, I would like to look at the X-rays first, have a good look over them. And then as some sort of comfort blanket, look at RB’s analysis and whether I need to then look further or I can move on with it as I thought.
I have a negative case here where we can see the pelvis and a lateral projection. In negative cases there is no secondary capture, only the Summary Report, which shows a greyed-out dot. So you then see: yes, it’s assessed it as normal, I don’t need to look in any more detail, I think it’s normal, and we can move on.
The information is also available in your worklist at this point. We have an old PACS and RIS system, so we can’t use the AI to triage the X-rays for us. But going forward, we will get upgraded and it will incorporate that. Then we can use it to triage, it will bring all the fractures to the top of the list, and we can go through them and report them straight away.
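Ben mentions that, after the PACS/RIS upgrade, the AI flag will be used to bring suspected fractures to the top of the worklist. As a rough sketch of that kind of triage sort (the `WorklistItem` fields and `ai_positive` flag are hypothetical, not any vendor’s actual schema):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class WorklistItem:
    accession: str
    acquired_at: datetime
    ai_positive: bool  # True if the AI flagged a suspected fracture

worklist = [
    WorklistItem("ACC001", datetime(2024, 5, 1, 8, 15), ai_positive=False),
    WorklistItem("ACC002", datetime(2024, 5, 1, 8, 40), ai_positive=True),
    WorklistItem("ACC003", datetime(2024, 5, 1, 9, 5),  ai_positive=True),
]

# Flagged studies first; within each group, oldest first so nothing waits too long.
triaged = sorted(worklist, key=lambda item: (not item.ai_positive, item.acquired_at))

for item in triaged:
    print(item.accession, "FLAGGED" if item.ai_positive else "routine")
```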
Stephan Romeyn
Health AI Register
And in the first case you showed, the tool also missed a finding.
We all know, of course, that AI is not perfect. So how do you deal with that in clinical practice?
Ben Madden
Lead Reporting Radiographer, Kettering General Hospital Foundation Trust
In this case, as it’s missed a fracture, we have an audit system in the background when we report it.
So when we report it as a fracture and RB called it not a fracture, we have an audit phase at the end where ED can then pick it up and change their management if required.
But any X-rays that are missed, or if there seems to be a trend in it missing fractures (for example, lateral hips can be a problem depending on the condition of the X-ray), we report that back to RB. They always have a solution, be it short- or medium-term, to work on to improve the accuracy. And there are things we can do as well, especially if it’s projection- or exposure-related.
Stephan Romeyn
Health AI Register
So you use the AI output in your reporting, but you still look at the images yourself. And you’re in close contact with Radiobotics to show them these missed findings, and they have a system in place to monitor the performance of this application over time.
Ben Madden
Lead Reporting Radiographer, Kettering General Hospital Foundation Trust
Yeah, ED is advised that the official radiology report is the gold standard and that the AI output is not something to rely on too heavily. You just use it as a guide, and we have a safety audit in place that ED do every day, where they pick up on any fractures that we call.
Stephan Romeyn
Health AI Register
How do you work with the ED department in this? Do you as a radiographer always look at the image first, and then the ED doctor? How do you do that?
Ben Madden
Lead Reporting Radiographer, Kettering General Hospital Foundation Trust
So, especially in the evenings and at night, the ED clinician will be looking at the image first. And then the next day we will pick it up and report it, usually somewhere in the 12 to 16 hour range.
So usually in a lot of the cases, especially at night, they will be managing the patient based on their clinical opinion and also RB’s opinion.
They also have access to this RB output and the secondary capture as well, so they can act on that straight away. But during the day, if there’s any query between RB and their opinion, they call us and we then give the official report.
Stephan Romeyn
Health AI Register
And can we talk about the integration into the workflow? What we normally hear is that the data orchestration can be quite challenging as well: really making sure that only the potential fracture cases are forwarded to the algorithm. How do you do that in your clinical practice?
Ben Madden
Lead Reporting Radiographer, Kettering General Hospital Foundation Trust
Yeah, it can be tricky when you’re initially setting up, if you don’t really foresee a lot of these issues. But it all depends on whether you’re sending from the X-ray modality itself or whether it’s initiated by some sort of PACS button.
Over time, talking to RB, we’ve realized they’ve helped us out quite a lot with extra filtration on our gateway, on our side. That blocks things like chest X-rays, facial bones, and cervical spine from getting analyzed, or at least from sending secondary captures back to our PACS, because abdomen X-rays especially can cause confusion. And based on the DICOM text, you then have a workflow that automatically forwards the right cases to the algorithm.
So at the minute we’re running off the modality. We will be moving to a PACS-based solution, but for now it is via the modality. So literally everything goes to the gateway, the gateway filters it, the right exams then go to the RB server, and they get bounced back.
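As a rough illustration of the kind of gateway-side filtering Ben describes, the sketch below checks a couple of DICOM header fields against an exclusion list before deciding whether a study should be forwarded for analysis. The exact fields, terms, and forwarding call are assumptions; the real gateway rules are configured per site together with Radiobotics.

```python
from pydicom import dcmread

# Illustrative exclusion list based on the body parts Ben mentions being blocked;
# a real gateway's rules (and the DICOM fields it inspects) will differ per site.
EXCLUDED_TERMS = {"CHEST", "FACIAL BONES", "CERVICAL SPINE", "ABDOMEN"}

def should_forward(dicom_path: str) -> bool:
    """Return True if the study looks like trauma skeletal imaging worth analysing."""
    ds = dcmread(dicom_path, stop_before_pixels=True)  # headers only, no pixel data
    body_part = str(getattr(ds, "BodyPartExamined", "")).upper()
    description = str(getattr(ds, "StudyDescription", "")).upper()
    text = f"{body_part} {description}"
    return not any(term in text for term in EXCLUDED_TERMS)

# Hypothetical usage: only studies that pass the filter go on to the analysis server.
# for path in incoming_studies:
#     if should_forward(path):
#         send_to_analysis_server(path)  # placeholder for the site's forwarding step
```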
Stephan Romeyn
Health AI Register
And can you tell us a bit about the impact you see in clinical practice? Have you been able to quantify that as well?
Ben Madden
Lead Reporting Radiographer, Kettering General Hospital Foundation Trust
We’ve looked at an awful lot of metrics, but the key take-home metrics are that RBfracture is 94% to 95% accurate, and it’s reduced ED missed fracture rates by 94%, which is quite a staggering amount.
In our trial period it was a 63% or 64% reduction, but going forward after that, we found the missed fracture rate has reduced by 94%. So, it’s quite staggering.
And now we’re working on how… You know, the key take-homes for most trusts who want to onboard a system like this are: how much is it going to cost me, and how is it going to save that money for me? That’s what we’re working on at the minute: seeing how long it would take for a normal trust to pay back the subscription fee for this system.
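The payback question Ben raises comes down to simple arithmetic once a trust has its own figures. The numbers below are placeholders only (no costs are quoted in the interview); they just show the shape of the calculation.

```python
# Hypothetical figures: Ben does not quote any costs, so every value here is an
# assumed placeholder used only to demonstrate a basic payback calculation.
annual_subscription = 80_000.0         # assumed yearly licence cost (GBP)
avoided_recalls_per_month = 20         # assumed missed-fracture recalls avoided
cost_per_recall = 250.0                # assumed cost of a recall/re-attendance (GBP)
reporting_hours_saved_per_month = 10.0 # assumed reporting time saved (hours)
cost_per_reporting_hour = 40.0         # assumed hourly staff cost (GBP)

monthly_saving = (avoided_recalls_per_month * cost_per_recall
                  + reporting_hours_saved_per_month * cost_per_reporting_hour)

payback_months = annual_subscription / monthly_saving
print(f"Estimated payback period: {payback_months:.1f} months")
```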
Stephan Romeyn
Health AI Register
Oh, that’s interesting information, and something you also want to publish online in the near future.
And if we talk about the key features of this application, what do you see as the key features that really make it worth using in your clinical practice?
Ben Madden
Lead Reporting Radiographer, Kettering General Hospital Foundation Trust
So, honestly, it has less of an impact when you’re an experienced reporter. It’s more for ED clinicians, and, you could say, junior reporters could find it quite useful as well.
When you’re an experienced reporter it’s less useful, but it does have one main perk, and it’s a big one: one of the main errors you can make as a reporter is satisfaction of search.
You see one problem, you finish, you’re done. But there could have been another issue, another fracture maybe. Or if you’ve got a foot X-ray, sometimes you don’t look to the peripheries of that X-ray, and you can miss a toe fracture, for example. Which isn’t very clinically relevant, but you know, nonetheless, it’s a fracture.
And even in this case that I’m showing, you know, it was quite easy to miss these transverse process fractures. But this is where it’s useful: in those cases where you wouldn’t necessarily expect a fracture to be there, it draws your attention to that area and you give it a second look. That’s when we find it really useful.
Stephan Romeyn
Health AI Register
And how was the adoption process in your department?
Ben Madden
Lead Reporting Radiographer, Kettering General Hospital Foundation Trust
So, of course, sometimes it can be difficult to truly adopt it and get everyone using it in clinical practice.
But I found ED were really supportive; they’re really great. As reporters, we’re all very skeptical, and it took a while, especially at the start when the system wasn’t as good as it is now, for us to trust it and believe in it. But now, I think we’re all there.
So the adoption process is pretty quick, really, because it’s a tool to help you. You could ignore it if you wanted to. But it’s always advantageous to have a look at these things and come to your own conclusion.
Stephan Romeyn
Health AI Register
If you could give Radiobotics any advice, what should they improve in the product?
Ben Madden
Lead Reporting Radiographer, Kettering General Hospital Foundation Trust
They do need to look at the lateral hip to see if there’s anything they can do there.
But every radiology department also needs to look at what they can do in terms of improving radiographic practice and exposures, and making sure patients are appropriately dressed before arriving in the radiology department.
All these things help RB, and then RB are always doing something in the background to improve their weak areas.
And when we say weak areas, the lateral hip came out at about 92% accurate, and the average was 94% to 95% accurate. So, it’s not very far away, but obviously the lateral hip is a very significant area.
Stephan Romeyn
Health AI Register
To wrap up: is there anything else you would like to share with the audience, maybe to help them ensure a responsible adoption of AI in their clinical practice?
Ben Madden
Lead Reporting Radiographer, Kettering General Hospital Foundation Trust
It was very challenging getting this over the line. If you’re thinking about AI and implementing it in your department, I think it’s very important to reach out to your digital team for the policies behind that.
Also, reach out to governance as soon as possible; the sooner you can get that ball rolling, the better. And to the IT department, to see how you’re going to implement this.
I think if you start initiating things there, you will speed the process up significantly. So really have an AI team on-site, with different backgrounds across clinical, technical, and regulatory, to drive the process forward.
Stephan Romeyn
Health AI Register
Really interesting, really interesting insights. Thank you very much, Ben.
And, as I said, this is new, so we’re also happy to hear feedback from the audience, and to hear other questions we should ask in the next episodes, or other products we should cover next.
Thank you very much!