Another classic AI paper
Posted by Unknown Member on October 9, 2020 at 10:54 am
CONCLUSION:
AI AS GOOD AS OR BETTER THAN mid-level rad residents
EXCEPT:
“In general, residents performed better for more subtle anomalies, such as masses and nodules, misplaced lines and tubes, and various forms of consolidation, while the AI algorithm was better at detecting nonanomalous findings, the presence of tubes and lines, and clearly visible anomalies, such as cardiomegaly, pleural effusion, and pulmonary edema,” the authors wrote. “Conversely, the AI algorithm generally performed worse for lower-prevalence findings that also had a higher level of difficulty of interpretation, such as masses or nodules and enlarged hilum.”
NICE.
The residents found misplaced tubes, pneumonia and cancer.
BUT AI was better at always noting whether a tube or line was present, and other useless BS like possible CHF / enlarged cardiopericardial silhouette, which residents presumably ignored when the NGT was in the lungs.
Caveat: I am commenting on the AM summary and haven't read the paper yet.
12 Replies
-
Haha, yes, for things that really matter to patients … humans are better.
But AI costs less. Shocking news, once again.
-
Unknown Member (Deleted User) posted on October 9, 2020 at 12:19 pm
Is IBM’s lead in AI for medical imaging similar to Tesla’s lead in AI for self-driving cars? Is anyone else close to IBM’s AI?
-
IBM doesn’t have AI. Keep an eye on Google / Apple and startups.
Chinese also making real progress.
-
We're still early. Progress will be quick. Already being used to prioritize cases. The clock is ticking toward a drastic decrease in rad demand. AI doesn't have to be nearly as good as us. Money >>> quality.
-
Unknown Member (Deleted User) posted on October 9, 2020 at 1:36 pm
Why do people keep saying that AI will decrease demand?
AI will mark 10 different things on a study that radiologists classically passed over, but now they have to “clear all of them,” which will take more time.
-
Quote from Hospital-Rad
Why do people keep saying that AI will decrease demand?
AI will mark 10 different things on a study that radiologists classically passed over, but now they have to “clear all of them,” which will take more time.
Exactly. Non-radiologists will be just as scared as they are now, if not more so.
“The AI pointed out 10 potential findings, Dr. Gastroenterologist, sir. Why didn’t you follow all of those up?”
LOL
-
Quote from Drrad123
We're still early. Progress will be quick. Already being used to prioritize cases. The clock is ticking toward a drastic decrease in rad demand. AI doesn't have to be nearly as good as us. Money >>> quality.
Yes, the quality part is the overall loss to the system; the gain is an attempt at greater margins.
The funny thing is that “progress” is always assumed. More futurist religion stuff.
-
AI is already prioritizing cases to be read at some telerads. Filtering out normals is next. Eventually, providing reads.
As a kid, I couldn't have imagined some of the technology we have now. Accept the reality and adjust for an uncertain future.
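For anyone wondering what "AI prioritizing cases" looks like mechanically, here is a minimal sketch (hypothetical accession numbers, urgency scores, and function names, not any vendor's actual product): the model scores each incoming study and the worklist simply pops the highest score first, so likely-critical cases surface ahead of the routine ones.
[code]
# Minimal sketch of AI-driven worklist prioritization (hypothetical data, not a vendor API).
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Study:
    priority: float                        # negated AI urgency score, so the heap pops the most urgent first
    accession: str = field(compare=False)

def prioritized_worklist(cases):
    """cases: iterable of (accession, ai_urgency_score in [0, 1]); yields accessions, most urgent first."""
    heap = [Study(priority=-score, accession=acc) for acc, score in cases]
    heapq.heapify(heap)
    while heap:
        yield heapq.heappop(heap).accession

# The suspected-critical study (score 0.97) jumps ahead of the routine follow-ups.
incoming = [("CXR-1001", 0.12), ("CXR-1002", 0.97), ("CXR-1003", 0.40)]
print(list(prioritized_worklist(incoming)))   # ['CXR-1002', 'CXR-1003', 'CXR-1001']
[/code]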
-
One of the diagnostic mammos I read on Friday was from an outside screening from the San Francisco area. The only comment in the report on what they wanted worked up was that it was marked by CAD in the left breast. They probably found some fossil they could pay pennies to read that gold.
-
Well, I read the AI study in its entirety and this is my take on it.
PACSMan
From the IBM study:
“There was no statistically significant difference in sensitivity between the AI algorithm and the radiology residents, but the specificity and positive predictive value were statistically higher for the AI algorithm.”
Meaning: “These findings suggest that well-trained AI algorithms can reach performance levels similar to radiology residents in covering the breadth of findings in AP frontal chest radiographs, which suggests there is the potential for the use of AI algorithms for preliminary interpretations of chest radiographs in radiology workflows to expedite radiology reads, address resource scarcity, improve overall accuracy, and reduce the cost of care.”
Translation: To me this reads use AI instead of a rad…
Objective: “To assess the performance of artificial intelligence (AI) algorithms in realistic radiology workflows by performing an objective comparative evaluation of the preliminary reads of anteroposterior (AP) frontal chest radiographs performed by an AI algorithm and radiology residents.”
Translation: AI vs. rads.
Conclusions and Relevance: “These findings suggest that it is possible to build AI algorithms that reach and exceed the mean level of performance of third-year radiology residents for full-fledged preliminary read of AP frontal chest radiographs.”
Translation: AI is better than third-year residents…
“This diagnostic study also found that while the more complex findings would still benefit from expert overreads, the performance of AI algorithms was associated with the amount of data available for training rather than the level of difficulty of interpretation of the finding. Integrating such AI systems in radiology workflows for preliminary interpretations has the potential to expedite existing radiology workflows and address resource scarcity while improving overall accuracy and reducing the cost of care.”
Translation: The key words here are “preliminary interpretations”: let AI read it first, then if anything is found, send it over to the rad. If not, rely on the AI interpretation to “reduce the cost of care.” You are going to see more and more of this as time goes on.
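To make the sensitivity / specificity / PPV comparison above concrete, here is a minimal sketch with made-up counts (not the paper's data) showing how the three numbers fall out of a confusion matrix, and how a reader can match on sensitivity while still posting higher specificity and PPV simply by raising fewer false alarms:
[code]
# Minimal sketch of the three metrics compared in the study, using made-up counts.
def metrics(tp, fp, tn, fn):
    sensitivity = tp / (tp + fn)   # of all true findings, how many were flagged
    specificity = tn / (tn + fp)   # of all normals, how many were correctly left alone
    ppv = tp / (tp + fp)           # of all flags raised, how many were real
    return round(sensitivity, 3), round(specificity, 3), round(ppv, 3)

# Hypothetical 1,000-study set: same sensitivity, but the second reader raises half the false alarms.
print(metrics(tp=90, fp=60, tn=840, fn=10))   # resident-like: (0.9, 0.933, 0.6)
print(metrics(tp=90, fp=30, tn=870, fn=10))   # AI-like:       (0.9, 0.967, 0.75)
[/code]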
Just another “AI is better than rads” study couched differently….
-
They really should have done this study with an AI vendor that has an FDA-cleared chest AI, like Zebra Medical Vision, for real-world results.
-
Quote from ThePACSman
Well, I read the AI study in its entirety and this is my take on it. [...] Just another “AI is better than rads” study couched differently….
If AI starts clearing the list of normal or uncomplicated studies, rads will be in huge trouble. A normal CT CAP and a complex CT CAP reimburse the same.
Tbh, I think we are 10-15 years away from serious disruption. Have we already passed the point when we should limit training spots? Probably. But many of us younger trainees will have to find out the hard way.
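To see why losing the normals stings when payment is flat, here is a back-of-the-envelope sketch with made-up read times and a made-up flat payment (hypothetical numbers, not real RVU or reimbursement figures):
[code]
# Back-of-the-envelope sketch: if the easy normals vanish from the list, the average
# minutes per study climbs while the flat payment per study stays the same.
FLAT_PAYMENT = 100                      # hypothetical payment per CT CAP, normal or complex
NORMAL_MIN, COMPLEX_MIN = 6, 25         # hypothetical read times in minutes

def hourly_revenue(fraction_normal):
    avg_minutes = fraction_normal * NORMAL_MIN + (1 - fraction_normal) * COMPLEX_MIN
    return round(60 / avg_minutes * FLAT_PAYMENT)

print(hourly_revenue(0.6))   # mixed list:           ~441 per hour
print(hourly_revenue(0.0))   # normals filtered out:  240 per hour
[/code]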
-