In other news, AI can detect race from chest x-rays with 99% accuracy and is so good at doing so it "can detect race from images filtered so heavily they are just blank grey squares"
Of course radiologists are freaking out because they don't understand how that's possible (probably because they've never tried) and want to neuter the AI so it's not "biased".
If it's correctly identifying them when they are just blank grey squares, maybe they just overfit the model? I.e., the AI isn't actually detecting anything at all; it just has the dataset memorized.
Granted, I haven't read through all of it, but it seems to me that the grey-square-filtered images are all ones that had been fed to the AI before the filters were applied.
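A rough way to sanity-check that, if you can get hold of the model and a held-out split: score heavily filtered images the model has never seen. This is just a stand-in sketch (synthetic data and a plain logistic regression, not the paper's pipeline):

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in data: random 64x64 "images" with a faint class-dependent offset,
# purely so the script runs end to end. Swap in the real x-rays and model.
n, size = 400, 64
labels = rng.integers(0, 2, n)
images = rng.normal(0.0, 1.0, (n, size, size)) + labels[:, None, None] * 0.05

def heavy_filter(imgs, sigma=16):
    # Blur so aggressively the images end up as near-uniform grey squares.
    return np.stack([gaussian_filter(im, sigma) for im in imgs])

X_train, X_test, y_train, y_test = train_test_split(
    images, labels, test_size=0.5, random_state=0)

# Train on the unfiltered images, the way the original model presumably was.
clf = LogisticRegression(max_iter=2000)
clf.fit(X_train.reshape(len(X_train), -1), y_train)

# Then score heavily filtered versions of (a) images the model trained on
# and (b) images it has never seen.
train_acc = clf.score(heavy_filter(X_train).reshape(len(X_train), -1), y_train)
test_acc = clf.score(heavy_filter(X_test).reshape(len(X_test), -1), y_test)
print(f"filtered train-set acc: {train_acc:.2f}  filtered held-out acc: {test_acc:.2f}")
```

If accuracy on the filtered training images stays high while the filtered held-out images drop to chance, memorization is the better explanation. If both stay high, something in the filtered images really does carry the signal.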
From the replies, this might be the AI detecting what is apparently an already known difference in bone density between races. That makes this much less of an intriguing mystery, but it still marks that radiologist as a wilfully ignorant cultist.
Any details on the bone density differences? I'm genuinely curious.
Here is one study on the subject.
What do you mean, "tried"?
Has a radiologist ever tried to determine a patient's race from chest x-rays? Probably not. This result implies they probably could if they tried, and the proper scientific response would be to try to understand how the AI did it instead of being afraid of the "racist" AI.
Oh ok, thanks, I wasn't sure what you were getting at. That would be all too sensible.
If the program is written to produce the best possible health outcome for the individual patient, its ability to accurately infer additional data from inputs is unequivocally a good thing.
Also, the rightmost image in the post is actually a blank grey square, and the model's accuracy fell off as it approached that point. The last image on which it still had some degree of accuracy is one a human could recognize as a heavily blurred chest x-ray.
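That falloff is easy to chart if you have the model: run a held-out set through progressively stronger blurs and score each pass. Another stand-in sketch (synthetic data and a plain logistic regression again, not the paper's network):

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Same kind of stand-in data as before; replace with the real x-ray set.
n, size = 400, 32
y = rng.integers(0, 2, n)
X = rng.normal(0.0, 1.0, (n, size, size)) + y[:, None, None] * 0.05
model = LogisticRegression(max_iter=2000).fit(X[:200].reshape(200, -1), y[:200])

# Score the held-out half on ever-more-blurred copies. Accuracy that decays
# smoothly toward chance as sigma grows points to a real but low-frequency
# signal; accuracy pinned near 99% on an actual blank square would be the
# red flag for leakage or memorization.
for sigma in (0, 2, 4, 8, 16, 32):
    blurred = np.stack([gaussian_filter(im, sigma) for im in X[200:]])
    acc = model.score(blurred.reshape(200, -1), y[200:])
    print(f"sigma={sigma:>2}: held-out accuracy {acc:.2f}")
```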