MedCognetics receives FDA 510(k) clearance for breast cancer screening software

QmTRIAGE analyzes 2D full-field digital mammography to flag abnormalities for radiologists' review.
By Jessica Hagen
12:14 pm

Texas-based MedCognetics received FDA 510(k) clearance for its AI-powered breast cancer screening software QmTRIAGE.

QmTRIAGE uses AI to analyze 2D full-field digital mammography screenings, flagging those suggestive of abnormalities for radiologists' review.

MedCognetics' software was trained on de-identified clinical data from UT Southwestern Medical Center in Dallas and builds on intellectual property from UT Dallas' Quality of Life Technology Laboratory to improve early breast cancer detection.

UT Southwestern Medical Center and UT Dallas hold equity in the company. 

"MedCognetics is committed to leveraging our technology to help improve outcomes across a diverse group of patients, and to do so, partnered with both University of Texas at Dallas and University of Texas Southwestern Medical Center (UTSW) to address these disparities. In addition to this, our software's high detection accuracy enables reduced time for review by radiologists, another key component to improved outcomes. The FDA's clearance is a very important first step for us as we work toward expanding to other realms of cancer," Debasish Nag, CEO of MedCognetics, said in a statement.

THE LARGER TREND

Tech giant Google developed AI-based mammography technology that reduced the rates of false positives and false negatives, outperforming radiologists in a study published in Nature in 2020.

Last month, med-tech company iCAD announced a strategic development and commercialization agreement to incorporate the Alphabet subsidiary's mammography AI technology into its breast-imaging solutions, bringing the Google technology into clinical practice.

With AI's growing use in healthcare, experts have stressed the importance of organizations guarding against bias in their data by including individuals from diverse backgrounds in datasets.

A study published earlier this year in the Journal of the American Medical Informatics Association noted that AI models that perform well for one group of people can fail for others; addressing bias in AI and machine learning therefore demands a holistic approach that incorporates numerous perspectives.
