A groundbreaking AI model developed by researchers from Iraq and Australia can diagnose various diseases with up to 98% accuracy by analyzing tongue images, combining traditional Chinese medicine practices with cutting-edge technology. As reported by PopSci, this innovative system draws inspiration from a 2,000-year-old diagnostic approach, using machine learning to detect health conditions ranging from diabetes to COVID-19 based on tongue color and texture.
The AI tongue scanner relies on a machine learning algorithm trained on 5,260 images spanning seven colors across different saturations and lighting conditions. The dataset included 300 "gray" entries representing unhealthy tongues and 310 "red" entries representing healthy ones. The system was further refined with 60 photos from Iraqi teaching hospitals in Dhi Qar and Mosul, covering a mix of healthy tongues and tongues showing various diseases. During testing, volunteers positioned their tongues 20 cm from a USB webcam connected to the program, allowing real-time analysis (a minimal capture-and-classify sketch appears below). The technology, developed by researchers from the University of South Australia and Iraq's Middle Technical University, modernizes an ancient diagnostic practice while offering a potentially cost-effective and efficient method for disease screening.
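To make the setup concrete, here is a minimal sketch of what a webcam-based color check might look like. It is not the researchers' published pipeline: the hue thresholds, region-of-interest choice, and masking rules below are illustrative assumptions, and the actual system was trained on thousands of labeled images rather than hand-tuned ranges.

```python
import cv2
import numpy as np

# Hypothetical hue ranges on OpenCV's 0-179 hue scale; the published system's
# real color model and thresholds are not specified here.
HUE_LABELS = [
    (0, 10, "red"),
    (10, 25, "yellow"),
    (25, 85, "green"),
    (85, 130, "blue/indigo"),
    (130, 155, "purple"),
    (155, 180, "red"),
]

def dominant_tongue_color(frame_bgr, roi):
    """Estimate the dominant color inside a manually chosen tongue region."""
    x, y, w, h = roi
    patch = frame_bgr[y:y + h, x:x + w]
    hsv = cv2.cvtColor(patch, cv2.COLOR_BGR2HSV)
    hue, sat, val = cv2.split(hsv)
    # Ignore very dark or desaturated pixels (shadow, teeth, specular glare).
    mask = (sat > 40) & (val > 40)
    if not mask.any():
        return "undetermined"
    mean_hue = float(np.mean(hue[mask]))
    for lo, hi, label in HUE_LABELS:
        if lo <= mean_hue < hi:
            return label
    return "undetermined"

if __name__ == "__main__":
    cam = cv2.VideoCapture(0)  # USB webcam, as in the reported setup
    ok, frame = cam.read()
    cam.release()
    if ok:
        # Placeholder region of interest; a real system would detect the tongue.
        h, w = frame.shape[:2]
        roi = (w // 3, h // 3, w // 3, h // 3)
        print("Dominant tongue color:", dominant_tongue_color(frame, roi))
```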
Tongue color and texture serve as crucial indicators for various health conditions, according to the research team led by Ali Al-Naji. A yellow tongue often suggests diabetes, while a purple tongue with a thick greasy coating may indicate cancer. Acute stroke patients typically present with an unusually shaped red tongue, and anemia is associated with a white tongue. Severe COVID-19 cases frequently exhibit a deep red tongue, while indigo or violet coloration can point to vascular or gastrointestinal issues, or to asthma. This diagnostic approach, rooted in traditional Chinese medicine, has been practiced for over two millennia and is now being revolutionized through the integration of artificial intelligence.
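Read purely as a lookup, the color-to-condition associations listed above could be tabulated as follows. This is a sketch of the correspondences described in the article, not the researchers' decision logic, and the output is a screening hint rather than a diagnosis.

```python
# Illustrative mapping from tongue appearance to the conditions the article
# associates with each color; labels and keys are assumptions for this sketch.
TONGUE_SIGNS = {
    "yellow": "possible diabetes",
    "purple (thick greasy coating)": "possible cancer",
    "red (unusual shape)": "possible acute stroke",
    "white": "possible anemia",
    "deep red": "possible severe COVID-19",
    "indigo/violet": "possible vascular or gastrointestinal issues, or asthma",
}

def screen(color_label: str) -> str:
    """Return the condition the article associates with a tongue appearance."""
    return TONGUE_SIGNS.get(color_label, "no association recorded")

print(screen("yellow"))  # -> possible diabetes
```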
The AI tongue scanner demonstrated remarkable precision during testing, achieving a 96.6% accuracy rate when analyzing 60 tongue images. In a broader experiment, its accuracy exceeded 98% in detecting ailments that produce visible changes in tongue color. These results were obtained with the same simple setup: volunteers positioned their tongues 20 cm from a USB webcam for scanning. The high accuracy rates highlight the technology's potential as a reliable diagnostic tool, capable of efficiently identifying different health conditions from tongue appearance. The researchers believe the experiments show that similar or improved AI systems could feasibly be incorporated into medical facilities, offering a secure, efficient, and cost-effective method for disease screening.
Researchers envision integrating the AI technology into smartphone apps, enabling users to receive instant health assessments by simply capturing a photo of their tongue. This advancement could significantly improve the accessibility and convenience of medical diagnostics. Before widespread adoption, however, the system must overcome challenges such as ensuring patient data privacy and mitigating the effect of camera reflections on diagnostic accuracy. Despite these hurdles, the prospect of a secure, efficient, and user-friendly disease-screening method that combines modern technology with centuries-old practice remains promising.