ButSpeak.com
News which Matters.
A groundbreaking AI algorithm developed by Iraqi and Australian researchers can predict diseases with 98% accuracy by analyzing tongue color, offering potential advancements in medical diagnostics.
A pioneering computer algorithm has achieved 98% accuracy in diagnosing a range of diseases from tongue color, a significant advance in medical diagnostics. The imaging system, developed by a collaborative team of Iraqi and Australian researchers, promises to improve disease prediction and management by detecting subtle changes in tongue coloration.
The breakthrough was achieved by engineering researchers from Middle Technical University (MTU) and the University of South Australia (UniSA). In a series of experiments, the team used 5,260 tongue images to train machine-learning algorithms to detect variations in tongue color associated with different health conditions. Their research also included 60 patient images from two teaching hospitals in the Middle East, demonstrating the system's efficacy in correlating tongue color with specific diseases.
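The core idea — learning a color "signature" per condition from labelled tongue images, then matching new images against those signatures — can be illustrated with a toy sketch. The researchers' actual system uses machine-learning models trained on thousands of images; the nearest-centroid classifier and all RGB values below are invented purely for illustration.

```python
import math

def mean_rgb(pixels):
    """Average colour of one image, given as a list of (r, g, b) pixels."""
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

def train(labelled_images):
    """Compute one mean-colour centroid per condition label."""
    centroids = {}
    for label, images in labelled_images.items():
        means = [mean_rgb(img) for img in images]
        centroids[label] = tuple(
            sum(m[i] for m in means) / len(means) for i in range(3)
        )
    return centroids

def classify(pixels, centroids):
    """Assign the condition whose colour centroid is nearest in RGB space."""
    avg = mean_rgb(pixels)
    return min(centroids, key=lambda c: math.dist(avg, centroids[c]))

# Toy labelled data: a yellowish class and a pale class (invented values).
data = {
    "diabetes (yellowish)": [[(200, 180, 60)] * 50, [(196, 184, 64)] * 50],
    "anaemia (pale)": [[(230, 212, 206)] * 50],
}
model = train(data)
print(classify([(198, 182, 58)] * 100, model))  # -> diabetes (yellowish)
```

A real pipeline would segment the tongue from the image and use far richer features than a single mean color, but the principle — color patterns mapped to conditions — is the same one the study automates.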
The system’s ability to diagnose conditions such as diabetes, stroke, anaemia, asthma, liver and gallbladder issues, COVID-19, and other vascular and gastrointestinal disorders underscores the potential of artificial intelligence in advancing medical diagnostics. AI can replicate a practice rooted in traditional Chinese medicine, where examining the tongue has long been used to detect health issues.
Senior author Ali Al-Naji, Adjunct Associate Professor at both MTU and UniSA, highlights that the AI model mirrors a 2000-year-old diagnostic technique. According to Al-Naji, different tongue colors can indicate specific health conditions: a yellow tongue may suggest diabetes, a purple tongue with a greasy coating could be associated with cancer, and an unusually shaped red tongue might indicate an acute stroke. Other conditions, such as anaemia and severe COVID-19, can also be inferred from tongue coloration, with colors ranging from white to deep red and indigo.
The imaging system captures tongue color using cameras positioned 20 centimeters from the patient, providing real-time predictions of health conditions. Looking ahead, co-author Professor Javaan Chahl from UniSA envisions the potential for this technology to be adapted for smartphone use, further democratizing access to advanced diagnostic tools and facilitating easier and more widespread health monitoring.
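The real-time loop described above — grab a frame from a camera about 20 centimeters from the patient, measure the color of the tongue region, and report the closest match — might be structured as in the following sketch. Camera access is stubbed out so the example stays self-contained, and the reference colors and bounding box are assumptions, not values from the study.

```python
import math

# Assumed reference colours (purely illustrative, not from the paper).
REFERENCE = {
    "healthy (pink)": (220, 130, 140),
    "diabetes (yellowish)": (200, 180, 60),
}

def capture_frame(width=8, height=8):
    """Stub for a camera positioned ~20 cm from the patient.
    A real system would grab frames from a webcam or phone camera."""
    return [[(201, 179, 62)] * width for _ in range(height)]

def region_mean(frame, top, left, h, w):
    """Mean RGB over an assumed tongue bounding box (a real system
    would segment the tongue automatically)."""
    pixels = [px for row in frame[top:top + h] for px in row[left:left + w]]
    return tuple(sum(p[i] for p in pixels) / len(pixels) for i in range(3))

def predict(frame):
    """Classify one frame by its nearest reference colour."""
    avg = region_mean(frame, 2, 2, 4, 4)
    return min(REFERENCE, key=lambda k: math.dist(avg, REFERENCE[k]))

print(predict(capture_frame()))  # yellowish frame -> "diabetes (yellowish)"
```

On a smartphone, as Professor Chahl envisions, only the `capture_frame` stub would change; the color-matching step is cheap enough to run on-device in real time.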
This advancement represents a significant leap in integrating AI with traditional medical practices, showcasing how technology can enhance our understanding and management of health conditions through seemingly simple yet effective means. Full details are available in the team's published research paper.