Google uses AI to predict early signs of disease

By Aba al-Hassan Abbas, 2024-09-07 05:48

Google has announced an AI model that uses audio signals to predict early signs of illness.


Google’s AI division is drawing on bioacoustics, a field that blends biology and acoustics. Among other things, bioacoustics helps researchers understand how pathogens affect the human voice; our voices carry clear signals about our well-being.


According to a Bloomberg report, the company has created an AI model that uses audio signals to predict early signs of illness. In places where access to quality healthcare is difficult, this technology could serve as a stopgap, requiring nothing more than a smartphone’s microphone.


How does Google's bioacoustics AI work?


Google's bioacoustics AI model is called HeAR (Health Acoustic Representations). It was trained on 300 million two-second audio samples, including coughs, sneezes, and breathing patterns; 100 million of the samples are cough sounds.
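Google has not published the exact preprocessing HeAR uses, but audio models like this are typically fed a time-frequency representation of each short clip rather than the raw waveform. As a generic, illustrative sketch (not Google's pipeline), here is how a two-second clip might be turned into a magnitude spectrogram in Python; all parameter values are common defaults chosen for illustration:

```python
import numpy as np

def spectrogram(signal, frame_len=400, hop=160):
    """Magnitude spectrogram via a short-time Fourier transform.

    frame_len=400 and hop=160 correspond to 25 ms windows with a
    10 ms step at a 16 kHz sample rate (illustrative values).
    """
    window = np.hanning(frame_len)
    n_frames = 1 + (len(signal) - frame_len) // hop
    frames = np.stack([signal[i * hop : i * hop + frame_len] * window
                       for i in range(n_frames)])
    # One row per frame, one column per frequency bin.
    return np.abs(np.fft.rfft(frames, axis=1))

# A synthetic 2-second clip at 16 kHz (a stand-in for a recorded cough).
sr = 16000
t = np.linspace(0, 2, 2 * sr, endpoint=False)
clip = np.sin(2 * np.pi * 440 * t)

spec = spectrogram(clip)
print(spec.shape)  # (198, 201): 198 frames x 201 frequency bins
```

A model trained on 300 million such clips would see each one as an array of this kind, from which it learns representations that can flag acoustic markers of illness.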

