As David Ancalle opened video after video of diarrhea this year, it struck him: this was not what he expected to be doing for his Ph.D.
Ancalle, a mechanical engineering Ph.D. student at Georgia Tech who studies fluid dynamics, is working to demystify the acoustics of urination, flatulence and diarrhea. His team is training an AI to recognize and analyze the sound of every bathroom phenomenon, because research suggests that tracking the flow of our excreta could benefit public health.
What’s new – Ancalle and Maia Gatlin, an aerospace engineer at the Georgia Tech Research Institute (GTRI), created a mechanical device loaded with pumps, nozzles and tubes designed to recreate the physics (and sounds) of human bodily functions. They called it the Synthetic Human Acoustic Reproduction Tester (yes, SHART).
SHART is now training an artificial intelligence algorithm that could one day detect deadly diseases like cholera and help stop an epidemic, according to a presentation last week at the annual meeting of the American Physical Society’s Division of Fluid Dynamics. Ancalle and Gatlin’s results have not yet been published in a peer-reviewed journal.
Here’s the backstory – Diarrheal diseases such as cholera kill 500,000 children a year, making them the third leading cause of child mortality in the world. “There is an outbreak and resurgence in Haiti as we speak,” Gatlin says. Improving disease detection would aid treatment and help prevent outbreaks, she explains.
Why it matters – The goal is to combine the machine learning model with low-cost sensors and deploy them in regions prone to diarrheal disease outbreaks. “And as we classify those events, we can start to collect that data,” Gatlin says. “It can say, ‘Hey, we’re seeing an outbreak of a lot of diarrhea.’ Then we can start to quickly diagnose what’s going on in an area.”
What did they do – Until recently, Ancalle didn’t think much about diarrhea. “Our initial focus in that first year was really on gas and urination,” he says. He and his colleagues were trying to relate the sound of a fart to the internal geometry of the rectum, since abnormal changes could indicate cancer. “After talking with gastroenterologists, we decided it would be a good non-invasive route to try.”
But the project soon expanded: Ancalle teamed up with GTRI researchers who were devising ways to passively detect outbreaks of gastrointestinal disease. Perhaps, they wondered, next-generation toilets could do more than collect excrement—they could also help alert communities to an outbreak.
This is where acoustics come into play. Audio is easier to analyze remotely than video or self-reporting, and it is less invasive and cumbersome than a medical examination. And the sounds of our excretions (urination, flatulence, solid defecation, and diarrhea) are distinct. The team realized that a low-cost device and an AI algorithm could make sense of this toilet data.
They started by sorting publicly available audio and video of excretion events, capturing the frequency spectrum of each and feeding those spectra to a machine learning algorithm. Their AI then learned from all that doodoo data until it was ready for testing against the SHART machine.
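The article doesn’t say which model or features the team used, but the pipeline it describes (audio clip → frequency spectrum → classifier label) can be sketched in miniature. The following is an illustrative toy, not the researchers’ code: it uses coarse FFT bands as the “frequency spectrum” features and a simple nearest-centroid rule in place of their unspecified machine learning algorithm, with synthetic tones standing in for real bathroom audio.

```python
import numpy as np

def spectrum_features(signal, n_bands=16):
    """Average FFT magnitude in coarse frequency bands (a stand-in for the
    team's frequency-spectrum features; the band count is an assumption)."""
    mags = np.abs(np.fft.rfft(signal))
    return np.array([band.mean() for band in np.array_split(mags, n_bands)])

class CentroidClassifier:
    """Toy nearest-centroid model standing in for the unspecified ML algorithm."""
    def fit(self, X, y):
        self.labels = sorted(set(y))
        self.centroids = {
            lab: np.mean([x for x, yy in zip(X, y) if yy == lab], axis=0)
            for lab in self.labels
        }
        return self

    def predict(self, x):
        # Label whose average spectrum is closest to this clip's spectrum.
        return min(self.labels,
                   key=lambda lab: np.linalg.norm(x - self.centroids[lab]))

# Synthetic "events": a low-frequency rumble vs. a high-frequency hiss,
# stand-ins for two acoustically distinct excretion sounds (1 s at 8 kHz).
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 8000, endpoint=False)
def tone(freq_hz):
    return np.sin(2 * np.pi * freq_hz * t) + 0.1 * rng.standard_normal(t.size)

X = [spectrum_features(tone(f)) for f in (60, 80, 900, 1100)]
y = ["rumble", "rumble", "hiss", "hiss"]
model = CentroidClassifier().fit(X, y)
print(model.predict(spectrum_features(tone(70))))  # -> rumble
```

A real system would swap the synthetic tones for labeled recordings and the centroid rule for a trained classifier, but the flow of data is the same as the one the article describes.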
The SHART machine is several feet wide and has many nozzles and attachments. The team pumped water through the machine and recorded the sounds. They learned the physics behind the sound of each excretion and designed the device to simulate the same dynamics, tinkering with different attachments for each subsystem. “A lot of thought goes into each of the sounds,” says Gatlin. “There was a subsystem for every sound on that little machine.”
“It’s actually performing pretty well,” she continues. Their algorithm identified the correct “excretion event” up to 98 percent of the time, according to early data.
The team is also exploring the fundamental physics at play. In the conference presentation, Ancalle described how the team modeled the sound of male urination: a stream that breaks into droplets, which splash in sequence.
If the geometry of the urethra changes, the stream and sound change. Ancalle is now working with urologists to use the same machine learning approach to detect irregular changes in urination and flatulence based on this idea.
“Self-reporting is not very reliable,” says Ancalle. “We’re trying to find a non-invasive way to notify people whether or not they should go in for a screening.” Something like: Hey, your urine isn’t flowing as fast as it should. Your farts don’t sound right. You should get it checked out. The team hypothesizes that changes in the tract, from cancer or another condition, will show up in these acoustics.
“It’s reasonable to assume you could detect it with microphones,” said Jared Barber, an applied mathematician at Indiana University who led the session but was not involved in the research. Ancalle also worked on a model of female urination, but only completed the male model in time for his presentation.
What next – The researchers are looking to expand their tests and eventually create a deployable device that could include a small Raspberry Pi computer. Gatlin envisions pairing this project with ongoing sustainable toilet projects.
Barber notes that the work is very preliminary, but he was encouraged by the talk. “It looks like it could have a very big impact,” Barber says. “Everything seems feasible. They’re using techniques that could plausibly provide that diagnostic ability.”
It’s still early days, but the team is designing with the end product in mind. “We’re not trying to offer million-dollar equipment,” says Ancalle. “We’re trying to make this something that everyone can afford,” especially since the project is focused on areas with weak health systems. “The accessibility aspect is very important to us.”