That title is one of the most unusual here on Inquirer Tech, but it refers to a real, ongoing AI project. Believe it or not, Pennsylvania State University researchers developed an electronic tongue that could enable robots to taste food. Saptarshi Das, a Penn State associate professor, says the device can detect sweet, salty, sour, bitter, and umami flavors.
This project isn’t only a weird attempt to make robots eat food. It is part of a more ambitious effort to create robots that act more like humans. Modern artificial intelligence has long tried to make machines think like us, so why not let them sense like us, too? Machines that share our senses could serve our needs better.
This article will discuss how the PennState AI tongue works. Later, I will discuss another project that uses artificial intelligence to give machines one of the five senses.
How does the AI tongue work?
The electronic tongue has two parts: synthetic taste buds and an electronic “gustatory cortex.” The taste buds are tiny, graphene-based electronic sensors called chemitransistors that detect chemical or gas molecules.
The cortex is built from memtransistors, transistors made of molybdenum disulfide that remember past signals. Together, they link a physiology-driven “hunger neuron,” a psychology-driven “appetite neuron,” and a “feeding circuit.”
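To picture how such a circuit might behave, here is a minimal software sketch, assuming made-up signal names and thresholds. It only illustrates the hunger-plus-appetite idea, not the team’s actual molybdenum disulfide hardware.

```python
# A minimal sketch, not the Penn State device: it only illustrates a "feeding circuit"
# that weighs a physiology-driven hunger signal against a psychology-driven appetite
# signal. All names and thresholds here are illustrative assumptions.

def feeding_circuit(energy_level: float, taste_appeal: float,
                    hunger_threshold: float = 0.4,
                    appetite_threshold: float = 0.6) -> bool:
    """Return True if the simulated circuit decides to 'eat'."""
    hunger = 1.0 - energy_level   # "hunger neuron": stronger when energy is low
    appetite = taste_appeal       # "appetite neuron": stronger when food tastes good
    # Feed only when the body needs energy AND the taste is appealing.
    return hunger > hunger_threshold and appetite > appetite_threshold

print(feeding_circuit(energy_level=0.2, taste_appeal=0.9))  # True: hungry and tasty
print(feeding_circuit(energy_level=0.9, taste_appeal=0.9))  # False: already sated
```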
For example, the AI tongue detects sodium ions when tasting salty food. “This means the device can ‘taste’ salt,” doctoral student Subir Ghosh explained.
The research team sees numerous applications for the AI tongue. For example, a robot with that component could recommend diets that are appetizing to humans because it can “taste” meals.
“The example I think of is people who train their tongue and become wine tasters. Perhaps in the future, we can have an AI system that you can train to be an even better wine taster,” Das stated in the Penn State report.
The unique device could also help artificial intelligence programs develop weight loss plans. However, the team plans to broaden the electronic tongue’s range first.
“We are trying to make arrays of graphene devices to mimic the 10,000 or so taste receptors we have on our tongue that are each slightly different compared to the others, which enables us to distinguish between subtle differences in tastes,” Das said.
“We want to fabricate both the tongue part and the gustatory circuit in one chip to simplify it further,” Ghosh stated. “That will be our primary focus for the near future in our research.”
They also want to create similar devices for other senses. “The circuits we have demonstrated were very simple, and we would like to increase the capacity of this system to explore other tastes,” graduate research assistant Andrew Pannone stated.
How does an AI nose work?
If the AI tongue seemed weird, here’s another strange tech project. Joel Mainland and his team created an AI nose because we lacked “information regarding sensory relationships concerning smell.”
Unlike the PennState study, Mainland doesn’t want to make robots act more like humans. Instead, he wants to know more about the molecular properties that make things emit specific odors.
That is why he and his colleagues created an AI model that correlates a molecule’s smell with its molecular structure. Here’s how the program works, with a simplified code sketch after the list:
- They created a dataset containing the molecular makeup and olfactory traits of 5,000 recognized odorants.
- The scientists submitted it to an AI model for training.
- Next, its algorithms predicted which odor words would best fit each molecule’s aroma.
- Monell Chemical Senses Center researchers conducted a blind validation survey to verify the model’s effectiveness. They gave 15 panelists 400 odorants and asked them to describe each by picking from 55 words like musty and mint.
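For readers curious what that kind of pipeline looks like in code, here is a toy sketch assuming scikit-learn and random placeholder data. The features, odor words, and numbers are illustrative stand-ins, not the researchers’ actual dataset or model.

```python
# Toy multi-label sketch of the pipeline above: molecular features in, odor words out.
# The data, feature count, and word list are placeholders, not the real dataset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_molecules, n_features = 200, 32                   # pretend structural descriptors
odor_words = ["musty", "mint", "fruity", "smoky"]   # stand-ins for the 55 descriptors

X = rng.random((n_molecules, n_features))                 # molecular makeup
y = rng.integers(0, 2, (n_molecules, len(odor_words)))    # which words panelists picked

# Random forests in scikit-learn handle multi-label targets directly.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, y)

new_molecule = rng.random((1, n_features))
pred = model.predict(new_molecule)[0]
print([word for word, flag in zip(odor_words, pred) if flag])  # predicted odor words
```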
They discovered that the AI outperformed humans on 53% of the compounds tested. It also performed olfactory tasks the researchers had not trained it for.
“The eye-opener was that we never trained it to learn odor strength, but it could nonetheless make accurate predictions,” Mainland said. The researchers also believe the work brings the “world closer to digitizing odors to be recorded and reproduced.”
“It also may identify new odors for the fragrance and flavor industry that could not only decrease dependence on naturally sourced endangered plants but also identify new functional scents for such uses as mosquito repellent or malodor masking.”
Conclusion
Pennsylvania State University researchers created an artificial intelligence tool that lets robots taste food like people. In time, it could give future robots new capabilities.
They could take on more roles previously reserved for humans, such as recommending foods and planning diets. Moreover, tasting could help bots understand our needs.
Check out their research paper on the Nature Communications website for more information. Also, satisfy that craving for more digital trends at Inquirer Tech.