A research group led by Osaka University has developed technology that allows androids to dynamically express mood states such as “excited” or “sleepy” by synthesizing facial movements with overlapping decay waves.
The android in the photo looks realistic enough to be mistaken for a human, but seeing it in action can be somewhat unsettling. Such androids may show a variety of familiar facial expressions, such as smiling or frowning, but discerning a consistent emotional state behind these expressions can be difficult, leaving us unsure of what the android is truly feeling and creating a sense of anxiety.
Until now, a ‘patchwork method’ has been used to allow robots with many movable facial parts, such as androids, to display facial expressions over extended periods. This method involves preparing several pre-arranged motion scenarios in which unnatural facial movements are excluded, and switching between these scenarios as needed.
However, this poses practical challenges, including preparing complex action scenarios in advance, minimizing noticeable unnatural movements during transitions, and fine-tuning movements to subtly control the facial expressions conveyed.
In this study, the research team of lead author Hisashi Ishihara developed a dynamic facial expression synthesis technology using ‘waveform movement’, which represents the various gestures that make up facial movements, such as ‘breathing’, ‘blinking’, and ‘yawning’, as individual waves. These waves propagate to the relevant facial regions and overlap, creating complex facial movements in real time. This method eliminates the need to prepare complex and diverse motion data and avoids noticeable motion transitions.
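The idea of superposing per-gesture waves can be sketched in a few lines. The sketch below is a hypothetical illustration, not the published system: the gesture parameters, the decay envelopes, and the region-gating gains are all invented for the example, and each actuator position is simply the weighted sum of the decaying waves routed to its facial region.

```python
import math

# Hypothetical gesture waves: (frequency in Hz, amplitude, decay rate).
# All values are illustrative assumptions, not from the Osaka system.
GESTURES = {
    "breathing": (0.25, 0.30, 0.05),
    "blinking":  (0.40, 1.00, 0.20),
    "yawning":   (0.05, 0.60, 0.10),
}

# How strongly each wave propagates to each facial region (made-up gains).
REGION_GAINS = {
    "eyelid": {"breathing": 0.1, "blinking": 1.0, "yawning": 0.4},
    "jaw":    {"breathing": 0.2, "blinking": 0.0, "yawning": 1.0},
}

def actuator_position(region: str, t: float) -> float:
    """Superpose all decaying gesture waves for one facial region at time t."""
    pos = 0.0
    for name, (freq, amp, decay) in GESTURES.items():
        wave = amp * math.exp(-decay * t) * math.sin(2 * math.pi * freq * t)
        pos += REGION_GAINS[region][name] * wave
    return pos

# The whole face at time t is each region evaluated independently, so no
# pre-scripted scenario or transition handling is needed.
frame = {region: actuator_position(region, t=1.0) for region in REGION_GAINS}
```

Because every frame is computed directly from the overlapping waves, switching or blending between stored scenarios never occurs, which is what removes the visible transitions of the patchwork method.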
In addition, by introducing ‘waveform modulation’, which adjusts individual waveforms according to the robot’s internal state, changes in internal conditions such as mood can be immediately reflected as changes in facial movements.
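One way to picture ‘waveform modulation’ is a scalar mood value that rescales each wave's parameters before superposition. The mapping below (mood in [-1, 1], where -1 is sleepy and +1 is excited, scaling frequency and amplitude linearly) is an assumption made for illustration; the paper's actual modulation rules are not described here.

```python
import math

def modulated_wave(freq: float, amp: float, mood: float, t: float) -> float:
    """One gesture wave whose speed and size track the current mood.

    mood: hypothetical internal-state scalar in [-1, 1]
          (-1 = sleepy, +1 = excited); the scaling factors are invented.
    """
    freq_mod = freq * (1.0 + 0.5 * mood)   # excited -> faster movement
    amp_mod = amp * (1.0 + 0.3 * mood)     # excited -> larger movement
    return amp_mod * math.sin(2 * math.pi * freq_mod * t)

# Because mood enters the wave parameters directly, a change in internal
# state appears in the very next frame, with no scenario switching.
sleepy = modulated_wave(freq=0.25, amp=0.3, mood=-1.0, t=0.5)
excited = modulated_wave(freq=0.25, amp=0.3, mood=+1.0, t=0.5)
```

The design point this illustrates is that the internal state is an input to the wave generators themselves, so mood changes are reflected continuously rather than at scenario boundaries.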
“Advancing this research in dynamic facial expression synthesis will enable robots capable of complex facial movements to display more lifelike facial expressions and convey mood changes in response to their surroundings, including interactions with humans,” said senior author Koichi Osuka. “This could greatly enrich emotional communication between humans and robots.”
Ishihara added, “If we further develop a system that reflects internal emotions in every detail of an android’s behavior rather than just making superficial movements, we may be able to create an android that is recognized as having a heart.”
By implementing the ability to adaptively control and express emotions, this technology is expected to dramatically increase the value of communication robots, enabling them to exchange information with humans in a more natural and humane manner.