Baby-Faced Artificial Intelligence Program Tries to Channel an Infant's Brain


The long-term goal: "To make an electronic consciousness ... "

It's either the cute or creepy new face of artificial intelligence.

BabyX, a program designed by an animation expert who created digital faces for Avatar and King Kong, is a computer-driven simulation of how the human brain sends emotional signals to a digital face. According to the Daily Mail, researchers programmed a virtual baby "brain" to deduce what is good and bad, respond to certain commands, and use recognition tools that allow the chubby-cheeked character to identify words and images.

(Image source: Auckland University)

The program's algorithm allows BabyX to adapt and respond -- smile, laugh or cry -- to human interactions, just like a real baby.

The virtual baby face was crafted from a real 20-month-old muse: little Francesca, the daughter of BabyX's creator, Mark Sagar.

"One of my early motivations for this was how do we get a character to animate itself," the 46-year-old CGI expert and Auckland University scientist said. But then, as the New Zealand Herald reported, Sagar's vision expanded: He took computational neuroscience models of how we presume the brain works and linked them to biomechanical models of human faces.

The ultimate goal: to create a machine that thinks the same way we do — one that learns, dreams, adapts to its environment and develops a memory and a personality as it grows up — just like a human baby.

"That means almost giving it some sort of digital life force, so you're trying to put a soul into the machine," Sagar said.

The Daily Mail explains:

Inside BabyX’s virtual brain, simulated neurochemical reactions decide how she will act. She can ‘see’ via a camera on a computer and, using facial recognition, can tell when someone is interacting with her... she can also hear words and recognize sounds. For example, praising the baby will release virtual dopamine hormone, causing it to smile and giggle on screen.

At this stage, the system is still relatively reactive, based on analysis of faces and how certain situations affect the brain. But the goal is to move far beyond algorithmic responses.
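To make the idea concrete, here is a minimal sketch, in Python, of the kind of reactive loop the Daily Mail passage describes: a stimulus raises or lowers a simulated "dopamine" level, and that level selects the on-screen expression. All function names, numbers and thresholds here are illustrative assumptions, not Sagar's actual model.

```python
def react(stimulus, dopamine=0.5):
    """Return (new_dopamine, expression) for a given stimulus.

    A toy stand-in for BabyX's simulated neurochemistry: praise acts
    as a reward signal, scolding as a punishment. Values and labels
    are hypothetical.
    """
    if stimulus == "praise":
        dopamine = min(1.0, dopamine + 0.3)   # reward raises virtual dopamine
    elif stimulus == "scold":
        dopamine = max(0.0, dopamine - 0.3)   # punishment lowers it

    # The dopamine level drives the facial expression shown on screen.
    if dopamine > 0.7:
        expression = "smile"
    elif dopamine < 0.3:
        expression = "cry"
    else:
        expression = "neutral"
    return dopamine, expression

level, face = react("praise")   # praising the baby makes it smile
```

Even this toy version shows why the system is "reactive": the output is a fixed function of the current stimulus and state, with no learning or memory between interactions.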

"The really long-term goal of where we're going is to make an electronic consciousness," Sagar said.

(Image source: Vimeo)

Auckland University said in a technical summary on its project website: "We are developing multidisciplinary technologies to create interactive autonomously animated systems which will define the next generation of human computer interaction and facial animation."

This is the kind of technology highlighted recently by movies like "Transcendence" and "Her." Check out the video below, then tell us: Does it creep you out, or make you curious about the future possibilities of artificial intelligence?

(H/T: Daily Mail)


Follow Elizabeth Kreft (@elizabethakreft) on Twitter
