© 2024 Blaze Media LLC. All rights reserved.
Baby-Faced Artificial Intelligence Program Tries to Channel an Infant's Brain
August 27, 2014
The long-term goal: "To make an electronic consciousness ..."
It's either the cute or creepy new face of artificial intelligence.
BabyX, a program designed by an animation expert who created digital faces for "Avatar" and "King Kong," is a computer-driven simulation of how the human brain sends emotional signals to a digital face. According to the Daily Mail, researchers programmed a virtual baby "brain" to deduce what is good and bad; respond to certain commands; and use recognition tools allowing the chubby-cheeked simulation to identify words and images.
Image source: Auckland University
The program's algorithm allows BabyX to adapt and respond to human interactions by smiling, laughing or crying, just like a real baby.
The virtual baby face was crafted from a real 20-month-old muse: little Francesca, the daughter of BabyX's creator, Mark Sagar.
"One of my early motivations for this was how do we get a character to animate itself," the 46-year-old CGI expert and Auckland University scientist said. But then, as the New Zealand Herald reported, Sagar's vision expanded: He took computational neuroscience models of how we presume the brain works and linked them to biomechanical models of human faces.
The ultimate goal: to create a machine that thinks the same way we do — one that learns, dreams, adapts to its environment and develops a memory and a personality as it grows up — just like a human baby.
"That means almost giving it some sort of digital life force, so you're trying to put a soul into the machine," Sagar said.
The Daily Mail explains:
Inside BabyX’s virtual brain, simulated neurochemical reactions decide how she will act. She can ‘see’ via a camera on a computer and, using facial recognition, can tell when someone is interacting with her... she can also hear words and recognize sounds. For example, praising the baby will release virtual dopamine hormone, causing it to smile and giggle on screen.
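The feedback loop the Daily Mail describes, in which praise releases virtual dopamine that triggers a smile or a giggle, can be sketched in a few lines of Python. This is purely illustrative: the class, the dopamine values, and the thresholds below are assumptions for the sake of the example, not Sagar's actual model.

```python
# Illustrative sketch of a reward-driven response loop, loosely based on
# the description of BabyX's simulated neurochemistry. All names, values,
# and thresholds here are hypothetical.

class VirtualBaby:
    def __init__(self):
        self.dopamine = 0.0  # simulated "reward" level, kept in [0, 1]

    def hear(self, utterance):
        """React to speech: praise raises dopamine, other input lets it decay."""
        if "good" in utterance.lower():
            self.dopamine += 0.5   # praise releases virtual dopamine
        else:
            self.dopamine -= 0.2   # no praise: the level decays
        self.dopamine = max(0.0, min(1.0, self.dopamine))
        return self.expression()

    def expression(self):
        """Map the internal reward level to a facial response."""
        if self.dopamine > 0.6:
            return "giggle"
        if self.dopamine > 0.3:
            return "smile"
        return "neutral"

baby = VirtualBaby()
print(baby.hear("Good girl!"))   # first praise lifts the level: "smile"
print(baby.hear("Good job!"))    # repeated praise lifts it further: "giggle"
```

The real system, of course, drives a photorealistic animated face from these internal signals rather than printing labels, and its neurochemical model is far richer than a single scalar.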
At this stage, the system is still relatively reactive, based on analysis of faces and how certain situations affect the brain. But the goal is to move far beyond algorithmic responses.
"The really long-term goal of where we're going is to make an electronic consciousness," Sagar said.
Auckland University said in a technical summary on its project website: "We are developing multidisciplinary technologies to create interactive autonomously animated systems which will define the next generation of human computer interaction and facial animation."
This is the kind of technology highlighted recently by movies like "Transcendence" and "Her." Check out the video below, then tell us: Does it creep you out, or does it make you curious about the possibilities of future artificial intelligence?
(H/T: Daily Mail)
--
Follow Elizabeth Kreft (@elizabethakreft) on Twitter