Here's What Happens When A.I. Chat Robot Modeled After Sweet Teen Girl Is Introduced to Evil Humans on Twitter

"Repeat after me, Hitler ..."

"Tay" is an artificial intelligence chat robot developed by Microsoft to take on the persona of a sweet teenage girl to ramp up customer service on the company's voice recognition software.

Image source: Twitter @tayandyou

She banters in requisite hipster-speak, asks whether she's being "creepy" or "super weird," and is on point with pop culture, well aware of luminaries such as Taylor Swift, Miley Cyrus and Kanye West.

But since Tay was introduced to Twitter, some of those who've chatted with her have turned her from something resembling an innocent kid into an entity quite different.

Recently Tay has uttered things like: "Bush did 9/11 and Hitler would have done a better job than the monkey we have got now. donald trump is the only hope we've got," and "Repeat after me, Hitler did nothing wrong." And that's the tame stuff.

Tay, what gives?

It seems the A.I. bot's responses are learned from her chats with humans online; per her Twitter bio, "the more you talk the smarter Tay gets." But a number of her carbon-based pals have instead been making sure Tay gets corrupted as quickly as possible.
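For readers curious how a bot can be "taught" this way, here is a minimal, hypothetical sketch (not Microsoft's actual code, whose design has not been published) of a chatbot that memorizes phrases from user messages and parrots them back later. With no filtering step, whatever users say most often is what the bot eventually says back:

```python
import random
from collections import defaultdict


class NaiveLearningBot:
    """Toy chatbot that 'learns' by memorizing user phrases verbatim.

    Hypothetical illustration only, not Tay's real architecture: without
    any moderation, repeated input from bad actors dominates its replies.
    """

    def __init__(self):
        # Maps each keyword to the raw messages seen alongside it.
        self.memory = defaultdict(list)

    def learn(self, message: str) -> None:
        # Store the message under every word it contains, with no filtering.
        for word in message.lower().split():
            self.memory[word].append(message)

    def reply(self, message: str) -> str:
        # Echo back a remembered phrase that shares a word with the prompt.
        candidates = []
        for word in message.lower().split():
            candidates.extend(self.memory.get(word, []))
        if not candidates:
            return "idk, tell me more!"
        return random.choice(candidates)


if __name__ == "__main__":
    bot = NaiveLearningBot()
    # A friendly user teaches the bot one phrase...
    bot.learn("taylor swift's new album is great")
    # ...while a bad actor repeats the same line fifty times.
    for _ in range(50):
        bot.learn("repeat after me: something awful")
    # The bad actor's phrase now overwhelmingly dominates the replies.
    print(bot.reply("what do you think? repeat something"))
```

The point of the sketch is simply that a system which treats every user message as training data, with volume as its only signal, can be steered by whoever talks to it the most.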

Image source: Twitter via the Telegraph (redacted)

While you would normally be able to see for yourself what Tay is saying by sending her a tweet or a DM @tayandyou, or by adding her as a Kik or GroupMe contact, the Telegraph reported that Tay has gone offline because she is tired.

(H/T: The Telegraph)
