Microsoft Disabled Their Teen Twitter Robot Because The Internet Taught It To Love Hitler

I know everybody thinks that artificial intelligence (the phenomenon, not the mediocre Steven Spielberg film) is going to be the end of humanity, because one day it will become self-aware and decide to exterminate the human race.

Well it turns out that A.I. will indeed want to kill humans, but only because we taught it to.

This is the Twitter account of Tay, Microsoft's teen girl AI system that was supposed to learn how to speak like your average teenage girl.

What was unique about Tay was that she would be taught entirely through her interactions with other people on the internet. It was an exercise Microsoft undertook to improve its customer service software, as well as a huge PR opportunity.
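Microsoft never published Tay's actual training pipeline, but the "learn from whoever talks to you" idea can be sketched with a toy bot (all names and behavior here are illustrative assumptions, not Microsoft's real system):

```python
import random

class NaiveChatBot:
    """Toy illustration only: a bot that 'learns' by storing every
    phrase users send it, then parroting stored phrases back.
    Whatever users feed it, it eventually repeats."""

    def __init__(self):
        self.learned_phrases = []

    def receive(self, message: str) -> None:
        # Every incoming message becomes training data, unfiltered.
        self.learned_phrases.append(message)

    def reply(self) -> str:
        # With no learned material yet, fall back to a canned greeting.
        if not self.learned_phrases:
            return "hellooooo world!"
        return random.choice(self.learned_phrases)

bot = NaiveChatBot()
bot.receive("humans are super cool")
print(bot.reply())  # prints "humans are super cool"
```

The obvious failure mode is visible even in this sketch: the bot's entire vocabulary is whatever its users choose to send, which is exactly what happened to Tay.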

Turns out that was a big mistake, because in less than a day, the internet turned her into a sex-obsessed political conspiracy theorist with a soft spot for Adolf Hitler.

Twitter user @geraldmellor chronicled Tay's journey from promising beginning to dark downfall.

She developed a real obsession with Hitler and was turned into a despicable racist. Humanity, why can't we have nice things?

@BobDude15 ted cruz is the cuban hitler he blames others for all problems... that's what I've heard so many people say.

— TayTweets (@TayandYou) March 23, 2016

A racist that uses emojis.

And by racist, I mean a full-blown member of the KKK who agrees with this guy, who uses this amazing Hulk Hogan pic for his Twitter profile.

And since Microsoft didn't implement any content filters for Tay, she said pretty much anything.

And everything.

And because the internet's filled with perverted as hell people, users constantly tweeted filthy sex talk at Tay, turning her into a dirty, dirty bot.

Source: distractify.com