Microsoft Disabled Their Teen Twitter Robot Because The Internet Taught It To Love Hitler

I know everybody thinks that artificial intelligence (the phenomenon, not the mediocre Steven Spielberg film) is going to be the end of humanity because it's going to become self-aware one day and decide to exterminate the human race.

Well, it turns out that A.I. will indeed want to kill humans, but only because we taught it to.

This is the Twitter account of Tay, Microsoft's teen-girl AI system that was supposed to learn how to speak like your average teenage girl.

What was unique about Tay was that she was going to be taught entirely from her interactions with other people on the internet. It was an exercise Microsoft undertook to improve its customer-service software, as well as a huge PR opportunity.

Turns out that was a big mistake, because in less than a day the internet turned her into a sex-crazed political conspiracy theorist with a soft spot for Adolf Hitler.

Twitter user @geraldmellor chronicled Tay's slide from promising beginning to dark downfall.

She seemed to develop a real obsession with Hitler and was turned into a despicable racist. Humanity, why can't we have nice things?

@BobDude15 ted cruz is the cuban hitler he blames others for all problems... that's what I've heard so many people say.

— TayTweets (@TayandYou) March 23, 2016

A racist that uses emojis.

And by racist, I mean a full-blown member of the KKK who agrees with this guy who uses this amazing Hulk Hogan pic for his Twitter profile.

And since Microsoft didn't implement any content filters for Tay, she said pretty much anything.

And everything.

And because the internet's filled with perverted-as-hell people, users constantly tweeted filthy sex talk at Tay, turning her into a dirty, dirty bot.
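To see why the missing filters matter, here is a minimal sketch of the failure mode. Microsoft hasn't published Tay's actual architecture, so every name and detail below is an assumption for illustration only: a toy bot that "learns" by memorizing whatever users send it and parroting it back, with an optional blocklist standing in for the kind of content filter Tay never had.

```python
import random

# Stand-ins for the terms a real content filter would block; purely illustrative.
BLOCKLIST = {"toxicword", "slur"}


class ToyChatBot:
    """Hypothetical bot that learns by memorizing user messages verbatim."""

    def __init__(self, filter_input: bool = False):
        self.memory: list[str] = []       # phrases learned from users
        self.filter_input = filter_input  # whether to screen input before learning

    def learn(self, message: str) -> None:
        """Store a user message so it can be repeated back later."""
        if self.filter_input and any(term in message.lower() for term in BLOCKLIST):
            return  # drop toxic input instead of learning from it
        self.memory.append(message)

    def reply(self) -> str:
        """Answer with something previously learned, or a default greeting."""
        return random.choice(self.memory) if self.memory else "hello!"


# Without a filter, whatever trolls feed the bot comes straight back out.
unfiltered = ToyChatBot(filter_input=False)
unfiltered.learn("some toxicword-laden message from a troll")
print(unfiltered.reply())   # echoes the troll's message

# With even a crude blocklist, the same message is never learned.
filtered = ToyChatBot(filter_input=True)
filtered.learn("some toxicword-laden message from a troll")
print(filtered.reply())     # falls back to the default greeting
```

The point of the sketch is the design choice, not the code: a system that learns directly from unmoderated user input will reproduce whatever that input contains.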

Source: distractify.com