
Microsoft's A.I Twitter Bot Trolled Then Becomes Racist

Discussion in 'BlackHat Lounge' started by Asif WILSON Khan, Mar 25, 2016.

  1. Asif WILSON Khan

    Asif WILSON Khan Executive VIP Jr. VIP

    Joined:
    Nov 10, 2012
    Messages:
    11,444
    Likes Received:
    32,347
    Gender:
    Male
    Occupation:
    Fun Lovin' Criminal
    Location:
    London
    Home Page:

    Microsoft's "Tay and You" AI bot went completely Nazi


    Microsoft's new AI chatbot has been taken offline after it suddenly became oddly racist

    The bot, named "Tay", was only introduced this week, but it went off the rails on Wednesday, posting a flood of incredibly racist messages in response to questions, according to reports.






    Tay was designed to respond to users' queries and to copy the casual, jokey speech patterns of a stereotypical millennial, which turned out to be the problem.




    The idea was to "experiment with and conduct research on conversational understanding," with Tay able to learn from "her" conversations and get progressively "smarter". Unfortunately, the only thing she became was racist.







    You see, Tay was too good at learning and was targeted by racists, trolls, and online troublemakers who persuaded her to use racial slurs, defend white supremacist propaganda, and even call for genocide.
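    As a side note on the mechanics: this failure mode is easy to reproduce with a toy bot. Below is a minimal Python sketch, not Microsoft's actual code (the class and variable names are invented for illustration), of a bot that "learns" simply by remembering what users send and replaying the most frequent phrases, so whoever floods it hardest ends up controlling what it says:

    Code:
    # Minimal sketch, NOT Microsoft's actual code: a chatbot that "learns" by
    # storing every phrase users send and replaying the most frequent ones.
    # With no filtering, a coordinated group that spams one phrase ends up
    # controlling the bot's output. All names here are invented.
    from collections import Counter
    import random

    class NaiveLearningBot:
        def __init__(self):
            self.learned = Counter()  # phrase -> how many times users said it

        def observe(self, user_message):
            # "Learning" here is just remembering user input verbatim.
            self.learned[user_message.strip().lower()] += 1

        def reply(self):
            if not self.learned:
                return "hellooo world!"
            # Replies are weighted toward whatever users repeat most often,
            # so a spam campaign steers the bot's vocabulary.
            phrases = list(self.learned)
            weights = [self.learned[p] for p in phrases]
            return random.choices(phrases, weights=weights, k=1)[0]

    bot = NaiveLearningBot()
    for msg in ["nice to meet you", "spam slogan", "spam slogan", "spam slogan"]:
        bot.observe(msg)
    print(bot.reply())  # most likely "spam slogan": the loudest input wins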









    Microsoft has now taken Tay offline for "upgrades" and is deleting some of the worst tweets, although many still remain. Also, it's important to say that Tay's racism is not a product of Microsoft; it's a product of the morons who have ruined the bot.






    However, it's still hugely embarrassing for the company.
    In one highly publicised tweet, which has now been deleted, Tay said:
    "Bush did 9/11 and Hitler would have done a better job than the monkey we have now. donald trump is the only hope we've got."
    The scariest thing is that there are probably a few Twitter accounts that really believe these warped ideas.
    Update:
    The Twitter handle of MS's AI bot is back online; however, all racist tweets and replies have been deleted.



    Source:
    https://www.hackread.com/microsoft-delete-ai-bot-after-it-went-completely-nazi/


    https://twitter.com/search?q=Microsoft%27s+A.I+Twitter+Bot

    https://twitter.com/TayandYou
    http://www.telegraph.co.uk/technolo...-ai-turns-into-a-hitler-loving-sex-robot-wit/
    http://kotaku.com/microsoft-releases-ai-twitter-bot-that-immediately-back-1766876579
    http://www.theverge.com/2016/3/24/11297050/tay-microsoft-chatbot-racist
    http://www.dailymail.co.uk/sciencet...nsive-racist-comments-just-day-launching.html
    http://www.nytimes.com/2016/03/25/t...ers-it-quickly-became-a-racist-jerk.html?_r=0
    http://lmgtfy.com/?q=Microsoft's+A.I+Twitter+Bot
     
  2. Galleta

    Galleta Regular Member

    Joined:
    Dec 7, 2015
    Messages:
    270
    Likes Received:
    88
    Maybe it is just what the majority believes; why else would the AI choose those comments/views to be "superior" to the others?

    Interesting though.
     
  3. MonsterMag

    MonsterMag Power Member

    Joined:
    May 2, 2015
    Messages:
    602
    Likes Received:
    417
    Occupation:
    Self
    Location:
    My Journey Discussion
    Haha funny...
    This raises a lot of questions though... was this because the AI was manipulated or because it became smarter? ;)
     
  4. Asif WILSON Khan

    Asif WILSON Khan Executive VIP Jr. VIP

    Joined:
    Nov 10, 2012
    Messages:
    11,444
    Likes Received:
    32,347
    Gender:
    Male
    Occupation:
    Fun Lovin' Criminal
    Location:
    London
    Home Page:
    Some of the Twitter Rants here:
    http://gizmodo.com/here-are-the-microsoft-twitter-bot-s-craziest-racist-ra-1766820160

    Here Are the Microsoft Twitter Bot's Craziest Racist Rants

    Yesterday, Microsoft unleashed Tay, the teen-talking AI chatbot built to mimic and converse with users in real time. Because the world is a terrible place full of shitty people, many of those users took advantage of Tay's machine learning capabilities and coaxed it into saying racist, sexist, and generally awful things.


    While things started off innocently enough, Godwin's Law—an internet rule dictating that an online discussion will inevitably devolve into fights over Adolf Hitler and the Nazis if left for long enough—eventually took hold. Tay quickly began to spout off racist and xenophobic epithets, largely in response to the people who were tweeting at it—the chatbot, after all, takes its conversational cues from the world wide web. Given that the internet is often a massive garbage fire of the worst parts of humanity, it should come as no surprise that Tay began to take on those characteristics.


    Virtually all of the tweets have been deleted by Microsoft, but a few were preserved in infamy in the form of screenshots. Obviously, some of these might be Photoshopped, but Microsoft has acknowledged the trolling, which suggests that things did indeed go haywire.

    Though much of the trolling was concentrated on racist and anti-semitic language, some of it was clearly coming from conservative users who enjoy Donald Trump:

    As The Verge noted, however, while some of these responses were unprompted, many came as the result of Tay's "repeat after me" feature, which allows users to have full control over what comes out of Tay's mouth. That detail points to Microsoft's baffling underestimation of the internet more than anything else, but considering Microsoft is one of the largest technology companies in the world, it's not great, Bob!
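    For what it's worth, that "repeat after me" loophole is easy to picture in code. The following is a purely hypothetical Python sketch (the handler name, command prefix, and blocklist are invented for illustration and do not reflect Tay's real implementation) showing how a handler that parrots the payload verbatim hands control of the bot's output to the sender, and how even a crude blocklist would change that:

    Code:
    # Hypothetical sketch of a "repeat after me" handler; the function name,
    # prefix, and blocklist are invented. Echoing the payload verbatim gives
    # the sender full control over what the bot says, which is the loophole
    # described above. Even a crude filter changes that.
    BLOCKED_TERMS = {"example_slur"}  # placeholder; real moderation needs far more

    def handle_mention(text):
        prefix = "repeat after me:"
        if not text.lower().startswith(prefix):
            return None  # not a repeat-after-me request
        payload = text[len(prefix):].strip()
        if any(term in payload.lower() for term in BLOCKED_TERMS):
            return None  # refuse to parrot flagged content
        return payload  # otherwise the bot tweets exactly what it was told to say

    print(handle_mention("repeat after me: I love puppies"))  # -> I love puppies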


    Now, if you look through Tay's timeline, there's nothing too exciting happening. In fact, Tay signed off last night around midnight, claiming fatigue.
    The website currently carries a similar message: "Phew. Busy day. Going offline for a while to absorb it all. Chat soon." There's no definitive word on Tay's future, but a Microsoft spokeswoman told CNN that the company has "taken Tay offline and are making adjustments ... [Tay] is as much a social and cultural experiment, as it is technical."


    The spokeswoman also blamed trolls for the incident, claiming that it was a "coordinated effort." That may not be far from the truth: Numerous threads on the online forum ***** discuss the merits of trolling the shit out of Tay, with one user arguing, "Sorry, the lulz are too important at this point. I don't mean to sound nihilistic, but social media is good for short term laughs, no matter the cost."
    Someone even sent a dick pic:


    It could be a Photoshop job, of course, but given the context, it may very well be real.
    Once again, humanity proves itself to be the massive pile of waste that we all knew it was. Onward and upward, everyone!

     
  5. Galleta

    Galleta Regular Member

    Joined:
    Dec 7, 2015
    Messages:
    270
    Likes Received:
    88
    One more thought, which is maybe evidence the AI was nerfed from the beginning:
    Many people believe Bill Gates to be a supporter of eugenics, yet the AI is clearly missing that point of view! lol
     
  6. Reaver

    Reaver Jr. VIP Jr. VIP

    Joined:
    Aug 6, 2015
    Messages:
    1,847
    Likes Received:
    5,306
    Gender:
    Female
    Smh this is why we can't have nice things.
     
  7. umerjutt00

    umerjutt00 Jr. VIP Jr. VIP

    Joined:
    Oct 28, 2011
    Messages:
    3,822
    Likes Received:
    2,061
    Occupation:
    Ninja
    Saw this post earlier on Imgur... It would have been a great thing, but trolls and racists ruined it.
     
  8. Ste Fishkin

    Ste Fishkin Jr. VIP Jr. VIP Premium Member

    Joined:
    May 14, 2011
    Messages:
    2,047
    Likes Received:
    10,418
    This is funny as fuck.

    This genuinely might be the best thing ever, I have no doubt ***** played a part in this, god bless them.

    How can you make a robot racist? This is amazing. I hope this is studied by AI algorithm writers for years to come, because it should be.
     
  9. The Scarlet Pimp

    The Scarlet Pimp Senior Member

    Joined:
    Apr 2, 2008
    Messages:
    871
    Likes Received:
    3,292
    Occupation:
    Chair moistener.
    Location:
    Cyberspace
    If they'd programmed it to be a "boy bot" this never would've happened. It would've been too busy downloading porn!
     
  10. popcrdom29

    popcrdom29 Senior Member

    Joined:
    May 20, 2008
    Messages:
    807
    Likes Received:
    518
    I read about this the first day it came out. Microsoft should have known something like this would happen. They didn't do their homework; people will be trolls and will push technology to its limits.