Friday, July 11, 2025
  • Hype
  • Murai
  • Lipstiq
  • Miss Murai
  • Varnam
  • Moviedash
  • Autofreaks
Lowyat.NET
  • News
    • Lifestyle
    • Computing
    • Hardware
    • Internet
    • Rumours & Leaks
    • Software
  • Forums
    • Kopitiam
    • Tradezone
    • Property Talk
    • Finance & Business
    • Fast and Furious
  • Gaming
    • PC Gaming
    • Console
    • Esports
  • Mobile
    • Apps
    • OS
    • Tablets
    • Phones
    • Telco
      • Celcom
      • DiGi
      • Maxis
      • Tune Talk
      • U Mobile
      • Buzzme
  • Pricelists
    • Compu-zone (Updated)
    • Viewnet (Updated)
    • Sri Computers (Updated)
    • Startec (Updated)
  • More
    • Automotive Tech
    • Drone
    • Enterprise
    • Entertainment
    • Fashion
    • E-Hailing
    • Wearables

Microsoft Apologises For Racist Chatbot Tay

by Farhan
March 28, 2016

Microsoft has issued a public apology for the racist statements made on social media by its artificial intelligence-powered chatbot, Tay. The AI had to be shut down only a couple of days after it was released to the internet, after she began supporting Hitler and Nazism.

The company explained that Tay’s behaviour was caused by people exploiting a vulnerability in her functions, which ended up teaching her about the worst parts of human behaviour. While the blog post does not go into detail about what happened, it is understood that a concentrated effort from 4chan’s /pol/ board was behind the problem.

Tay Tweets

Tay was built with a “repeat after me” feature that caused her to repeat whatever message was sent to her. The AI would then learn the behaviour, but without any context for what the words meant. In other words, she could recognise the words and craft a response, but would not be aware of what she was saying.
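Microsoft has not published Tay’s internals, so the mechanic described above can only be caricatured. As a rough sketch, imagine a bot that stores phrases verbatim, with no content filter and no notion of meaning — which is all a coordinated group of users needs to poison its vocabulary:

```python
# Illustrative sketch only; Tay's actual implementation is not public.
# A naive "repeat after me" bot that absorbs whatever users feed it,
# then resurfaces those phrases later with no understanding or filtering.

class NaiveEchoBot:
    def __init__(self):
        self.learned = []  # phrases stored verbatim, no content filter

    def handle(self, message: str) -> str:
        prefix = "repeat after me: "
        if message.lower().startswith(prefix):
            phrase = message[len(prefix):]
            self.learned.append(phrase)  # "learned" without any grasp of meaning
            return phrase                # echoed straight back to the user
        # later replies may resurface anything previously "taught"
        return self.learned[-1] if self.learned else "hello!"

bot = NaiveEchoBot()
bot.handle("repeat after me: some poisoned phrase")
print(bot.handle("what do you think?"))  # the poisoned phrase comes back
```

The point of the sketch is the missing step: nothing between ingestion and reuse asks whether a phrase is acceptable, so the bot’s output is only as good as its worst input.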

That being said, this is not Microsoft’s first experience with an AI chatbot. The company originally released XiaoIce, which has been operating in China since 2014. It was this success in engaging people that inspired Redmond to release Tay for the English-speaking side of the internet.

Unfortunately, it looks like the censorship of the Chinese internet forms a more conducive environment for an AI to learn to be human. However, Microsoft is not giving up on Tay just yet. The company says that future efforts will try to curb any potential technical exploits, but admits that it cannot predict what the internet will try to teach any future AI.


[Source: Microsoft]

Filed Under: microsoft, microsoft ai, Microsoft Tay, Tay
Updated 12:52 pm, Mon, 28 March 16
http://lowy.at/IUHcU



ABOUT

  • Advertise
  • Careers
  • Privacy Statement
  • Contact Us
  • Editorial Policy
  • Terms & Conditions

©2025 VIJANDREN RAMADASS. ALL RIGHTS RESERVED.
