
Microsoft's AI makes racist error and then publishes stories about it


Hey, at least Microsoft's news-curating artificial intelligence doesn't have an ego. That much was made clear today after the company's news app highlighted Microsoft's most recent racist failure.

The inciting incident for this entire debacle appears to be Microsoft's late May decision to fire some human editors and journalists responsible for MSN.com and have its AI curate and aggregate stories for the site instead. Following that move, The Guardian reported earlier today that Microsoft's AI confused two members of the pop band Little Mix, who both happen to be women of color, in a republished story originally reported by The Independent. Then, after being called out by band member Jade Thirlwall for the screwup, the AI published stories about its own failing.

So, to recap: Microsoft's AI made a racist error while aggregating another outlet's reporting, got called out for doing so, and then elevated the coverage of its own outing. Notably, this is after Microsoft's human employees were reportedly told to manually remove stories about the Little Mix incident from MSN.com.

Still with me?

"This shit happens to @leighannepinnock and I ALL THE TIME that it's become a running joke," Thirlwall reportedly wrote in an Instagram story, which is no longer visible on her account, about the incident. "It offends me that you couldn't differentiate the two women of colour out of four members of a group … DO BETTER!"

As of the time of this writing, a quick search on the Microsoft News app shows at least one such story remains.

Image: A story from T-Break Tech covering the AI's failings as it appears on the Microsoft News app. Credit: screenshot / Microsoft News app

Notably, Guardian editor Jim Waterson spotted several more examples before they were apparently pulled.

"Microsoft's artificial intelligence news app is now swamped with stories selected by the news robot about the news robot backfiring," he wrote on Twitter.

We reached out to Microsoft in an attempt to determine just what, exactly, the hell is going on over there. According to a company spokesperson, the problem is not one of AI gone wrong. No, of course not. It's not like machine learning has a long history of bias (oh, wait). Instead, the spokesperson insisted, the issue was simply that Microsoft's AI selected the wrong photo for the initial article in question.

"In testing a new feature to select an alternate image, rather than defaulting to the first photo, a different image on the page of the original article was paired with the headline of the piece," wrote the spokesperson in an email. "This made it erroneously appear as though the headline was a caption for the picture. As soon as we became aware of this issue, we immediately took action to resolve it, replaced the incorrect image and turned off this new feature."

Unfortunately, the spokesperson did not respond to our question about human Microsoft employees deleting coverage of the initial AI error from Microsoft's news platforms.

Microsoft has a troubled recent history when it comes to artificial intelligence and race. In 2016, the company released a social media chatbot dubbed Tay. In under a day, the chatbot began publishing racist statements. The company subsequently pulled Tay offline, attempted to release an updated version, and then had to pull it offline again.

As evidenced today by the ongoing debacle with its own news-curating AI, Microsoft still has some work to do — both in the artificial intelligence and not-being-racist departments.

Topics: Artificial Intelligence, Microsoft, Racial Justice
