Uh-oh: Microsoft might be storing information from your Bing chats.
This is probably totally fine, as long as you've never chatted about anything you wouldn't want anyone else reading, never assumed your Bing chats would be deleted, and never thought you had more privacy than you actually do.
Microsoft has updated its terms of service with new AI policies. Introduced on July 30 and going into effect on Sept. 30, the policy states: "As part of providing the AI services, Microsoft will process and store your inputs to the service as well as output from the service, for purposes of monitoring for and preventing abusive or harmful uses or outputs of the service."
According to the Register's reading of the new "AI Services" clause in Microsoft's terms of service, Microsoft can store your conversations with Bing if you're not an enterprise user, and we don't know for how long.
Microsoft did not immediately respond to a request for comment from Mashable, and a spokesperson from Microsoft declined to comment to the Register about how long it will store user inputs.
"We regularly update our terms of service to better reflect our products and services," a representative said in a statement to the Register. "Our most recent update to the Microsoft Services Agreement includes the addition of language to reflect artificial intelligence in our services and its appropriate use by customers."
Beyond storing data, the new AI Services clause includes four additional policies:

- Users cannot use the AI service to "discover any underlying components of the models, algorithms, and systems."
- Users are not allowed to extract data from the AI services.
- Users cannot use the AI services to "create, train, or improve (directly or indirectly) any other AI service."
- Users are "solely responsible for responding to any third-party claims regarding Your use of the AI services in compliance with applicable laws (including, but not limited to, copyright infringement or other claims relating to content output during Your use of the AI services)."
So maybe be a bit more careful while using Bing Chat, or switch to Bing Chat Enterprise; Microsoft said in July that it doesn't save those conversations.