Microsoft announced less than two weeks ago that it was implementing limits on its Bing chatbot after a string of bizarre user interactions, including one in which the bot said it wanted to steal nuclear secrets.
Microsoft has also admitted that its employees can read conversations between users and the Bing chatbot, raising concerns about how information provided to online AI systems is handled.
When Microsoft first launched its Bing Chat AI in February, many users found that it generated some rather odd and even very personal answers to their questions.
Previously, the chatbot was limited to Microsoft's Bing browser; this change will make it available to a broader set of users.