Microsoft staff can read Bing chatbot messages

Intended to address 'inappropriate behaviour'

Microsoft has updated its privacy policy to make clear it can gather and analyse user interactions with chatbots


Microsoft has admitted that its employees can read conversations between users and the Bing chatbot, raising concerns about the handling of information provided to online AI systems.

As reported by The Telegraph, Microsoft's human reviewers are tasked with monitoring user submissions to the chatbot as a means of addressing instances of "inappropriate behaviour."

However, the company maintains that Bing data is safeguarded by removing any personal information. Chat message access is also limited to specific employees.

"To effectively respond to and monitor inappropriate behaviour, we employ both automated and manual reviews of prompts shared with Bing," a Microsoft spokesperson told The Telegraph.

"This is a common practice in search and is disclosed in Microsoft's privacy statement."

The spokesperson also discussed the ways Microsoft safeguards user privacy, such as pseudonymisation, encryption at rest, secure and approved data access management, and data retention protocols.

"In all cases access to user data is limited to Microsoft employees with a verified business need only, and not with any third parties."

Microsoft updated its privacy policy last week to state that it can gather and analyse user interactions with chatbots.

The company also included two additional notes in its privacy statement to make it clear that data produced by bots is collected and subject to human processing.

Microsoft's release of Bing chat last month generated a wave of excitement, with promises to challenge Google's search dominance through its AI chatbot.

However, in recent days, the service has been restricted due to unusual interactions reported by testers, such as the bot confessing violent fantasies or expressing love for humans.

Bing now shuts down chats after receiving too many prompts and declines to answer questions about its emotions.

The field of AI and machine learning has been a significant focus of investment for Microsoft.

Chatbots are seen as a crucial area of advancement and are gaining popularity as a means of engaging with customers and offering support for everyday activities.

Chatbots' ability to generate immediate, direct responses to questions is particularly attractive. Instead of having to read multiple webpages for an answer, or wait for email or phone support, chatbots reply to inquiries instantly.

ChatGPT goes down worldwide

On Monday, a worldwide outage of OpenAI's ChatGPT, which is built on the same underlying technology as the Bing chatbot, broke the 'instant replies' paradigm.

Users attempting to access ChatGPT were presented with an error message stating, "The origin web server timed out responding to this request."

According to Down Detector, there was a surge of user reports around 8:30 Pacific Time (16:30 GMT), as well as another spike around 11:00 (19:00 GMT).

Throughout this period, OpenAI was testing fixes that allowed some users to access the AI chatbot.

The outage affected only ChatGPT. OpenAI's API and research websites remained operational, providing support for ChatGPT alternatives.

ChatGPT was restored after being offline for more than three hours.

OpenAI began by reinstating service for ChatGPT Plus subscribers, before extending it to free users.

This is the second significant outage for ChatGPT in the past week, with a comparable problem reported on 21st February.

During that time, the chatbot was unavailable for more than four hours before services resumed.