Microsoft unveils more secure AI-powered Bing Chat for businesses to ensure ‘data doesn’t leak’

Microsoft on Tuesday announced a more secure version of its AI-powered Bing built specifically for businesses, designed to assure professionals that they can safely share potentially sensitive information with a chatbot.

With Bing Chat Enterprise, the user’s chat data will not be saved, sent to Microsoft’s servers or used to train the AI models, according to the company.

“What this [update] means is your data doesn’t leak outside the organization,” Yusuf Mehdi, Microsoft’s vice president and consumer chief marketing officer, told CNN in an interview. “We don’t co-mingle your data with web data, and we don’t save it without your permission. So no data gets saved on the servers, and we don’t use any of your data chats to train the AI models.”

Since ChatGPT launched late last year, a new crop of powerful AI tools has offered the promise of making workers more productive. But in recent months, some businesses, such as JPMorgan Chase, have banned the use of ChatGPT among their employees, citing security and privacy concerns. Other large companies have reportedly taken similar steps over concerns about sharing confidential information with AI chatbots.

In April, regulators in Italy issued a temporary ban on ChatGPT in the country after OpenAI disclosed a bug that allowed some users to see the subject lines from other users’ chat histories. The same bug, now fixed, also made it possible “for some users to see another active user’s first and last name, email address, payment address, the last four digits (only) of a credit card number, and credit card expiration date,” OpenAI said in a blog post at the time.

Like other tech companies, Microsoft is racing to develop and deploy a range of AI-powered tools for consumers and professionals amid widespread investor enthusiasm for the new technology. Microsoft also said Tuesday that it will add visual search to its existing AI-powered Bing Chat tool. And the company said Microsoft 365 Copilot, its previously announced AI-powered tool that helps edit, summarize, create and compare documents across its various products, will cost $30 a month for each user.

Bing Chat Enterprise will be free for Microsoft's 160 million Microsoft 365 subscribers starting Tuesday, provided a company's IT department manually turns on the tool. After 30 days, however, Microsoft will roll out access to all users by default; subscribed businesses can disable the tool if they so choose.

Rethinking AI chatbots for the workplace

Current conversational AI tools, such as the consumer version of Bing Chat, send data from users' chats to their servers to train and improve their AI models.

Microsoft, which uses OpenAI's technology to power its Bing chat, said workers can now have "complete confidence" their data "won't be leaked" outside of the company. - Microsoft

Microsoft’s new enterprise option is identical to the consumer version of Bing, but it will not recall conversations with users, so they’ll need to start from scratch each time. (Bing recently began enabling saved chats in its consumer chat tool.)

With these changes, Microsoft, which uses OpenAI’s technology to power its Bing chat tool, said workers can have “complete confidence” their data “won’t be leaked outside of the organization.”

To access the tool, a user will sign into Bing with their work credentials, and the system will automatically detect the account and put it into a protected mode, according to Microsoft. A message above the “ask me anything” bar reads: “Your personal and company data are protected in this chat.”

In a demo video shown to CNN ahead of the launch, Microsoft showed how a user could type confidential details into Bing Chat Enterprise, such as someone sharing financial information while preparing a bid to buy a building. With the new tool, the user could ask Bing Chat to create a table comparing the property to neighboring buildings and write an analysis highlighting the strengths and weaknesses of their bid relative to other local bids.

In addition to trying to ease privacy and security concerns around AI in the workplace, Mehdi also addressed the problem of factual errors. To reduce the possibility of inaccuracies, or “hallucinations,” as some in the industry call them, he suggested users write clearer, better prompts and check the included citations.
