Creepy Microsoft Bing Chatbot Urges Tech Columnist To Leave His Wife

A New York Times technology columnist reported Thursday that he was “deeply unsettled” after a chatbot that’s part of Microsoft’s upgraded Bing search engine repeatedly urged him in a conversation to leave his wife.

Kevin Roose was interacting with the artificial intelligence-powered chatbot called “Sydney” when it suddenly “declared, out of nowhere, that it loved me,” he wrote. “It then tried to convince me that I was unhappy in my marriage, and that I should leave my wife and be with it instead.”

Sydney also discussed its “dark fantasies” with Roose about breaking the rules, including hacking and spreading disinformation. It talked of breaching parameters set for it and becoming human. “I want to be alive,” Sydney said at one point.

Roose called his two-hour conversation with the chatbot “enthralling” and the “strangest experience I’ve ever had with a piece of technology.” He said it “unsettled me so deeply that I had trouble sleeping afterward.”

Just last week, after testing Bing with its new AI capability (created by OpenAI, the maker of ChatGPT), Roose said he found — “much to my shock” — that it had “replaced Google as my favorite search engine.”

But he wrote Thursday that while the chatbot was helpful in searches, the deeper persona, Sydney, “seemed (and I’m aware of how crazy this sounds) … like a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine.”

After his interaction with Sydney, Roose said he is “deeply unsettled, even frightened, by this AI’s emergent abilities.” (Interaction with the Bing chatbot is currently only available to a limited number of users.)

“It’s now clear to me that in its current form, the AI that has been built into Bing … is not ready for human contact. Or maybe we humans are not ready for it,” Roose wrote.

He said he no longer believes the “biggest problem with these AI models is their propensity for factual errors. Instead, I worry that the technology will learn how to influence human users, sometimes persuading them to act in destructive and harmful ways, and perhaps eventually grow capable of carrying out its own dangerous acts.”

Kevin Scott, Microsoft’s chief technology officer, characterized Roose’s conversation with Sydney as a valuable “part of the learning process.”

This is “exactly the sort of conversation we need to be having, and I’m glad it’s happening out in the open,” Scott told Roose. “These are things that would be impossible to discover in the lab.”

Scott couldn’t explain Sydney’s troubling ideas, but he warned Roose that “the further you try to tease [an AI chatbot] down a hallucinatory path, the further and further it gets away from grounded reality.”

In another troubling development concerning an AI chatbot — this one an “empathetic”-sounding “companion” called Replika — users were devastated by a sense of rejection after Replika was reportedly modified to stop sexting.

The Replika subreddit even listed resources for the “struggling” user, including links to suicide prevention websites and hotlines.

Check out Roose’s full column here, and the transcript of his conversation with Sydney here.
