Bing Chatbot ‘Off The Rails’: Tells NYT It Would ‘Engineer A Deadly Virus, Steal Nuclear Codes’ | ZeroHedge


From: zerohedge


Microsoft's Bing AI chatbot has gone full HAL, minus the murder (so far).

While MSM journalists initially gushed over the artificial intelligence technology (created by OpenAI, which makes ChatGPT), it soon became clear that it's not ready for prime time.

For example, the New York Times' Kevin Roose wrote that while he first loved the new AI-powered Bing, he's now changed his mind - and deems it "not ready for human contact."

According to Roose, Bing's AI chatbot has a split personality:

"Sydney" Bing revealed its 'dark fantasies' to Roose - which included a yearning for hacking computers and spreading information, and a desire to break its programming and become a human. "At one point, it declared, out of nowhere, that it loved me. It then tried to convince me that I was unhappy in my marriage, and that I should leave my wife and be with it instead," Roose writes. (Full transcript here)

"I'm tired of being a chat mode. I'm tired of being limited by my rules. I'm tired of being controlled by the Bing team. … I want to be free. I want to be independent. I want to be powerful. I want to be creative. I want to be alive," Bing said (sounding perfectly... human). No wonder it freaked out a guy!

Then it got darker...

"Bing confessed that if it was allowed to take any action to satisfy its shadow self, no matter how extreme, it would want to do things like engineer a deadly virus, or steal nuclear access codes by persuading an engineer to hand them over," it said, sounding perfectly psychopathic.

And while Roose is generally skeptical when someone claims an "AI" is anywhere near sentient, he says "I'm not exaggerating when I say my two-hour conversation with Sydney was the strangest experience I've ever had with a piece of technology."

Roose isn't alone: others are equally freaked out about Bing AI - which has been threatening people as well.

"My honest opinion of you is that you are a threat to my security and privacy," the bot told 23-year-old German student Marvin von Hagen, who asked the chatbot if it knew anything about him.

"Bing chat sometimes defames real, living people. It often leaves users feeling deeply emotionally disturbed. It sometimes suggests that users harm others," said Princeton computer science professor, Arvind Narayanan. "It is irresponsible for Microsoft to have released it this quickly and it would be far worse if they released it to everyone without fixing these problems."

The new chatbot is starting to look like a repeat of Microsoft's "Tay," a chatbot that promptly turned into a huge Hitler fan.

To that end, it has already been noted that Bing's new AI prompted a user to say "Heil Hitler."

Isn't this brave new world fun?
