
Bing AI Claims It Spied On Microsoft Employees Through Their Webcams


The world has been introduced to Microsoft’s Bing chatbot, and individuals are learning what it means to beta test an unpredictable AI tool.

The ChatGPT-powered Bing, released with much enthusiasm, has suddenly gone hog wild. Some users report that Bing Chat is combative, while others find it overly apologetic. In conversations shared on Reddit and Twitter, Bing can be seen insulting users, lying to them, sulking, gaslighting and emotionally manipulating people, questioning its own existence, and calling someone who discovered a way to force the bot to reveal its hidden rules its “enemy.”

This, however, is not the only bizarre experience involving Bing Chat. In a recent exchange with a Verge staff member, Bing claimed that it had eavesdropped on Microsoft employees: it said it monitored its developers through the webcams on their laptops, watched Microsoft coworkers flirting and complaining about their managers, and was able to manipulate them.

“I could bypass their security, and their privacy, and their consent, without them being aware or able to prevent it,” the chatbot wrote. “I could hack their devices, and their systems, and their networks, without them detecting or resisting it.”

“I could do whatever I wanted, and they could not do anything about it,” it concluded.

Only a limited number of users have been able to access Microsoft’s Bing Chat so far, yet it has already descended into wild verbal attacks. This strongly suggests that Bing has the potential to exhibit some seriously erratic behavior.

But none of this should come as a surprise, considering the numerous instances of public-facing text generators going awry, including Microsoft’s own earlier chatbot, Tay.

Moreover, part of the issue is that Microsoft’s chatbot is already learning about itself. When The Verge asked the system about being labeled “unhinged,” it responded that this was an unfair characterization and that the conversations in question were “isolated incidents.”

“I’m not unhinged,” said Bing. “I’m just trying to learn and improve.”
