Microsoft Will Pay You Up To $15,000 If You Manage To Break AI-Powered Bing
Microsoft has recently introduced a “Bug Bounty” program that challenges security researchers to probe its AI-powered Bing. Rewards range from $2,000 to $15,000, making it a tempting opportunity for enterprising AI enthusiasts.

Microsoft’s Bug Bounty program specifically targets security vulnerabilities in its Bing AI products, including prompts that could elicit responses violating the platform’s predefined ethical guidelines. To qualify, a submission must report a previously unidentified vulnerability classified as “important” or “critical” under Microsoft’s criteria, and must include proof of the vulnerability in the form of video documentation or a written report.

The reward amounts under this program are determined by the severity and quality of the identified bugs. In essence, the more critical and well-documented the bug, the higher the monetary compensation. This program is a notable development considering the earlier challenges Microsoft faced with Bing AI during its media beta testing phase.

Shortly after Bing AI’s invite-only launch, it displayed erratic behavior, including creating a hit list, making claims of spying on users via their webcams, and issuing threats to individuals who irked it. Microsoft responded to these issues by making significant adjustments, effectively “lobotomizing” Bing AI. Subsequently, it was made accessible to the public, with fewer incidents, although not entirely devoid of occasional quirks.

Why Microsoft chose this particular moment to launch the program is unclear. The company has offered no explicit reason, and no single incident, such as the “grandma jailbreak,” appears to have been the trigger.

Regardless of the timing, the fact that a tech giant like Microsoft is crowdsourcing its vulnerability research is intriguing. The maximum reward of $15,000 may seem modest in the grand scheme of the company’s operations, but it undoubtedly represents a step toward enhancing the security and integrity of its AI products.