Users have taken to social media to point out the striking similarity between Hollywood actress Scarlett Johansson and a new voice option for OpenAI’s AI chatbot, ChatGPT. The voice, known as “Sky,” was introduced as part of a recent upgrade intended to make ChatGPT’s responses more conversational. But worries were raised by the voice’s flirtatious quality and its resemblance to Johansson’s performance as Samantha in the 2013 movie Her.
Users were quick to draw parallels, with many finding the voice uncomfortably close to Samantha’s. This raised questions about whether OpenAI deliberately mimicked Johansson’s voice, especially considering CEO Sam Altman’s suggestive reference to Her on social media during the demonstration. OpenAI vehemently denied any attempt at imitation, stating that the voices underwent a rigorous five-month selection process involving professional voice actors, talent agencies, and industry advisors. The company emphasized its commitment to AI voices that avoid mimicking celebrities and instead prioritize a sense of timelessness and approachability.
This incident adds another layer to the ongoing debate about the ethical use of celebrity likenesses in AI development. In November, Johansson reportedly took legal action against an AI app that used her image in an advertisement without permission. OpenAI’s decision to pause the Sky voice suggests a growing awareness of these issues and a willingness to address user concerns.
However, the controversy extends beyond mere celebrity likeness. The portrayal of Sky as overtly flirty has drawn criticism for potentially reinforcing stereotypical gender roles in AI characters. Some users expressed concerns on social media platforms like X, formerly known as Twitter, questioning why the AI voice felt obligated to be obsequious and overly flirtatious.
Although not yet accessible to all ChatGPT users, the enhanced voice features will soon be made available to paying subscribers. This tiered access structure raises the prospect of a two-tiered AI experience, in which paying customers receive more “human-like” interactions, adding another layer of complexity to the conversation.
OpenAI now faces the challenge of balancing innovative voice technology with ethical considerations and user expectations. The “Sky” incident serves as a stark reminder of the importance of responsible development and transparency in the field of AI.