Victor Miller, 42, a resident of Cheyenne, Wyoming, has made headlines for filing paperwork to run for mayor. His campaign is unusual, though: instead of himself, he has proposed that a customized AI chatbot called VIC, short for "virtual integrated citizen," be the one listed on the ballot.
Miller says that VIC can make political decisions and run the city using OpenAI technologies. He thinks artificial intelligence (AI) can improve government. “AI has helped me in my life personally… such as helping me with my resume,” Miller shared with CNN. “I think it could add a layer to help a town. I want to see that happen.”
However, OpenAI quickly intervened, cutting off Miller's access to the tool he was using to engage and persuade voters. The company stated that using ChatGPT for political campaigning violates its policies. "We've taken action against these uses of our technology for violating our policies against political campaigning," an OpenAI spokesperson told CNN. The policy forbids any form of political campaigning or lobbying, including generating campaign materials aimed at particular demographic groups.
Miller's inspiration for VIC came from frustration over a denied records request: the city would not release records to him because his request was anonymous. He believed an AI with legal understanding could have handled the situation better. "If I were able to ask AI and interact with this new intelligence, it would have known the law, and I would have gotten the records," he said.
Officials from the state and local governments, however, have doubts. Wyoming Secretary of State Chuck Gray emphasized that a candidate must be a "qualified elector," a requirement an AI cannot meet. "Therefore, an AI bot is not a qualified elector," Gray stated, suggesting that VIC might be a facade for Miller's candidacy.
Even though the public-facing version of VIC has been blocked, Miller is still using the AI through his personal ChatGPT account. He intends to let voters engage with VIC at a nearby library, where they can ask it questions through its voice-to-text feature.
This is not an isolated case. OpenAI also took action against a UK candidate who used its models for political purposes. Through AI Steve, a chatbot running as a parliamentary candidate, Steve Endacott engaged voters and shaped policy positions based on their input. Endacott's website is still up and running, but it no longer uses ChatGPT.
Experts are cautious about using AI to replace human judgment in governance. University of Maryland professor Jen Golbeck advises against letting AI make automated decisions. “AI has always been designed for decision support – it gives some data to help a human make decisions but is not set up to make decisions by itself,” she noted.
George Washington University assistant professor David Karpf agreed, calling AI political candidates a “gimmick” that shouldn’t be taken seriously. “ChatGPT is not qualified to run your government,” Karpf remarked, emphasizing that formal legislation around AI candidacies might be unnecessary given their lack of viability.
Although AI in politics opens up intriguing possibilities, Golbeck and Karpf agree that humans should remain in charge of decision-making. Despite the skepticism, Miller remains optimistic about the prospects of AI candidates. "I think this can expand beyond the mayor and Parliament and [reach the whole] world," he said, hoping his efforts inspire broader adoption of AI in political roles.