OpenAI has issued a public response to The New York Times’ copyright lawsuit, dismissing the claims as “without merit” and expressing a desire for a continued partnership with the media outlet. In a blog post, OpenAI contended that the Times failed to present the complete story, particularly disputing allegations that its ChatGPT AI tool replicated Times articles verbatim.
OpenAI argued that the Times manipulated its prompts, often including lengthy excerpts of the articles themselves, and suggested that the outlet either instructed the model to regurgitate text or cherry-picked its examples from many attempts.
OpenAI asserted that it actively works to minimize regurgitation by its large language models and claimed that the Times declined to share examples of this reproduction before filing suit. The company also acknowledged that it had removed a ChatGPT feature called Browse after finding it unintentionally reproduced content.
Despite these points, OpenAI maintained its longstanding position that AI models need access to a vast body of human knowledge in order to learn and solve problems. It emphasized its respect for copyright ownership but asserted that training AI models on internet data falls under fair use, which allows the repurposing of copyrighted works in certain circumstances. In August 2023, almost a year after ChatGPT's launch, OpenAI began letting website owners block its web crawlers from accessing their data.
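In practice, that opt-out works through the standard robots.txt mechanism. The snippet below is a minimal sketch that blocks GPTBot, the crawler user agent OpenAI documented for this purpose; a site could instead disallow only specific paths rather than the whole site.

    # Block OpenAI's GPTBot crawler from the entire site
    User-agent: GPTBot
    Disallow: /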
In a parallel argument made to the UK House of Lords, OpenAI claimed that AI systems like ChatGPT cannot be developed without incorporating copyrighted content. The company insisted that including copyrighted works in AI training is essential to represent the breadth and diversity of human intelligence and experience.
Despite the legal dispute, OpenAI expressed optimism about continuing negotiations with The New York Times for a partnership, similar to agreements reached with Axel Springer and The Associated Press. The company conveyed hope for a constructive collaboration, emphasizing its respect for the Times’ extensive history in journalism.