OpenAI, the company behind ChatGPT, is facing a lawsuit from two acclaimed authors, Mona Awad and Paul Tremblay, who allege that their copyrighted books were used without permission to train the language model underlying the chatbot. The lawsuit, filed in late June, argues that the detailed summaries ChatGPT produces of their books indicate that the works were included in its training datasets.
This legal action highlights the growing tension between creatives and generative AI tools, as concerns about the impact on careers and livelihoods intensify. Legal challenges related to copyright infringement by AI tools are expected to increase as these technologies advance and become more adept at replicating the styles of writers and artists.
Experts suggest that proving monetary damages incurred by the authors due to OpenAI’s data-collection practices may be challenging. While it is possible that ChatGPT “ingested” the authors’ books, it is also plausible that the data was sourced from alternative datasets. The vast amount of data scraped from the web makes it difficult to demonstrate how ChatGPT would have behaved differently if it hadn’t included the authors’ works.
The Authors Guild, an advocacy group supporting writers’ rights, published an open letter calling on Big Tech and AI companies to obtain permission from writers and provide fair compensation when using copyrighted works to train generative AI programs. The letter has received over 2,000 signatures.
The lawsuit by Awad and Tremblay was filed on the same day that OpenAI faced a separate legal complaint accusing the company of acquiring large amounts of personal data and incorporating it into ChatGPT. In their suit, filed in a federal district court in Northern California, the authors seek damages and restitution of profits.
OpenAI and Awad have not responded to requests for comment, while a representative for Tremblay declined to comment on the matter. The outcome of the lawsuit, and its implications for the use of copyrighted material in training generative AI models, remains to be seen.