The New York Times and the Daily News are suing OpenAI and its investor Microsoft, alleging that ChatGPT was trained on their copyrighted material. It has now emerged that evidence gathered during the publishers' lawyers' inspection of the training data was accidentally deleted by OpenAI engineers last week.
Potential evidence from NY Times lawyers against OpenAI was deleted
Kyle Wiggers reports for TechCrunch:
Earlier this fall, OpenAI agreed to provide two virtual machines allowing counsel for The Times and Daily News to search for their copyrighted material within its AI training data… In a communication, attorneys for the publishers indicate that they and hired experts have dedicated over 150 hours since November 1 to investigating OpenAI’s training data.
However, on November 14, all search data from the publishers stored on one of the virtual machines was erased by OpenAI engineers, as per the letter filed in the U.S. District Court for the Southern District of New York on Wednesday.
The letter has been published online and is publicly accessible.
In other words, after the NY Times' lawyers had invested considerable effort searching ChatGPT's training data, their findings were wiped out by an OpenAI error.
The letter also notes that OpenAI managed to recover much of the data, but only in a format unusable for the case. As a result, it cannot be relied on against OpenAI, and the lawyers must restart their lengthy and costly investigation.
DMN’s Perspective
What training data the various AI companies actually used remains largely opaque. Not every publisher has the resources to take on major tech firms in court, and having your research accidentally wiped by OpenAI engineers only adds to the frustration. It's not a good look.
What are your thoughts on this matter? Share with us in the comments.