Consumer Electronics Daily was a Warren News publication.
OpenAI ‘at Home’ in Ga.

Plaintiff Urges Atlanta Court to Deny OpenAI’s Motion to Dismiss His Defamation Complaint

Georgia courts have general jurisdiction over OpenAI, which “mistakenly argues” in its motion to dismiss plaintiff Mark Walters’ amended defamation complaint (see 2310160005) “that general jurisdiction only exists in the place of incorporation and the principal place of business,” said Walters’ opposition Monday (docket 1:23-cv-03122) in U.S. District Court for Northern Georgia in Atlanta. Walters, a nationally syndicated talk show host, alleges OpenAI’s ChatGPT service defamed him to a journalist.

OpenAI “has chosen to register to do business in Georgia as a foreign entity,” said Walters’ opposition. It also has a registered agent in Georgia and a registered agent address, it said. The Georgia Supreme Court has ruled that an entity that registers to do business in Georgia “is considered under Georgia law to be a resident of Georgia,” it said. OpenAI is thus “at home” in Georgia “for the purposes of general personal jurisdiction,” it said.

Walters “observes” that in its motion to dismiss, OpenAI “introduces a large amount of material” that’s outside the complaint, and therefore “outside the bounds of proper material for a 12(b)(6) motion,” said his opposition. OpenAI’s motion includes, for example, what it claims is a chat with its ChatGPT platform, it said. But Walters’ complaint doesn’t allege Walters ever had a chat with OpenAI, so its attempt to introduce such a chat “is way outside the pleadings” and shouldn’t be considered by the court, it said.

In all actions for defamation, “malice is inferred but may be rebutted to mitigate damages,” said Walters’ opposition. When defamation “is apparent from the writing itself,” a plaintiff “may recover without the necessity of pleading or proving special damages,” it said.

In the present case, OpenAI published to a journalist and ChatGPT user statements including allegations Walters “had committed various financial crimes and acts of dishonesty,” said his opposition. “They were therefore libelous per se,” it said: “Walters has adequately stated a claim for which relief may be granted under Georgia law.”

OpenAI argues its ChatGPT process “includes one or more disclaimers,” so the journalist “should have known the statements might not be true,” said Walters’ opposition. “There are multiple issues with this argument,” it said. Any of OpenAI’s disclaimers “are outside the pleadings or reasonably inferable from the pleadings,” so the court can’t consider them, it said. A disclaimer also doesn’t make “an otherwise libelous statement non-libelous,” it said.

Of OpenAI’s claims its ChatGPT statements to the journalist weren’t published, and therefore didn’t amount to defamation, OpenAI again “relies on matters outside the pleadings,” said Walters’ opposition. It argues its terms of use made clear that the journalist was the owner of the libelous material and if he republished the material, “he should inform his readers that he is responsible for the content of what he publishes,” it said.

It’s true that a republisher of libel can be responsible for what he republishes, it said. But that responsibility doesn’t “have the effect of negating the responsibility of the original publisher of the material,” in this case, OpenAI, said Walters’ opposition. All that’s required for publication “is communication of the libelous material to someone other than the subject,” it said. OpenAI’s ChatGPT statements to the journalist were communications to someone other than Walters, so the statements were published “for the purposes of Georgia law,” it said.

OpenAI “exhibited actual malice” toward Walters, supporting his defamation claims, said the opposition. In the defamatory material, OpenAI “made numerous false statements about Walters,” it said. It falsely claimed that Walters is the defendant in a named federal lawsuit, and documents in federal civil actions “are readily available to the public” via PACER, it said. OpenAI knew, “or had ready access to know,” that the civil action to which it referred “simply did not exist,” it said.