Mandatory Monitoring?

ECJ Facebook Ruling Called Potential Threat to Intermediary Liability Regime

Online platforms can be ordered to monitor for future postings of content previously found to be illegal, the European Court of Justice ruled Thursday in a case that could upend Europe's intermediary liability e-commerce directive. The high court said in Eva Glawischnig-Piesczek v. Facebook that EU member states can order hosting companies to remove or block access to information "identical" or "equivalent" to content already declared illegal, including worldwide. Along with platform liability, the ruling raises issues of conflicts of law, free speech and mandatory content monitoring, representatives of the tech industry, digital rights activists, the telecom sector and academia told us.

Glawischnig-Piesczek chaired the Austrian Greens party. In April 2016, a Facebook user shared on his personal page an article from an Austrian online news magazine. The share generated a thumbnail of the original site containing the title, a brief summary of the story and a photo of Glawischnig-Piesczek. The post was accompanied by the user's comments that the politician was a "lousy traitor of the people" and/or a "corrupt oaf" and/or a member of a "fascist party," which the Austrian courts deemed harmful to her reputation and defamatory. When Facebook didn't remove the comments, Glawischnig-Piesczek sued for an injunction in the Commercial Court in Vienna, which ordered the platform to stop publishing photographs of the claimant if the accompanying text contained the exact or equivalent information deemed defamatory. The case ended up in the Austrian Supreme Court, which asked the ECJ to clarify the relevant EU law.

The e-commerce directive doesn't prevent national courts from ordering hosting providers to take down or block access to stored information whose content is identical to that previously declared unlawful, the ECJ said. Nor does it bar courts from telling providers to remove information "equivalent" to that already declared unlawful, as long as monitoring of and search for the information are limited to information conveying a message essentially unchanged from the content that gave rise to the finding of illegality, and provided the differences in wording of the equivalent content, compared with the original, aren't such as to require hosting providers to independently assess the content. The high court said providers can also be ordered to remove the information globally "within the framework of the relevant international law."

The judgment "raises critical questions around freedom of expression and the role that internet companies should play in monitoring, interpreting and removing speech that might be illegal in any particular country," a Facebook spokesperson emailed. The ruling undermines the longstanding principle that one country doesn't have the right to impose its laws on speech on another country, Facebook said. It opens the door to forcing internet companies to proactively monitor content and then decide if it's "equivalent" to content found to be illegal. National courts will have to "set out very clear definitions on what 'identical' and 'equivalent' means in practice" to avoid a chilling effect on free speech, the spokesperson said.

"The territoriality issue is incredibly complex," emailed Winston Maxwell, Telecom Paris director-law and digital technology studies. Contrary to its approach on Google and the right to be forgotten (see 1909240004), the ECJ didn't clearly answer whether removal injunctions can have extraterritorial effect, saying instead that national courts must interpret international law. The right to be forgotten is "particularly tricky" because many countries don't recognize it and extraterritorial application would face conflict of law issues, but other kinds of illegal content, such as defamation, may encounter fewer conflicts, he noted. A message might be so manifestly defamatory that no country would recognize it as being protected by free speech, he said: "If that's the case, the risk of conflict with the law of other nations is low, and a national court might order an injunction with extraterritorial effect."

The judgment "could open the door for exploitative upload filters for all online content," said European Digital Rights Head of Policy Diego Naranjo. "Few hosting platforms, especially startups, will have the resources to implement elaborate monitoring systems, said Computer & Communications Industry Association Europe Senior Manager Victoria de Posson.

The judgment implies an automated system able to intercept illegal content and evaluate whether it's identical or equivalent, telecom consultant Innocenzo Genna blogged: That "huge development" could undermine the terms of the e-commerce directive's intermediary liability provisions. He said it will "influence the activity of the incoming European Commission," which will consider whether to revise the liability regime on online intermediaries and platforms.