Senate authors of the Open App Markets Act made the bill worse by removing a “digital safety” clause that allowed platforms to address content moderation violations, the Computer & Communications Industry Association said in a statement Friday (see 2207190043). “Protecting digital safety is a justification companies could use for denying an app for violations of existing content moderation terms of service regarding hate speech, safety, and misinformation,” said CCIA. Its president, Matt Schruers, called the change a “deliberate effort to limit content moderation efforts that companies use to eliminate hate speech and misinformation to keep devices safe.”
Europe intends to put people at the heart of the digital transformation, officials said Thursday. European Commission President Ursula von der Leyen, European Parliament President Roberta Metsola and Prime Minister Petr Fiala of the Czech Republic, which currently holds the Council presidency, signed the European Declaration on Digital Rights and Principles proposed by the EC in January. They said the six rights and principles will mean affordable, high-speed connectivity for everyone everywhere; well-equipped classrooms and digitally skilled teachers; seamless access to online public services; a safe digital environment for children; the right to disconnect after working hours; access to easy-to-understand information on the environmental impact of digital products; and the ability to control how personal data is used and shared.
The U.S. ensures an adequate level of personal data protection for trans-Atlantic data flows, the European Commission concluded in a draft adequacy decision published Tuesday, saying it would resolve the concerns of the European Court of Justice in Schrems II. The decision will now be vetted by the European Data Protection Board, EU governments and the European Parliament. It would require U.S. companies to commit to complying with a detailed set of privacy conditions, such as deleting personal data when it's no longer needed for the purpose for which it was collected, and ensuring that privacy protection continues when personal information is shared with third parties, the EC said. EU citizens would have several avenues for redress if their data is mishandled. U.S. laws provide limitations and safeguards on access to data by public authorities, such as for criminal law enforcement and national security purposes, including new rules introduced by an executive order that addressed issues raised in Schrems II. European companies would be able to rely on these safeguards for data transfers as well as when using other mechanisms such as standard contractual clauses and binding corporate rules. Once the proposed regime has been vetted, the EC will finalize an adequacy decision that will be subject to periodic review. The Computer & Communications Industry Association cheered the development but warned "legal uncertainty will continue to persist for companies as long as today's draft decision has not been formally approved by EU Member States." It urged governments to "end the two-year impasse as soon as possible."
Facebook, Instagram and WhatsApp face more EU privacy fines. The European Data Protection Board issued binding dispute resolution decisions Tuesday in three cases involving the Meta companies. The decisions resolved differing opinions among various data protection authorities about whether the processing of personal data for the performance of a contract is a sound legal basis for behavioral advertising (Facebook and Instagram) or for service improvement (WhatsApp). It's now up to the Irish Data Protection Commission, which has lead jurisdiction, to adopt the EDPB decisions, after which they will be made public, the board said.
The FTC is seeking comments on a petition from some 20 groups advocating for a rule prohibiting internet services from using certain types of engagement-optimization practices on anyone under 18, the agency said Friday in a Federal Register notice. Comments are due Jan. 3 in docket FTC-2022-0073. The groups include the Center for Digital Democracy, Fairplay, Berkeley Media Studies Group, Center for Humane Technology, Children and Screens, Electronic Privacy Information Center and Public Citizen.
Meta violated EU privacy law by enabling automated "data scraping" of personal information, an Irish Data Protection Commission (DPC) investigation found. The inquiry, launched in 2021 after media reports that a collated dataset of Facebook personal data had been found on the internet, examined the Facebook search, Facebook Messenger contact importer and Instagram contact importer tools, focusing on processing Meta carried out between May 2018 and September 2019. The main issue was whether the company complied with the EU general data protection regulation's requirement for data protection by design and default, said a Monday news release. The decision, backed by all other EU data protection supervisory authorities, requires Meta to bring its personal data processing into compliance and to pay a $275 million (265 million euro) fine. A Meta spokesperson stressed the DPC didn't say the incident constituted a personal data breach, hack or security failing. Meta is cooperating fully and "made changes to our systems during the time in question, including removing the ability to scrape our features in this way using phone numbers," he said: The company is "reviewing this decision carefully."
Ireland’s Data Protection Commission is investigating TikTok’s data practices and data transfers to China, European Commission President Ursula von der Leyen confirmed in a letter Monday. FCC Commissioner Brendan Carr drew attention to the letter Tuesday. The DPC is investigating the company’s potential noncompliance with the general data protection regulation (GDPR), she said, citing “several ongoing proceedings” involving data transfers to China, the processing of minors’ data and “litigation before the Dutch courts (in particular concerning targeted advertising regarding minors and data transfers to China).” She wrote the letter in response to members of the European Parliament asking about Chinese government authorities potentially accessing the data of EU citizens. The GDPR applies when a company in the EU gives an affiliated company outside the EU access to personal data, she said: The first company must ensure such transfers don’t compromise EU data protections, particularly when public authorities are involved. A TikTok spokesperson cited a company statement from earlier this year, saying the investigation was initiated in September 2021: "While we can't comment on an ongoing investigation, we're continuing to fully cooperate with the DPC. We're constantly reviewing our policies, processes and technologies to ensure that our community continues to enjoy a safe and secure experience on TikTok."
The FTC should grant consumers the right to block data brokers from selling their personal information, California Attorney General Rob Bonta (D) commented Monday. The agency collected public comment through Monday on its advance NPRM for a potential privacy rulemaking (see 2211170072). Bonta said the agency should ban businesses and third-party online trackers from “tracking or selling the data of users” who have opted out of “commercial surveillance practices.” Bonta also wants more stringent age verification for online services directed at children. Public Knowledge, in comments filed with the Yale Law School Technology Accountability and Competition Project, sought structural rules for data minimization and retention, new data security standards and artificial intelligence assessments that “test for efficacy and fairness.”
Congress should pass major antitrust legislation targeting the tech industry, nearly 50 advocacy groups wrote leadership in both chambers Wednesday. They want passage of the American Innovation and Choice Online Act (see 2208020001) and the Open App Markets Act (see 2204150040). Access Now, Public Knowledge, Center for Digital Democracy, Consumer Reports, Fight for the Future, Open Markets Institute and Public Citizen signed. They called the bills “common-sense, compromise legislation that have the support of a wide range of stakeholders.”
The Secret Service should update its cybersecurity plans to reflect zero trust architecture (ZTA) guidance, the GAO said Tuesday. The GAO noted the Secret Service’s four-milestone cyber plan, which includes “assessing agency IT systems against federal guidance and implementing cloud services,” was created before the ZTA guidance was issued. ZTA "requires constant verification of everything that's trying to connect to an organization's IT systems," GAO said. The Department of Homeland Security responded on behalf of the Secret Service, concurring with the GAO recommendation.
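To make the "constant verification" principle concrete, here is a minimal, hypothetical Python sketch of a zero-trust style access check, in which every request is re-evaluated against identity, device posture and least-privilege policy rather than trusted based on network location. All names and checks (verify_identity, check_device_posture, policy_allows, the sample device and resource IDs) are illustrative assumptions, not the Secret Service's systems or the federal ZTA guidance itself.

```python
# Illustrative only: a toy zero-trust access check. Every request is verified
# on each call -- identity, device posture and policy -- with no implicit
# trust based on where the request comes from. All names are hypothetical.
from dataclasses import dataclass


@dataclass
class AccessRequest:
    user_token: str   # signed identity assertion presented with the request
    device_id: str    # identifier of the device making the request
    resource: str     # system or dataset being requested


def verify_identity(token: str) -> bool:
    # Placeholder for validating the token against an identity provider.
    return token.startswith("signed:")


def check_device_posture(device_id: str) -> bool:
    # Placeholder for confirming the device is enrolled and compliant.
    return device_id in {"managed-laptop-01", "managed-phone-07"}


def policy_allows(token: str, resource: str) -> bool:
    # Placeholder least-privilege check: is this identity entitled to
    # this specific resource right now?
    return resource != "restricted-db" or token == "signed:admin"


def authorize(req: AccessRequest) -> bool:
    # Zero trust: re-verify everything on every request; deny by default.
    return (verify_identity(req.user_token)
            and check_device_posture(req.device_id)
            and policy_allows(req.user_token, req.resource))


if __name__ == "__main__":
    req = AccessRequest("signed:analyst", "managed-laptop-01", "case-files")
    print("access granted" if authorize(req) else "access denied")
```

The point of the sketch is the control flow: no check is skipped for a supposedly "internal" caller, which is the behavior the ZTA guidance asks agency plans to reflect.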