European Union

Privacy by design

DMA vs GDPR vs CSAM

Conflicts

  • The DMA would break existing tools built to protect privacy by design and by default

  • The GDPR seeks to spur privacy by design and by default

  • The proposed CSAM Regulation would require service providers to scan user data, violating users’ privacy

  • Article 25 of Europe’s flagship GDPR privacy law requires that privacy be protected by design and by default. The GDPR also requires consent for the collection, use, and transfer of personal information. To comply, mobile app stores require app providers to state the terms of their use of the data they collect, maintain the security of their apps, and comply with applicable law. Providers that fail to meet these obligations must cure any deficiency or risk removal from the official store.

     

    By contrast, Article 6(4) of the DMA would break privacy protections established by design and by default: it requires gatekeepers to allow users to circumvent official app stores and the very technology used to enforce key privacy protections, implement privacy nutrition labels, and keep harmful spyware off phones. Likewise, Article 6(7) of the DMA could seriously risk user privacy by requiring that third-party apps have full access to the sensitive core device features and hardware used to protect privacy.

     

    It seems impossible to square the circle: allowing unvetted apps onto devices cannot be reconciled with maintaining or ensuring users’ privacy. The DMA’s requirement to comply with other privacy laws such as the GDPR will therefore conflict with its own unvetted-app requirements.

     

    For more information, read our analysis here.

Harmful content

DMA vs DSA

Conflicts

  • The DMA in some cases undercuts DSA goals by preventing gatekeepers from conducting comprehensive content-based reviews of all apps.

  • The DSA seeks to limit the dissemination of harmful content by establishing robust and comprehensive content-gating systems to protect users.

  • Article 34 of the DSA requires Very Large Online Platforms, which include major app stores, to, among other things, protect minors and consumers and restrict election disinformation by mitigating the risks posed by harmful and other content, while also ensuring appropriate intellectual property protection.

     

    But Article 6(4) of the DMA undercuts the very tools mobile platforms use to protect intellectual property, guard consumers against scams and fraud, and shield consumers and minors from objectionable and harmful content in apps. The official app stores vet apps for compliance with safety, security, and privacy obligations and other terms and conditions, but are now prevented from restricting third-party apps based on their content.

     

    For example, while DSA regulators have outlined stringent and specific steps that large online platforms must take to limit access to pornographic content, and are taking enforcement action against one platform that failed to prevent minors from accessing pornography, the DMA is simultaneously restricting a different large online platform’s efforts to protect minors from accessing that same content.

     

    In practical terms, an app that was taken down from Apple’s or Google’s app store for content reasons can now end-run that decision and return to the platform through a third-party app store that lacks the same obligations and intent. Because these third-party app stores are not captured under the DSA’s transparency-database provisions, it may be impossible to know whether apps that were taken down stayed down.

Data Sharing

DMA vs GDPR

Conflicts

  • The DMA impedes security standards and forces the sharing of data with third parties (including parties from countries of national security concern)

  • The GDPR seeks to protect people’s privacy

  • Article 44 of the GDPR seeks to protect people’s privacy and prevent broad surveillance, including from other countries.

     

    However, the DMA impedes efforts to maintain existing privacy safeguards, as discussed elsewhere in this matrix, forcing companies to share data with third parties, including parties from countries of national security concern that are not subject to the DMA’s rules.

     

    Under Article 6(10) of the DMA, gatekeepers are required to share access to commercial data generated on their platforms with “business users”, which would include competitors like Russia’s Yandex and others from countries of national security concern. Requiring companies primarily based in the US to disclose proprietary intellectual property will unfairly benefit European competitors as well as those from countries of national security concern. There are also serious privacy and security concerns, given the close operational relationship between firms and governments in such countries.

     

    For more information see this report from CEPA.