Data concerning a person belongs to that person -- legally, not just morally.
Any commercial use of personal data by AI systems requires active, revocable consent. Sale without consent must be a criminal offence.
The major AI models -- GPT-4, Gemini, Claude, Llama -- were trained on billions of texts, images and videos. A significant portion of this training data comes from the internet: forum posts, blog posts, photos, comments -- written by people who never consented to their words being used as training material for a language model [1].
Shoshana Zuboff has described this model as surveillance capitalism: human experience is extracted as a free raw material and transformed into behavioural predictions that are traded in behavioural futures markets [2].
Most people are unaware of the extent of commercial data collection. Companies systematically capture location histories, search queries, purchase records, social contacts, and patterns of app and device usage.
From these individual data points, detailed user profiles emerge: personality, political orientation, health status, financial situation, personal vulnerabilities. The people affected typically know nothing about this -- or have "consented" by clicking away a cookie banner.
The aim is commercial profit: targeted advertising, dynamic pricing (different prices for different users), and the sale of behavioural predictions to third parties.
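To make the aggregation step concrete, here is a minimal sketch in Python -- the signals, identifiers and field names are illustrative assumptions, not any vendor's actual pipeline -- of how individually harmless data points combine into a sensitive profile:

```python
from collections import defaultdict

# Hypothetical event stream (user_id, signal, value); real trackers
# capture far more, but the aggregation principle is the same.
EVENTS = [
    ("u42", "search", "debt consolidation loans"),
    ("u42", "location", "pharmacy, 3 visits/week"),
    ("u42", "purchase", "sleep aid"),
    ("u42", "article_read", "political commentary"),
]

def build_profiles(events):
    """Fold scattered data points into per-user profiles.

    Each signal alone looks harmless; combined, they suggest health
    status, financial stress and political leaning -- the inference
    step described above.
    """
    profiles = defaultdict(lambda: defaultdict(list))
    for user_id, signal, value in events:
        profiles[user_id][signal].append(value)
    return profiles

for user, profile in build_profiles(EVENTS).items():
    print(user, dict(profile))
```

The point of the sketch: no single line of input is alarming, yet the output is exactly the kind of profile described above.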
The same profiles created for advertising purposes are excellently suited for political manipulation. AI-based micro-targeting makes it possible to reach voters based on specific vulnerabilities -- financial fears, distrust of institutions, health concerns -- with tailored messages.
The consequences are severe: tailored messages reach each voter privately, escape public scrutiny and rebuttal, and allow contradictory promises to be made to different groups.
The Cambridge Analytica scandal of 2018 demonstrated what was possible with the Facebook profiles of 87 million people [9]. Since then, the methods have not disappeared -- they have become more sophisticated.
Data protection is not a purely personal problem. Whoever possesses detailed profiles of millions of people has power over those people -- and thus over democracy. (See also: Risks for Democracy)
There is a fundamental conflict of interest: AI companies promise data protection while simultaneously earning money by using precisely this data. Self-regulation does not work when the business model is based on violating privacy.
Effective legal safeguards are largely absent -- not because they are technically impossible, but because the tech industry wields considerable political influence. In the US, technology companies spent over 100 million dollars on lobbying in 2024 [10].
Given the massive political influence of the tech industry, which actively blocks regulation, a moratorium on new data centres is being discussed as a pragmatic option. The logic: if the industry grows faster than regulation, growth must be slowed until democratic institutions have caught up.
Such a moratorium would slow the industry's expansion and buy democratic institutions the time they need to put effective safeguards in place.
The revised Swiss Data Protection Act (revDPA), in force since 1 September 2023, strengthens the rights of data subjects. It requires transparency, purpose limitation and proportionality [3]. However, it stops short of a paradigm shift: it neither establishes ownership of personal data nor makes opt-in consent the default for commercial use.
The paradigm shift: data that identifies or concerns a person is not merely worthy of protection -- it belongs to that person. Just as a house belongs to its owner, biometric data, health data, movement profiles and digital traces belong to the person who generates them.
No longer "opt-out" (you must actively object), but "opt-in" (you must actively consent). Every commercial use requires a separate, comprehensible consent -- not hidden in 40-page terms and conditions, but in plain language, with a specific statement of the intended purpose.
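What opt-in looks like at the code level can be sketched briefly -- the names and fields below are illustrative assumptions, not a reference to any existing consent framework:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class Consent:
    """One explicit, purpose-bound consent record (fields illustrative)."""
    user_id: str
    purpose: str            # specific, plain-language purpose
    granted_at: datetime
    opt_in: bool            # must be an active choice, never a default

def record_consent(user_id: str, purpose: str, opt_in: bool) -> Consent:
    # Under opt-in, the absence of an active "yes" means no processing:
    # there is no pre-ticked box that can be clicked away.
    if not opt_in:
        raise PermissionError(f"No consent given for purpose: {purpose!r}")
    return Consent(user_id, purpose, datetime.now(timezone.utc), True)

# Usage: processing for this purpose is only lawful after this succeeds.
consent = record_consent("u42", "personalised newsletter", opt_in=True)
```

The design choice is the inversion of the default: the system refuses to proceed unless an active, purpose-specific "yes" exists.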
If someone sells your house without your consent, that is fraud. If someone sells your data without your consent, that is today... legal. This asymmetry must end. The sale of personal data without explicit, revocable consent must be enshrined as a criminal offence in the Criminal Code.
Legal data ownership: Establishment of ownership of personal data in the Civil Code (analogous to property law).
Active consent: Every commercial use requires opt-in. Pre-ticked consents are void.
Right of revocation: Consents must be revocable at any time, free of charge and without justification.
Criminal liability: Sale or transfer of personal data without consent is classified as an offence in the Criminal Code.
Data portability: Every person has the right to export their data in a machine-readable format from any platform.
Right to deletion from AI models: When a person revokes consent, their data must be demonstrably removed from training datasets (see the sketch following this list).
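The revocation and deletion demands can be sketched together. The toy consent ledger below (all names hypothetical) shows the auditable bookkeeping that revocation would trigger; actually removing data from an already-trained model -- machine unlearning -- remains an open research problem, so the sketch covers only the provable obligation:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentLedger:
    """Toy ledger tying revocation to a deletion obligation."""
    active: dict = field(default_factory=dict)        # user_id -> set of purposes
    deletion_queue: list = field(default_factory=list)

    def grant(self, user_id: str, purpose: str) -> None:
        self.active.setdefault(user_id, set()).add(purpose)

    def revoke(self, user_id: str, purpose: str) -> None:
        # Revocation requires no justification and carries no fee.
        self.active.get(user_id, set()).discard(purpose)
        # Revoking "model training" creates an auditable obligation to
        # remove the user's data from training datasets.
        if purpose == "model training":
            self.deletion_queue.append(
                (user_id, datetime.now(timezone.utc).isoformat())
            )

ledger = ConsentLedger()
ledger.grant("u42", "model training")
ledger.revoke("u42", "model training")
print(ledger.deletion_queue)   # auditable record of what must be deleted
```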
The EU General Data Protection Regulation (GDPR) has set standards since 2018: right to be forgotten, data portability, high fines [5]. The Swiss revDPA is modelled on it but remains weaker in enforcement.
The EU AI Act adds: AI systems must document which training data was used -- a first step towards transparency, but not yet ownership [6].
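What such documentation could look like in practice -- a minimal sketch with illustrative fields, not the regulation's literal schema:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class DatasetRecord:
    """Minimal training-data documentation record, loosely inspired by
    the transparency direction of AI Act Art. 10 (fields illustrative)."""
    name: str
    source: str
    collection_method: str
    contains_personal_data: bool
    legal_basis: str          # e.g. explicit opt-in consent

record = DatasetRecord(
    name="forum-corpus-2023",
    source="public web forums",
    collection_method="crawler",
    contains_personal_data=True,
    legal_basis="none documented",   # the gap the text criticises
)
print(json.dumps(asdict(record), indent=2))
```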
[1] Bender, Emily M. et al.: On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? FAccT 2021.
[2] Zuboff, Shoshana: The Age of Surveillance Capitalism. PublicAffairs, 2019.
[3] Swiss Data Protection Act (revDPA), in force since 1 September 2023.
[4] FDPIC (Federal Data Protection and Information Commissioner): Activity Report 2023/24.
[5] EU General Data Protection Regulation (GDPR), Regulation (EU) 2016/679.
[6] EU AI Act, Regulation (EU) 2024/1689, Art. 10 (Data and Data Governance).
[7] Zuboff, Shoshana: The Age of Surveillance Capitalism, Ch. 8: Rendition. PublicAffairs, 2019.
[8] Hao, Karen: How Facebook got addicted to spreading misinformation. MIT Technology Review, 2021.
[10] OpenSecrets: Lobbying Spending Database, Technology Sector, 2024.