Expert Analytical Association “Sovereignty”

Prohibition of access to social media for minors under the age of 15 – Legal overview, European contagion and systemic issues as of 11 February 2026

February 21, 2026
1. Legislative process for Bill No. 2107 of 18 November 2025 and current status of the text

Submitted on 18 November 2025 to the National Assembly by Ms Laure Miller and several members of the Ensemble pour la République group, Bill No. 2107[1] aims to protect minors from the risks associated with the use of social media by introducing a blanket ban on access for minors under the age of 15.

After review by the Committee on Cultural Affairs and Education, then in public session, the National Assembly adopted the text at first reading on 26 January 2026 (130 votes in favour, 21 against), under an accelerated procedure initiated by the Government on 23 January 2026.

The text was sent to the Senate (registered under No. 304) on 27 January 2026 and is currently awaiting examination at first reading by the Upper House. The Government is maintaining its objective of final adoption and entry into force at the start of the school year in September 2026.

It should be noted that French positive law already recognises a “digital majority” at the age of 15, established by Law No. 2023-566 of 7 July 2023, known as the “Marcangeli Law”, which aims to institute a digital majority and combat online hate speech. Article 4 of this law laid down the principle that minors under the age of 15 may not register on social networks without parental authorisation, thereby requiring providers of these services to verify the age of their users in accordance with technical standards developed by ARCOM (the French Regulatory Authority for Audiovisual and Digital Communication). Although enacted, this law has come up against the country-of-origin principle derived from the e-commerce directive, which prevents a Member State from imposing such obligations on social networks not established on its territory.

Due to its non-compliance with European law and the lack of implementing decrees, this law has remained unenforced to date.

Subsequently, while transposing the DSA into domestic law, the law of 21 May 2024 aimed at securing and regulating the digital space (SREN) gave ARCOM the power to block pornographic websites that did not implement appropriate user age checks. However, as the SREN bill was the subject of two detailed opinions from the European Commission, which once again pointed out the risk of duplication with the DSA, the French legislator ultimately limited this obligation to platforms established in France or outside the European Union, thereby escaping European censure but automatically limiting the effectiveness of the protection sought.

The 2025 bill is part of a move towards stricter measures, shifting from authorised access before the age of 15 with parental consent to a complete ban on access to social networks for all under-15s.

Main provisions of the text adopted by the National Assembly:

  • Legal ban on access to online social networking services for minors under the age of 15 (reversed wording compared to the initial text: the ban applies to minors and not directly to the platform, in order to circumvent the objections of the Council of State[2] relating to the DSA regulation).
  • Explicit exceptions: collaborative encyclopaedias (e.g. Wikipedia), educational or scientific directories, and free software development and sharing platforms.
  • Exclusion of private interpersonal messaging services (such as WhatsApp).
  • Possibility of exceptional access with the prior express and revocable consent of a person with parental authority for certain platforms (terms and conditions specified by decree).
  • Robust age verification requirement for the services concerned, based on the ARCOM technical reference framework (adopted on 9 October 2024 for pornographic websites) and on solutions compatible with the future EUDI Wallet.
  • Additional measure: obligation for secondary schools to specify in their internal regulations the conditions and locations for the use of mobile phones (no absolute general ban as initially envisaged).
  • Role of ARCOM: monitoring compliance with the ban; reporting to the competent authorities of other Member States for platforms established outside France, in accordance with the DSA cooperation mechanism.

2. Coordination with the roll-out of the European digital identity wallet (EUDI Wallet)

The eIDAS 2.0 Regulation (Regulation (EU) 2024/1183) requires Member States to make at least one national version of the European Digital Identity Wallet (EUDI Wallet) available by the end of December 2026 (24 months from the entry into force of the main implementing acts published in December 2024).

As of 11 February 2026, deployment is in the technical finalisation and certification phase:

  • The architecture and reference specifications (ARF) have been finalised.
  • Large-scale pilots are testing use cases (travel, education, health, financial services, proof of age).
  • Five Member States (France, Denmark, Greece, Italy and Spain) have been participating since July 2025 in a specific age verification pilot project using a white-label “mini-wallet” prototype, which is to be adapted into national versions.

The wallet is based on key principles:

  • Local storage of data on the user’s device (no centralised database).
  • Minimal disclosure and zero-knowledge proofs (proof of age ≥ 15 without revealing date of birth).
  • Strict voluntariness: optional use, prohibition on penalising refusal to adopt.
  • Prohibition of profiling and obligation of “unobservability”.
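
The minimal-disclosure principle above can be illustrated with a deliberately simplified sketch: the wallet presents only an issuer-attested boolean predicate (“age ≥ 15”), and the date of birth never reaches the relying party. This is not a real zero-knowledge protocol (actual deployments rely on asymmetric signatures and dedicated proof schemes foreseen in the ARF); the issuer key, function names and data layout below are invented for illustration only.

```python
import hmac, hashlib, json
from datetime import date

# Hypothetical issuer key, for illustration only. A real scheme uses
# asymmetric signatures so the verifier cannot forge attestations.
ISSUER_KEY = b"demo-issuer-secret"

def issue_age_predicate(birth_date: date, threshold: int, today: date) -> dict:
    """Wallet/issuer side: derive and attest only the boolean predicate
    'age >= threshold'. The birth date itself never leaves this function."""
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day))
    claim = {"predicate": f"age>={threshold}", "value": age >= threshold}
    payload = json.dumps(claim, sort_keys=True).encode()
    tag = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "tag": tag}

def verify_presentation(presentation: dict) -> bool:
    """Relying-party side: checks the issuer's tag. The platform learns
    only True/False, never the date of birth."""
    payload = json.dumps(presentation["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, presentation["tag"]) and presentation["claim"]["value"]

p = issue_age_predicate(date(2010, 3, 1), 15, date(2026, 2, 11))
print(verify_presentation(p))  # the platform sees only a boolean
```

The design point mirrors the regulation's logic: the attribute disclosed to the service is the predicate itself, not the underlying personal data, which stays on the user's device.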

Under Bill No. 2107, systematic age verification for social networks will most likely rely on this wallet (or on compatible national solutions), which should in theory enable implementation in compliance with the GDPR and the principle of data minimisation. However, widespread adoption of the wallet by the end of 2026 or early 2027 is a technical prerequisite for the effectiveness of any ambitious national ban.

3. Legislative contagion within the European Union

Since the European Commission’s guidelines of 14 July 2025 on Article 28 of the DSA (protection of minors), several Member States have accelerated their national work, taking advantage of the window opened by the Commission to set a national minimum age “in accordance with Union law”.

Main countries concerned as of 11 February 2026:

  • Denmark: political agreement reached in November 2025 to prohibit access to under-15s (parental exemption possible from age 13); legislative adoption imminent.
  • Spain: draft law under consideration in Parliament aiming at a strict ban for under-16s (without systematic parental exceptions).
  • Italy: proposed restriction to under-15s, coupled with the mandatory use of the future “mini portafoglio nazionale” for age verification.
  • Greece: official statement by the Prime Minister in favour of a ban inspired by the Australian model (16 years old); parental control tools already deployed (“Kids Wallet”).
  • Germany: expert commission currently reviewing the issue (report expected in autumn 2026); citizens’ petition with over 34,000 signatures calling for a minimum age of 16.

At the same time, on 26 November 2025, the European Parliament adopted a non-binding resolution recommending harmonisation at 16 years of age at European level (access possible between 13 and 16 years of age with parental authorisation), accompanied by bans on addictive techniques and advertising targeted at minors.

This dynamic reflects a convergent but fragmented movement that could either result in a national mosaic (risk of conflict with the DSA) or prompt the Commission to propose a delegated act or a targeted revision of the DSA.

4. Alternatives focused on effectively restoring decision-making power to families

A general ban, even if technically feasible via the EUDI Wallet, remains legally fragile under the proportionality test (Art. 11 of the DDHC of 1789, Art. 10 ECHR, Art. 11 and 52 of the Charter of Fundamental Rights, and the case law of the Council of State on the dynamic proportionality review of restrictive measures).

– On the freedom of expression and information of minors:

Social networks are now considered, in case law, an essential means of exercising freedom of expression and information:

The Constitutional Council stated, with regard to the Avia law (decision no. 2020-801 DC of 18 June 2020)[3] , that freedom of expression implies the freedom to access online public communication services and to express oneself on them, including on social networks.

The European Court of Human Rights points out that the internet has become one of the main means of exercising freedom of expression and information, so that any blocking or restriction measures must be subject to a strict legal framework with effective judicial review.

 The general ban on access to social media for all under-15s is analysed by legal scholars as a major interference with the exercise of this freedom, which is protected by Article 10 of the ECHR and Article 11 of the Declaration of the Rights of Man and of the Citizen.

On this approach, a general ban on access to social media for all under-15s is a measure that does not distinguish between illegal content and legal, potentially beneficial content (education, information, socialisation, creativity, civic engagement), and one that deprives adolescents of important tools for communication and information.

 Critics point out that, in a democratic society, restrictions on freedom of expression must meet the requirements of necessity, proportionality and adaptation. Less restrictive measures exist (enhanced monitoring by ARCOM, default protection settings, content filtering and moderation, enhanced information, parental guidance) that are capable of achieving the objective of protecting children with less infringement of freedoms.

– Recent case law on blocking a social network (TikTok in New Caledonia):

 The Council of State, hearing an appeal against the Prime Minister’s decision to suspend TikTok in New Caledonia in the context of serious disturbances to public order, ruled that the Prime Minister could, in exceptional circumstances, decide to temporarily interrupt an online communication service, provided that the measure was limited in time and that there were no other less intrusive technical means available.

However, in this case, the total ban on TikTok for an indefinite period, linked solely to the persistence of unrest, constituted a disproportionate infringement of freedom of expression, freedom of communication and access to information, as the measure was not made conditional on the impossibility of alternative measures.

 This case law illustrates the “dynamic” proportionality review exercised by the judge on measures restricting social media, including in a crisis context.

– Less intrusive alternative measures would make it possible to refocus responsibility on those with parental authority:

It is now widely documented that social media and certain online content pose real dangers to minors (child sexual abuse, pornography, deadly ideologies, incitement to suicide, among others), and that early and excessive exposure to social media and screens carries significant risks: exposure to violent or pornographic content, harassment, predatory solicitation, as well as impacts on sleep, attention, emotional maturation and cognitive development.

Nevertheless, recognition of a real danger does not necessarily imply that the most radical legal response is the most appropriate in terms of constitutional and treaty principles.

The following measures may be adopted (without this list necessarily being exhaustive):

  • Mandatory default activation of granular parental controls (screen time, content categories, contact validation) for all minor registrations, with recurring notifications and remote revocation.
  • Allow parents to suspend or limit access to specific services or features.
  • Development and massive subsidisation of “junior” phones without browsers or social media apps for children up to the age of 16, offered by default by mobile operators.
  • Mandatory digital education in secondary schools and the introduction of real criminal penalties for owners of websites offering clearly dangerous and/or criminal content.

These approaches better respect the principle of subsidiarity and the primacy of parental responsibility while avoiding a generalised control infrastructure.

The protection of minors is a legitimate and compelling imperative. However, the legal instruments chosen must not pave the way for structural mechanisms whose effects could exceed the initial objective.

5. Medium- and long-term systemic risks

The joint adoption of systematic age verification and the EUDI Wallet creates a powerful technical infrastructure. Although the current legal framework (GDPR, eIDAS 2.0, oversight by the CJEU and national data protection authorities) is robust, several structural risks warrant increased vigilance. The following analysis explores these issues in greater depth, taking into account the French government’s increasing use of tools developed by Palantir and France’s propensity for strict regulatory oversight of online content in a sensitive electoral context.

5.1. The risk of a shift towards a social credit system inspired by the Chinese model

The Chinese Social Credit System (SCS), gradually implemented since 2013 and still being refined in 2026, is not a single, centralised tool but a mosaic of interconnected local and national mechanisms managed by government and private entities. It assigns a score based on financial, social, legal and behavioural data: compliance with laws, payment of debts, social interactions, and even online criticism of the government. “Good citizens” receive rewards (easy loans, priority access to public services, travel), while “bad” citizens are penalised through blacklists (bans on train and air travel, employment restrictions, social exclusion). In 2025-2026, the system evolved to place greater emphasis on positive incentives and “credit repair” mechanisms, but it still relies on mass surveillance (more than 600 million cameras, AI algorithms) and automated alerts, reinforcing social control and ideological conformity.

In the European context, the EUDI Wallet – although designed as a voluntary tool with privacy-by-design – could serve as a vehicle for a similar shift if its use became de facto mandatory through progressive interconnections (e.g. age verification extended to access public, financial or health services). Local initiatives such as the “Smart Citizen Wallet” in Rome and Bologna (Italy), which rewards “virtuous” behaviour (waste sorting, use of public transport) with points that can be converted into goods, foreshadow a social credit system that is incentive-based rather than punitive, but ambiguous in terms of the future AI regulation. Although the European Commission denies any link with a punitive system such as China’s SCS, emphasising the prohibition of profiling and voluntary participation, the potential extension to attributes such as carbon quotas or vaccination requirements could transform a protective tool into a behavioural rating mechanism.

5.2. The use of Palantir by the French authorities and the propensity for strict regulatory oversight

Since 2016, France has distinguished itself by repeatedly using solutions from the American company Palantir Technologies for the processing and analysis of large amounts of data in the field of domestic intelligence. The partnership with the General Directorate for Internal Security (DGSI), initially established in the context of the November 2015 attacks, has been renewed several times, most recently in December 2025 for a period of three years (until 2028). This contract covers the provision of Palantir’s proprietary software platform as well as integration, support and operational assistance services. Although justified by the needs of counter-terrorism and national security (particularly during the 2024 Olympic and Paralympic Games), this renewal comes at a time when France regularly invokes the need for digital sovereignty. It raises persistent questions about dependence on a non-European operator whose technologies, initially developed for US intelligence and defence agencies, are designed for the massive cross-referencing and predictive exploitation of heterogeneous data.

At the same time, France appears to be one of the most active Member States in applying the Digital Services Act (DSA) and complementary national measures (SREN Act of 21 May 2024). The Audiovisual and Digital Communications Regulatory Authority (ARCOM), designated as the national coordinator for the DSA, has extensive powers to investigate, sanction and, as a last resort, block non-compliant content or services. This proactive stance has resulted in the rigorous implementation of obligations to prioritise the removal of hateful content, age verification for pornographic sites and the fight against disinformation.

Some observers describe this approach as an infringement on online freedom and freedom of expression, due to the high volume of reports processed and the speed of administrative measures.

5.3. Electoral context and the temptation to lock down information

The imminent elections – municipal elections on 15 and 22 March 2026, followed by the presidential election in spring 2027 – increase the risk of digital regulation tools being exploited for political ends. In a polarised political climate, where control of the online information space can influence public opinion, the combination of a strict minimum age for access, widespread identity verification and increased use of data analysis platforms such as Palantir could facilitate the preventive control of dissident discourse or of discourse perceived as destabilising. Such a configuration, even if initially motivated by the protection of minors, could be misused to limit the circulation of critical information or to prioritise the moderation of content deemed contrary to national cohesion or public order, to the detriment of pluralism.

5.4. Gradual marginalisation of the parental role in digital education

Finally, several indicators suggest a trend towards relativising the primacy of parental authority in favour of direct state or institutional regulation. Although Bill No. 2107 formally maintains a parental exemption for certain types of access, parliamentary debates and enforcement measures (systematic age verification, imposed default settings, digital education integrated into school curricula without systematic consultation with families) reflect a logic whereby the state assumes increased responsibility for protecting and supervising minors’ digital use. This development risks removing responsibility from parents and entrusting administrative authorities or algorithms with the task of defining what is lawful or beneficial for children, to the detriment of a subsidiary approach that respects family autonomy.

5.5. Other risks in a context of totalitarian rule or excessive supranational governance

Beyond social credit, digital identity systems such as the EUDI Wallet pose amplified risks in a scenario of totalitarian drift (total control of society by a single party or imposed ideology, through terror, propaganda and surveillance) or excessively centralised supranational governance (loss of national sovereignty to European bodies).

  • Mass surveillance and loss of individual freedoms: Historically, totalitarian regimes have shadowed the state with a single-party apparatus to control private life through files, wiretapping and denunciation. Digitally, this could translate into an exhaustive “relationship graph” (interconnection of identities, transactions and movements), facilitating state or supranational espionage and the dissolution of anonymity. In a supranational context, dependence on the EUDI Wallet could expose citizens to foreign cyber threats, eroding sovereignty.
  • Ideological manipulation and social exclusion: Totalitarian regimes impose a single ideology, shaping citizens from childhood onwards. An interconnected EUDI Wallet could make access to essential services (banking, healthcare, transport) conditional on “virtuous” criteria (e.g. carbon quotas, regulatory compliance), leading to the automated exclusion of dissidents – a digital “unperson”[4] .
  • Technical and geopolitical vulnerabilities: Massive leaks or hacking (despite local storage) could expose sensitive data, amplifying the risks in authoritarian regimes. In a supranational context, forced harmonisation (via the EU) could dilute national protections, promoting coercive standardisation.

These risks, although hypothetical, highlight the need for enhanced parliamentary oversight, effective remedies and constant citizen vigilance to prevent any abuses.

Bill No. 2107 is part of a broader European movement, made technically possible by the planned deployment of the EUDI Wallet at the end of 2026. If it survives scrutiny in the Senate and any challenges at European level, it will mark a turning point towards more prohibitive regulation.

However, a hybrid approach – greater parental responsibility, real criminal penalties for sites with dangerous and/or criminal content, information and communication on the risks and methods available – remains more legally sustainable and strategically resilient in the long term.

In conclusion, a reflection on technological progressivism

Technological progressivism, characterised by a frantic race for innovation without rigorous assessment of the potential consequences or comprehensive consideration of the inherent risks, is undoubtedly a major problem in modern societies: the quest for efficiency and novelty often overshadows fundamental ethical and anthropological imperatives.

The hasty adoption of technologies such as the EUDI Wallet or age verification systems, without prior thorough examination, exposes individuals and communities to systemic vulnerabilities that could irreparably alter social structures and essential freedoms.

Philosophically, this dynamic evokes the Aristotelian distinction between techne (technical art, oriented towards efficient production) and phronesis (practical wisdom, which integrates moral deliberation and prudence in action, in order to co-determine the universal and the particular in concrete situations), emphasising the urgent need to use the latter to temper the excesses of the former.

From the perspective of natural law, an eternal and inalienable law inscribed in human nature, all technological regulation must be subordinate to the preservation of the objective goods of human beings (such as life, liberty and the pursuit of the common good), thus preventing innovation from becoming an instrument of alienation or dehumanisation. In short, such collective recklessness calls for a renaissance of ethical deliberation, where phronesis guides innovation towards authentic humanism, thus preserving the essence of human dignity against the mirages of unlimited progress.


Virginie de Araújo-Recchia

Barrister, author, lecturer, president of ONEST


[1] https://www.legifrance.gouv.fr/dossierlegislatif/JORFDOLE000053406437/

[2] https://www.conseil-etat.fr/avis-consultatifs/derniers-avis-rendus/a-l-assemblee-nationale-et-au-senat/avis-sur-une-proposition-de-loi-visant-a-proteger-les-mineurs-des-risques-auxquels-les-expose-l-utilisation-des-reseaux-sociaux

[3] https://www.conseil-constitutionnel.fr/decision/2020/2020801DC.htm

[4] Orwell G., 1984: this term refers to a person whose existence has been erased by the totalitarian Party.
