Published: November 2025 – Romania’s draft Legea majoratului online (“Online Age of Majority Law”) was adopted by the Senate on 6 October 2025. This proposed Romanian digital services regulation introduces robust measures to protect minors under 16 in the online environment. It aligns with EU rules – notably the Digital Services Act (Regulation (EU) 2022/2065) and GDPR (Regulation (EU) 2016/679) – and imposes new legal responsibilities on online service providers. Tech companies offering digital services in Romania will face strict compliance requirements regarding parental consent, age verification, content moderation, and advertising practices involving minors.

Romania Online Age of Majority Law

Context: Protecting Minors Online in Romania and the EU

The Online Age of Majority Law is part of a broader trend in Europe to bolster online safety for minors. It was designed to shield children (under 16 years old) from harmful online content – including violence, pornography, self-harm, hate speech, fraud, substance abuse, and other material likely to impair a child’s development. The initiative complements Article 28 of the EU Digital Services Act (DSA), which mandates a “high level of privacy, safety and security for minors” on online platforms, and Article 8 of the GDPR, which requires parental consent for processing a child’s personal data in online services. In essence, the Romanian law establishes 16 as the “digital age of majority”, aligning with the GDPR by treating minors under 16 as requiring special protections and parental involvement in the digital space.

This draft law has moved to the Romanian Chamber of Deputies for final approval. If enacted, it will establish an “online age of majority” in Romania, meaning anyone under 16 will need a parent or guardian’s permission to fully engage with many digital services. Notably, educational platforms officially authorized by the Ministry of Education are exempt, so as not to impede online learning. All other online platforms – from social media and streaming sites to e-commerce, gaming, banking apps and beyond – are within the law’s scope. This broad applicability means a wide range of technology companies must prepare to comply.

Key Obligations for Online Service Providers

Under the Legea majoratului online, providers of online services (paid or free, offered electronically at a user’s request) face several new obligations to protect children. In practical terms, tech companies will need to:

  • Obtain verifiable parental consent for users under 16: Minors (below 16) may access digital services or create personal accounts only with explicit, validated parental or guardian consent. Providers must implement age gating mechanisms so that no child under 16 can use their service without a parent’s approval (verified in line with GDPR standards for consent). Providing services to a minor under 16 without prior validation of parental consent is prohibited.
  • Retroactively secure consent for existing minor users: Companies must seek and obtain parental consent for all existing accounts held by under-16 users in Romania. They are given a 180-day transition period from the law’s entry into force to collect consents for current underage accounts. Accounts for which no parental consent is provided within that timeframe must be blocked, and after an additional 120 days, deleted entirely.
  • Provide parental control mechanisms: The law empowers parents and legal guardians. Upon request, parents can demand the suspension or deletion of their child’s account or restrict the child’s access to specific websites/pages containing harmful content. Service providers will need to implement straightforward processes (technical mechanisms) to carry out these requests – for example, channels to receive verified parental takedown requests and promptly disable the minor’s account or block certain content for that user. These requests will be coordinated with Romania’s telecom regulator (ANCOM) which will oversee their enforcement.
  • “Age-appropriate” interface design and content filtering: Providers must adopt age-appropriate design principles in their platforms when minors are involved. This includes designing default settings, user interfaces, and content filters to maximize minors’ privacy, safety, and security. Practical measures might entail high default privacy settings for underage accounts, warning screens or filters for harmful content, and tools to prevent contact from strangers. Content must be tailored and filtered so that minors are not exposed to harmful material identified by the law (e.g. pornography, violence, hate or self-harm content). Importantly, any filtering or safeguards should be proportionate and must not infringe on minors’ fundamental rights (such as access to education or information).
  • Mandatory content labeling by age suitability: Within 180 days of the law’s implementation, online platforms will be required to label or rate content by age category. Similar to movie or game ratings, this likely means categorizing videos, posts, or other content as appropriate for certain age groups (e.g. 12+, 16+). The goal is to clearly signal which content is suitable for minors of different ages and to help automate the restriction of adult-oriented material. Providers will need to develop or adopt a labeling system consistent with national and European standards for child digital protection.
  • Age verification for restricted content: For services or content categories that are legally restricted to adults (18+) or to users over 16, companies must deploy age verification mechanisms. Within 180 days, platforms whose content is inappropriate for under-16 users must install filters to verify user age and block underage access. Acceptable methods might include secure age assurance tools (e.g. AI facial age estimation, electronic ID checks, etc.), used in a privacy-preserving way. Notably, the law states providers are not obliged to collect additional personal data solely to identify minors – privacy-by-design is encouraged, allowing use of probabilistic estimation or gated access that respects data minimization.
  • Ban on targeted advertising to minors: Mirroring the DSA’s provisions, the draft law prohibits any advertising based on profiling when the recipient is a minor. In other words, online platforms cannot serve personalized ads to users known or presumed to be under 16 if those ads rely on tracking or personal data profiling. Contextual ads (not based on personal data) may be allowed, but behavioral advertising using minors’ personal data is off-limits. Tech companies will need to adjust their ad delivery systems to ensure compliance – for example, disabling interest-based ads for underage accounts in Romania.

The law also introduces a new concept of the “online adult” (major online) – once a user turns 16, they are considered to have “full legal capacity” in the online environment. This means parental consent is no longer required for online contracts or account creation once someone is 16 or older. However, until that age, parental consent and enhanced protections apply by default.
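The access rules described above can be sketched as a simple gating function. This is a minimal illustration under stated assumptions, not an implementation prescribed by the law: the names (`User`, `can_access`, `may_serve_profiled_ads`) and data model are hypothetical, and a real system would rely on verified age assurance and consent records rather than a self-reported age field.

```python
from dataclasses import dataclass

# The draft law's "online adult" (major online) threshold.
DIGITAL_AGE_OF_MAJORITY = 16

@dataclass
class User:
    """Hypothetical user record; age and consent status would come from
    a verified age-assurance and consent-management process."""
    age: int
    parental_consent_verified: bool = False

def can_access(user: User) -> bool:
    """Users aged 16+ have full online legal capacity; under-16 users
    require explicit, validated parental or guardian consent."""
    if user.age >= DIGITAL_AGE_OF_MAJORITY:
        return True
    return user.parental_consent_verified

def may_serve_profiled_ads(user: User) -> bool:
    """Advertising based on profiling is prohibited for minors,
    even with parental consent; contextual ads remain possible."""
    return user.age >= DIGITAL_AGE_OF_MAJORITY
```

Note that in this sketch parental consent unlocks access but never unlocks profiling-based advertising, reflecting that the ad ban applies to all minors regardless of consent.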

Enforcement, Sanctions and Compliance Timeline

ANCOM (the National Authority for Administration and Regulation in Communications) is designated as the chief enforcement authority (the “digital services coordinator” under the DSA framework) for this law. ANCOM will conduct periodic compliance checks, both on its own initiative and in response to complaints. It will work in concert with child protection agencies and the data protection authority (ANSPDCP) to ensure an integrated approach. Service providers should be prepared for regulatory scrutiny once the law is in effect, including potential audits of their age verification systems and content policies.

Non-compliance can lead to significant penalties. The draft law sets out a tiered fines regime based on the nature of the violation:

  • Failure to validate parental consent or to label content (e.g. violating the under-16 parental consent rule, or not implementing the content rating system) can incur fines between 0.1% and 0.2% of the provider’s annual turnover in Romania.
  • Failure to implement required technical measures (e.g. age-verification filters, blocking unverified accounts) is penalized more heavily, with fines from 0.2% up to 0.4% of the company’s national turnover.

For large tech companies, even a fraction of turnover can be substantial. Moreover, repeated violations carry an escalating risk: if a provider commits five or more infractions in a calendar year, ANCOM may order the suspension of that service in Romania until all issues are remedied. This “last resort” sanction – essentially blocking the platform nationally – underscores the law’s strict approach to enforcement. All fines collected will go to the state budget.
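To make the tiered percentages concrete, here is a worked example of the fine ranges. The turnover figure is invented purely for illustration; the calculation uses integer per-mille arithmetic (0.1% = 1‰) to avoid floating-point rounding:

```python
def fine_range(turnover_ron: int, min_permille: int, max_permille: int) -> tuple[int, int]:
    """Fine bounds expressed in tenths of a percent (per mille) of the
    provider's annual turnover in Romania, computed with integer math."""
    return turnover_ron * min_permille // 1000, turnover_ron * max_permille // 1000

# Hypothetical provider with 200 million RON annual turnover in Romania.
turnover = 200_000_000

# Tier 1: consent-validation or content-labeling failures (0.1%-0.2%).
tier1 = fine_range(turnover, 1, 2)
# Tier 2: missing technical measures such as age-verification filters (0.2%-0.4%).
tier2 = fine_range(turnover, 2, 4)

print(f"Tier 1 fine range: {tier1[0]:,} - {tier1[1]:,} RON")
print(f"Tier 2 fine range: {tier2[0]:,} - {tier2[1]:,} RON")
```

For this hypothetical provider, tier 1 exposure is 200,000–400,000 RON per violation and tier 2 is 400,000–800,000 RON, before considering the suspension risk from repeated infractions.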

Timeline: Once promulgated, certain provisions (like the parental takedown request mechanism) will apply immediately, while others have a grace period. Companies will have 180 days from the law’s entry into force to comply with most obligations (e.g. implement content labeling, obtain consents for existing users, set up age checks). ANCOM is also tasked with issuing detailed technical guidelines within 180 days to clarify how consent verification and content restrictions should be implemented. Additionally, within 120 days of the law taking effect, a joint ministerial order will be issued to approve methodological norms (detailed rules) for enforcement. Tech companies should use this interim period to update their systems and policies, rather than waiting, as compliance will be expected swiftly once deadlines lapse.
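The overlapping deadlines above are easy to tabulate. The sketch below assumes a purely hypothetical entry-into-force date of 1 January 2026 (the law has not been promulgated, so no real date exists yet) and derives each deadline from the day counts stated in the draft:

```python
from datetime import date, timedelta

# Hypothetical entry-into-force date, chosen only for illustration.
entry_into_force = date(2026, 1, 1)

# 120 days: joint ministerial order approving methodological norms.
norms_deadline = entry_into_force + timedelta(days=120)

# 180 days: general compliance (content labeling, age checks, consents
# for existing under-16 accounts); unconsented accounts are blocked.
compliance_deadline = entry_into_force + timedelta(days=180)

# Blocked accounts still lacking consent are deleted 120 days later.
deletion_deadline = entry_into_force + timedelta(days=180 + 120)

print(f"Methodological norms due:   {norms_deadline}")
print(f"General compliance due:     {compliance_deadline}")
print(f"Unconsented accounts erased: {deletion_deadline}")
```

Under this assumed start date, the norms would be due by 1 May 2026, blocking of unconsented accounts would begin 30 June 2026, and deletion would follow by 28 October 2026.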

Conclusion: Preparing for Compliance and Next Steps

The draft “Online Age of Majority” law marks a significant shift in Romanian digital services regulation, raising the bar for child data privacy and online safety. Technology companies – whether social media giants, game developers, e-commerce sites, or any digital platform accessible to Romanian users – will need to adapt their platforms and practices to these new requirements. This includes auditing current user bases for underage accounts, building or integrating age verification tools, adjusting advertising systems to disable personalized ads for minors, and implementing parental consent management and content filtering features.

Compliance is not only a legal obligation but also a demonstration of corporate responsibility towards younger users. The changes may be complex, but they also present an opportunity to build trust with families by providing a safer digital environment for children. In the coming months, companies should stay alert for the final passage of the law and the subsequent implementation guidelines from authorities.

At BMA Legal, we specialize in advising technology companies on regulatory compliance. Our team is closely monitoring the progress of this law and similar EU initiatives on GDPR and online minors’ protection. Given the potential sanctions (up to 0.4% of turnover) and operational impacts, early action is prudent.

Contact BMA Legal for expert counsel on navigating the Online Age of Majority Law and tailoring your compliance programs. We offer comprehensive support to ensure your business meets these new obligations. Reach out to our team to learn how we can support your compliance journey under this emerging regulation.
