
March 20, 2025
- Jaka Repanšek, Founder, Republis
Navigating the future of gaming: the impact of European digital regulations on the industry
JAKA REPANŠEK, PRESIDENT OF THE SLOVENIAN ADVERTISING TRIBUNAL, EXAMINES THE IMPLICATIONS OF THE EU’S DIGITAL SERVICES ACT FOR THE GAMING INDUSTRY
The Digital Services Act (DSA), enacted by the European Union, represents a significant overhaul of the digital regulatory landscape, aiming to enhance user safety and platform accountability across a wide range of online services. Whilst the gaming industry is not specifically targeted by the legislation, it will nevertheless be affected by many of its provisions. The DSA was formally adopted by the European Union on October 19, 2022, and came into force on November 16, 2022. It became applicable to Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) on August 25, 2023, with full application to all digital services from February 17, 2024. The legislation was introduced as part of the EU Digital Strategy to create a harmonized regulatory framework for digital services, ensuring transparency, safety, and fairness in online markets.
The enforcement of the DSA is dual-tiered, involving both national regulators within Member States and centralized oversight by the European Commission. Each EU Member State is required to designate a Digital Services Coordinator (DSC), responsible for monitoring compliance and enforcing the DSA within its jurisdiction. These regulators ensure that companies headquartered in their territory adhere to the obligations outlined by the legislation. The European Commission has direct regulatory authority over Very Large Online Platforms and Very Large Online Search Engines – those with more than 45 million monthly active users in the EU. The Commission has the power to impose fines and corrective measures for non-compliance. Companies violating the DSA can face fines of up to six percent of their global turnover, and repeated or severe breaches may result in temporary bans from operating in the EU digital market.
The role of gaming commissions
The enforcement of the DSA has significant implications for gaming commissions across the EU, as many gaming platforms fall under its scope. Gaming commissions are expected to oversee compliance with transparency obligations, particularly concerning loot boxes, microtransactions, and gambling-like mechanics in online games. Regulators must work with gaming companies to ensure strict enforcement of age verification systems and content classification mechanisms, and are expected to coordinate with Digital Services Coordinators to identify risks associated with in-game advertising, fraudulent transactions, and consumer protection issues. Since many online games rely on aggressive advertising and behavioral monetization models, the DSA gives gaming commissions greater oversight of advertising disclosure, ensuring that consumers are not misled by deceptive marketing practices.
Content moderation and user protection
A core objective of the DSA is to combat illegal and harmful content online. For gaming platforms, this necessitates robust content moderation systems to address issues such as hate speech, harassment, and misinformation. Platforms are required to establish mechanisms that allow users to report inappropriate content, and such reports must be handled promptly to ensure compliance. In practice, meeting these obligations at scale pushes platforms toward advanced moderation tools, including artificial intelligence and machine learning, to monitor and manage user-generated content effectively. Moreover, the DSA emphasizes the promotion of diverse opinions within online communities. Gaming platforms must cultivate environments where users can express themselves without fear of harassment, which means implementing clear guidelines against abusive behavior and fostering inclusive, respectful interactions among players.
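To make the notice-and-action obligation more concrete, the minimal Python sketch below shows one way a gaming platform might structure user reports so that every notice is logged, triaged, and answered with a reasoned, appealable decision. All names here (ContentReport, ModerationQueue, and so on) are illustrative assumptions, not the API of any real platform or a prescribed DSA format.

```python
# Minimal sketch of a notice-and-action flow for a gaming platform.
# All class and field names are illustrative, not taken from any real system.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class ReportCategory(Enum):
    HATE_SPEECH = "hate_speech"
    HARASSMENT = "harassment"
    MISINFORMATION = "misinformation"
    OTHER = "other"


@dataclass
class ContentReport:
    reporter_id: str
    content_id: str
    category: ReportCategory
    explanation: str                      # the reporter's reasoning
    submitted_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


@dataclass
class ModerationDecision:
    report: ContentReport
    action: str                           # e.g. "removed", "restricted", "no_action"
    statement_of_reasons: str             # explanation sent to the affected user
    appealable: bool = True               # users must have an avenue for appeal


class ModerationQueue:
    """Collects reports and records reasoned decisions for later transparency reporting."""

    def __init__(self) -> None:
        self.pending: list[ContentReport] = []
        self.decisions: list[ModerationDecision] = []

    def submit(self, report: ContentReport) -> None:
        # Every notice is acknowledged and queued; nothing is silently dropped.
        self.pending.append(report)

    def decide(self, report: ContentReport, action: str, reasons: str) -> ModerationDecision:
        # Each decision carries a statement of reasons that can be shown to the user.
        decision = ModerationDecision(report, action, statement_of_reasons=reasons)
        self.pending.remove(report)
        self.decisions.append(decision)
        return decision
```

In a real deployment the queue would feed human reviewers and automated classifiers; the point of the sketch is only that reports, actions, and reasons are recorded together, which is what later makes explanations and appeals possible.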
Data privacy and user consent
The DSA reinforces stringent data protection measures, aligning with the General Data Protection Regulation (GDPR). Gaming companies are obliged to handle user data with heightened responsibility, ensuring that personal information is processed transparently and only with explicit user consent. This includes providing users with clear options regarding data sharing and implementing robust security protocols to protect personal information.
Liability and accountability
Under the DSA, the liability framework for online platforms has been redefined. Gaming companies are now held accountable for the content disseminated on their platforms, necessitating swift action against illegal material upon awareness. Failure to comply can result in substantial penalties, including fines of up to six percent of global revenue. This shift underscores the importance for gaming platforms to establish comprehensive compliance strategies and collaborate closely with legal experts to navigate the evolving regulatory landscape.
Transparency requirements
The DSA introduces rigorous transparency requirements, compelling platforms to disclose their content moderation policies and practices. This includes publishing annual transparency reports detailing the number of content removal actions, the rationale behind these decisions, and the measures taken to inform users. Such transparency fosters trust between users and platforms, ensuring that moderation practices are fair and accountable. Additionally, platforms must provide clear explanations to users when actions are taken against them, offering avenues for appeal and ensuring that users understand the reasons behind moderation decisions. This requirement necessitates the development of user-friendly communication channels and support systems to handle disputes effectively.
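As a simple illustration of what the reporting obligation implies in practice, the short Python sketch below aggregates hypothetical moderation records into the kind of figures an annual transparency report would disclose: removals per category and the grounds cited. The record format is invented for this example and is not a mandated reporting schema.

```python
# Illustrative aggregation of moderation records into transparency-report figures.
# The record format below is invented for this example.
from collections import Counter

moderation_log = [
    {"action": "removed", "category": "hate_speech", "ground": "terms_of_service"},
    {"action": "removed", "category": "harassment", "ground": "illegal_content_notice"},
    {"action": "no_action", "category": "misinformation", "ground": "not_substantiated"},
]

# Number of removals broken down by content category.
removals_by_category = Counter(
    entry["category"] for entry in moderation_log if entry["action"] == "removed"
)
# Grounds cited across all decisions, including those where no action was taken.
grounds_cited = Counter(entry["ground"] for entry in moderation_log)

print("Content removals by category:", dict(removals_by_category))
print("Grounds cited across all decisions:", dict(grounds_cited))
```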
Impact on business models and advertising
The DSA’s restrictions on targeted advertising, particularly concerning minors, have significant implications for companies that rely on ad-based revenue models. Platforms must reassess their advertising strategies to ensure compliance, potentially shifting towards contextual advertising methods that do not exploit personal data. This transition may affect revenue streams but aligns with the broader goal of protecting user privacy. Gaming platforms must clearly disclose how advertisements are targeted, the data sources used, and the logic behind ad delivery. Platforms must offer users an opt-out mechanism from targeted advertising, aligning with broader privacy protections under the GDPR. The DSA bans personalized advertising targeted at minors, requiring gaming companies to adopt age-verification mechanisms to prevent data collection from underage users. Furthermore, the regulation forces a shift toward contextual advertising, where ads are displayed based on game content rather than user profiling. In response to EU regulatory pressure, Google Play revised its advertising policies, introducing child-directed services that limit the use of behavioral tracking for users under 18. This forced game developers to rethink ad placement and rely more on non-personalized ad revenue models.
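A hedged sketch of how such an ad-selection rule might be expressed in code is shown below: behavioral targeting is used only for verified adults who have not opted out, and everyone else receives contextual ads matched to the game's content. The function, the Player fields, and the notion of a "context tag" are assumptions made for the example, not a description of any particular ad platform.

```python
# Illustrative ad-selection rule: personalised targeting only for verified adults
# who have not opted out; minors and unverified players get contextual ads.
# All names here are hypothetical.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Player:
    player_id: str
    age_verified: bool            # outcome of an age-verification check
    is_minor: bool
    opted_out_of_targeting: bool  # opt-out flag aligned with DSA/GDPR protections


def select_ad(player: Player, game_context_tags: list[str],
              behavioural_profile: Optional[dict] = None) -> str:
    """Return which ad pipeline to use for this impression."""
    if player.is_minor or not player.age_verified:
        # Personalised advertising to minors is prohibited; when age is unknown,
        # the safe default is to treat the player as a minor.
        return f"contextual ad matched to {game_context_tags}"
    if player.opted_out_of_targeting or behavioural_profile is None:
        return f"contextual ad matched to {game_context_tags}"
    return "personalised ad based on consented behavioural profile"


# Example: an unverified player in a racing game only ever sees contextual ads.
print(select_ad(Player("p1", age_verified=False, is_minor=False,
                       opted_out_of_targeting=False), ["racing", "esports"]))
```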
Implications of the DSA for gaming companies outside the EU
One of the most significant aspects of the DSA is its extraterritorial application. Gaming companies based outside the EU must comply with the DSA if they offer services to European users or have a significant number of players within the EU market. Non-EU gaming platforms offering services in Europe must appoint a legal representative in the EU and must adhere to the same content moderation, transparency, and data protection requirements.
Data privacy, sharing, and user consent (Data Act & DGA)
The EU’s new digital rulebook also includes the Data Act and the Data Governance Act (DGA), which expand upon the GDPR by regulating how companies access, share, and store data. In the gaming industry, these laws introduce new compliance requirements related to data portability and user control. The Data Act mandates that users have increased control over their in-game data, allowing them to transfer their data across different platforms. Companies must facilitate equitable data sharing mechanisms, ensuring that game developers, publishers, and third parties adhere to strict ethical and legal frameworks when using player data. The DGA establishes rules for data intermediaries, ensuring that game-related personal and non-personal data is stored and processed under EU-approved frameworks. These regulations necessitate adjustments in how gaming companies handle user accounts, cloud saves, in-game purchases, and behavioral analytics.
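As a purely illustrative example of what portability could look like at a technical level, the Python sketch below bundles a player's in-game data into a machine-readable export that could be handed to the user or passed to another service on request. The field names and format are assumptions for the example, not a format prescribed by the Data Act.

```python
# Illustrative data-portability export: a player's in-game data serialised to JSON
# so it can be handed to the user or transferred to another service on request.
# Field names are invented for this example.
import json
from datetime import datetime, timezone

player_record = {
    "player_id": "player-123",
    "cloud_saves": [{"game": "ExampleQuest", "slot": 1, "updated": "2025-01-10"}],
    "purchases": [{"item": "season_pass", "price_eur": 9.99}],
    "behavioural_analytics": {"sessions_last_30_days": 14},
}


def export_player_data(record: dict) -> str:
    """Bundle the player's data with export metadata in a portable format."""
    export = {
        "exported_at": datetime.now(timezone.utc).isoformat(),
        "format_version": "1.0",
        "data": record,
    }
    return json.dumps(export, indent=2)


print(export_player_data(player_record))
```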
Recent cases
Epic Games v. Apple: The legal battle between Epic Games and Apple highlighted issues related to platform control and revenue sharing. Epic challenged Apple’s App Store policies, particularly the 30 percent commission fee and restrictions on alternative payment methods. The case underscored the complexities of digital marketplaces and the need for regulations like the DSA to ensure fair competition and protect consumer rights.
Valve’s Legal Challenges: Valve Corporation, the operator of the Steam platform, faced legal scrutiny in the European Union for practices such as geo-blocking and refund policies. The European Commission fined Valve and other publishers for restricting cross-border sales within the EU, a practice contrary to the Digital Single Market initiative. This case emphasizes the importance of compliance with regional regulations to avoid legal repercussions.
Temu’s Investigation by the European Union: In October 2024, the European Union initiated an investigation into the Chinese e-commerce platform Temu for allegedly failing to prevent the sale of illegal products. The probe focuses on Temu’s compliance with the DSA’s requirements to prevent the sale of non-compliant products and the reappearance of previously suspended traders. This case illustrates the EU’s commitment to enforcing the DSA and the potential consequences for platforms that fail to comply.
Conclusion
The Digital Services Act signifies a transformative shift in the regulation of digital services within the European Union, with profound implications for the gaming industry. By enforcing stringent content moderation, data privacy, and transparency requirements, the DSA aims to create a safer and more accountable online environment. Adapting to the DSA’s requirements presents challenges for gaming companies, especially smaller enterprises with limited resources. Compliance necessitates investment in advanced content moderation technologies, legal counsel, and staff training. However, non-compliance poses greater risks, including hefty fines and reputational damage.
Companies are advised to conduct thorough assessments of their current practices, engage with legal experts, and develop comprehensive compliance strategies to align with the DSA’s mandates. Additionally, the convergence of DSA, GDPR, Data Act, and DGA fundamentally reshapes advertising strategies in online gaming, demanding greater transparency, ethical data use, and compliance with strict privacy laws. While these regulations impose operational challenges, they also present an opportunity for gaming companies to build trust-based advertising models, fostering sustainable and legally compliant monetization. Failure to adhere to these evolving requirements could result in severe financial penalties and market access restrictions, making compliance an urgent priority for gaming companies operating within the EU.