This article has been written by Manvee, a fourth-year law student at Chanakya National Law University, Patna.
Introduction
Microsoft recently faced charges in the USA for illegally collecting children’s data from users signing up for its Xbox gaming system. The resulting settlement of $20 million underscores the pressing need to protect children’s data in the evolving landscape of online gaming. The COVID-19 pandemic saw a rapid surge in the online gaming market, with a significant number of gamers falling within the 13-18 age group. This surge led to a wave of recreational internet use among urban Indian children, raising substantial data privacy concerns.
This trend unveiled a series of issues, ranging from parental unawareness of their children’s online activities to the growing impacts on mental and physical health, including stress, communication hurdles, and even depression.
The Digital Personal Data Protection Act, 2023 (“DPDP Act”) introduces measures aimed at shielding minors from targeted marketing and mandates verifiable parental consent for data processing. While crucial for safeguarding children, these steps can pose challenges for online gaming platforms, potentially impacting their user base and marketing strategies. This article delves into the broader challenges associated with verifiable parental consent and behavioral monitoring under Indian data protection law, suggests measures to tackle these challenges, and examines how other jurisdictions address verifiable parental consent and behavioral monitoring.
Challenges that the DPDP Act Poses to the Online Video Gaming Industry for Processing Children’s Data
- Verifiable Parental Consent for Children
The DPDP Act mandates data fiduciaries to obtain verifiable parental consent[1] before processing the personal data of ‘children’, which it defines as anyone under 18 years old. However, the Act lacks a precise definition of ‘verifiable consent’, creating ambiguity for data fiduciaries regarding the processing of personal data. A precise definition is needed because, for Data Principals, this ambiguity may result in effectively unconditional consent: individuals may not fully understand what they are agreeing to or when to provide data, thereby compromising their privacy. Moreover, without a clear definition of ‘verifiable consent’, it is harder for individuals to know how they can withdraw consent or access records of their past choices. For Data Fiduciaries, the unclear definition makes it difficult to determine which methods will satisfy the ‘verifiable consent’ requirement. This, in turn, can lead to non-compliant data collection practices and an increased risk of regulatory action.
In the context of India’s e-sports and online video game industry, where competitors typically range from 14 to 22 years old, requiring parental consent for those under 18 could discourage gamers from continuing to play esports. The difficulty with ‘verifiable consent’ is that it necessitates identification and verification mechanisms to authenticate parental approval before a child can proceed with a game or e-sports competition. Such stringent requirements would inadvertently prompt esports and gaming companies to implement heightened surveillance practices. These practices undermine data privacy: although ‘verifiable consent’ is intended to protect children, heightened surveillance would collect and store more user data than necessary, potentially exposing it to misuse or breaches. A stringent verification process would also discourage young gamers from participating, as such excessive data collection might be perceived as intrusive, leading them to abandon esports and online gaming altogether. The absence of clear guidelines on ‘verifiable consent’ introduces significant ambiguity for platforms regarding the process of obtaining parental consent for children’s gaming activities. By defining minors as individuals below 18, the Act also overlooks the need for a graded approach, disregarding the distinct needs and capacities of children and teenagers across different age groups. The GDPR, by contrast, follows a graded approach to consent for processing children’s personal data, allowing member states to set the minimum age for digital consent anywhere between 13 and 16 years.
- Prohibition of Behavioural Monitoring for Children
The DPDP Act unequivocally prohibits data fiduciaries from engaging in tracking[2] or behavioral monitoring of personal data belonging to children, a crucial safeguard for their privacy and well-being. However, this directive presents substantial challenges for platforms designed primarily for younger audiences. Esports and online gaming platforms routinely collect metadata, such as time spent, in-game currency usage, and playing session metrics, to optimize their marketing strategies. When this metadata enables user identification, it falls under the purview of the DPDP Act, subjecting its processing to stringent regulations. With behavioral monitoring of children restricted, esports and gaming companies will face severe limitations in reaching their intended audience, affecting both their revenue streams and overall outreach strategies. In India, the substantial esports user base predominantly comprises individuals aged 13-18, making explicit parental consent a challenge. The repetitive nature of seeking consent may also lead to ‘consent fatigue’, where users are continually asked to give consent via pop-ups, privacy notices, and ‘I agree’ clicks, ultimately reducing the user base for free-to-play esports and gaming platforms.
Solutions for Resolving Challenges for Processing Children’s Data in the Online Video Gaming Industry
- Tackling Verifiable Parental Consent
To address the challenge of ‘verifiable parental consent’, online gaming and esports companies can adopt a range of flexible mechanisms within their free-to-play gaming platforms for children. These mechanisms might include ‘credit/debit card verification’, where a nominal refundable amount (say, Rs. 1.5 in the case of India) is charged to verify the parent’s identity and reimbursed within 7-10 business days. Another approach could involve ‘government-issued card verification’, wherein the front and back of a physical government-issued card are cross-checked against national databases to confirm adulthood, followed by ‘face identification verification’ using camera technology on laptops or mobile devices. This facial recognition feature, integrated either through a third-party service or as an in-built platform feature, enables streamlined verification without repeated ID card submissions.
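A minimal illustrative sketch of how a platform might sequence these routes is set out below. It is only a decision-logic outline under assumed inputs; the field names and the two-route structure are hypothetical and do not reference any real payment gateway or identity-verification API.

```python
from dataclasses import dataclass

@dataclass
class VerificationResult:
    card_charge_succeeded: bool   # nominal refundable charge (e.g. Rs. 1.5) cleared
    id_confirms_adult: bool       # government-issued ID cross-checked against records
    face_matches_id: bool         # live camera capture matched to the ID photo

def parental_consent_verified(result: VerificationResult) -> bool:
    """Any single successful route is treated as verifiable parental consent."""
    card_route = result.card_charge_succeeded
    id_route = result.id_confirms_adult and result.face_matches_id
    return card_route or id_route

# Example: the card charge failed, but ID plus face verification succeeded.
outcome = parental_consent_verified(
    VerificationResult(card_charge_succeeded=False,
                       id_confirms_adult=True,
                       face_matches_id=True)
)
print("Consent verified" if outcome else "Keep account restricted")
```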
Another potential solution involves ‘data minimization,’ wherein esports and gaming companies limit data collection from users, ensuring minimal processing while adhering to verifiable parental consent. Anonymizing data wherever feasible should also be a priority for these companies.
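As a simplified sketch of what data minimization could look like in practice, the snippet below drops fields that a casual game does not need and pseudonymizes the player handle. The sign-up fields and the salted-hash approach are assumptions for illustration; salted hashing is pseudonymization rather than full anonymization, which would require further measures.

```python
import hashlib

# Hypothetical raw sign-up payload; only the fields needed for age-gating are kept.
raw_signup = {
    "player_handle": "ace_gamer_07",
    "email": "parent@example.com",
    "date_of_birth": "2010-04-12",
    "device_model": "Pixel 7",
    "gps_location": "25.61, 85.14",   # not needed for a casual game
}

NEEDED_FIELDS = {"player_handle", "date_of_birth"}

def minimise(record: dict, salt: str = "rotate-this-salt") -> dict:
    kept = {k: v for k, v in record.items() if k in NEEDED_FIELDS}
    # Replace the handle with a salted hash so analytics cannot directly re-identify the child.
    kept["player_handle"] = hashlib.sha256(
        (salt + kept["player_handle"]).encode()
    ).hexdigest()[:16]
    return kept

print(minimise(raw_signup))
```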
Guidelines from the government should address the delineation of age categories; not everyone under 18 should be classified as a child. Considering the variance in cognitive abilities and decision-making skills, the definition of ‘minors’ could be adjusted—for instance, lowering the threshold to 16 years in certain contexts. This would offer teenagers more autonomy in their gaming decisions, aligning with their reasonable decision-making abilities.
- Tackling Online Behavioural Monitoring
Esports and online video games often engage children for prolonged periods, and companies use behavioral monitoring of the collected metadata to tailor advertisements accordingly, which can be harmful. Implementing risk-based age-gating could offer a solution: if a game is not suitable for a specific age group, companies could prompt age verification by requiring the user to scan both sides of a government-issued ID card, followed by facial scanning as a secondary check, thus preventing targeted advertisements.
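The sketch below illustrates one way such risk-based escalation might be expressed: stricter checks apply only when the game’s content rating exceeds the declared age. The rating-to-age mapping and the names of the checks are assumptions, not an actual regulatory standard.

```python
# Hypothetical mapping of content ratings to minimum ages.
RATING_MIN_AGE = {"E": 0, "T": 13, "M": 17}

def required_checks(game_rating: str, declared_age: int) -> list[str]:
    min_age = RATING_MIN_AGE.get(game_rating, 18)
    if declared_age >= 18:
        return []                                   # adult: no extra friction
    if declared_age >= min_age:
        return ["disable_targeted_ads"]             # allowed to play, but no ad targeting
    # Declared age below the rating: escalate to document and face checks.
    return ["government_id_scan", "face_match", "block_targeted_ads"]

print(required_checks("M", declared_age=15))
# ['government_id_scan', 'face_match', 'block_targeted_ads']
```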
Esports companies can employ Privacy Enhancing Technologies (PETs) such as encryption and secure multiparty computation within their games. Through PETs, these companies can carry out necessary advertising and behavioral analysis without compromising children’s personal data. Suppose an esports company wants to understand weapon usage trends in order to balance the game. Traditionally, it might collect data on individual players’ weapon choices. With PETs such as encryption, the data is scrambled before analysis. The company can then analyze the encrypted data to see overall trends in weapon usage, such as a specific weapon being overwhelmingly popular, without ever knowing which specific players use it. This allows for targeted adjustments without compromising privacy. Using such aggregate insights, esports companies can deliver relevant ads based on gameplay trends, increasing revenue without compromising privacy.
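A real deployment would rely on homomorphic encryption or secure multiparty computation; the short sketch below only illustrates the underlying principle, namely that the analyst ever sees aggregate weapon counts and never per-player choices. The per-player table and weapon names are invented for illustration.

```python
from collections import Counter

# Hypothetical per-player weapon picks, as they might exist on individual devices
# or inside an encrypted computation; the analyst never sees this table directly.
per_player_picks = {
    "player_a": ["bow", "bow", "sword"],
    "player_b": ["sword", "axe"],
    "player_c": ["bow", "bow", "bow"],
}

def aggregate_only(picks_by_player: dict[str, list[str]]) -> Counter:
    """Return weapon counts with all player identifiers stripped out."""
    totals = Counter()
    for picks in picks_by_player.values():   # identities are discarded here
        totals.update(picks)
    return totals

print(aggregate_only(per_player_picks))
# Counter({'bow': 5, 'sword': 2, 'axe': 1}) -> the bow dominates, so the studio can
# rebalance or plan ads around archery gear without knowing who uses it.
```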
To ensure user privacy, gaming companies should provide accessible controls for users to manage their personal data. This could involve a privacy notice pop-up at the point of data collection, which would also satisfy the opt-in requirement under the DPDP Act, and a user-friendly navigation panel displaying the data collected. Offering these choices empowers users to make informed decisions about their privacy.
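A minimal sketch of such a consent control is shown below: each processing purpose gets its own opt-in record, can be withdrawn as easily as it was granted, and keeps a timestamped history the user can view from a settings panel. The class and field names are assumptions for illustration only.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    purpose: str                      # e.g. "gameplay analytics"
    granted: bool = False
    history: list = field(default_factory=list)

    def _log(self, action: str) -> None:
        self.history.append(f"{datetime.now(timezone.utc).isoformat()} {action}")

    def opt_in(self) -> None:         # triggered from the privacy-notice pop-up
        self.granted = True
        self._log("opt_in")

    def withdraw(self) -> None:       # must be as easy as opting in
        self.granted = False
        self._log("withdraw")

# A user opts in at sign-up, then later withdraws from the settings panel.
analytics = ConsentRecord(purpose="gameplay analytics")
analytics.opt_in()
analytics.withdraw()
print(analytics.granted, analytics.history)
```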
How Are the Issues of Behavioural Monitoring and Verifiable Parental Consent Navigated in Other Jurisdictions?
- United Kingdom
Since the implementation of the Children’s Code in the United Kingdom, companies, including online video gaming platforms, have had to address the issue of verifiable parental consent. While the Code represents a positive step towards online safety for children, the recent industry-wide adoption of default privacy settings for children’s accounts, restricting direct messaging with adults, provides only a partial solution to the problem of verifiable parental consent. Platforms have also implemented age verification methods during game sign-up and disabled accounts where age validation fails. On behavioral monitoring, platforms have disabled notifications during bedtime hours and limited targeting of users under 18. To bolster security, companies have begun employing digital ID solutions and advanced encryption methods for age verification, often relying on third-party age assurance providers. Reliable age verification methods are crucial for proper age-gating, but they must be implemented in a manner that respects user privacy. This means collecting the minimum amount of data necessary, for example through attribute-based systems offered by third-party verification services, which simply confirm age or parental authority without revealing unnecessary personal information such as passport scans or credit card details. This ensures that verification occurs without compromising data privacy and complies with the legal standards set forth by the Code and the Online Safety Act.
- European Union
After the implementation of the General Data Protection Regulation (GDPR), companies across Europe, including online video game companies, took measures to address verifiable parental consent. During account creation, they prioritize age verification methods, given the GDPR’s flexibility on the age of consent. For instance, in March 2023, France’s National Assembly passed legislation requiring social media services to put in place technical solutions to verify the age of their users and to check whether users under the age of 15 have received parental consent. Germany’s age verification framework, by contrast, does not directly address verifiable parental consent; the system prioritizes confirming the user’s age itself. However, its two-step verification process offers potential for adaptation to parental consent purposes: during the initial verification, if a minor is identified, the process could be modified to require parental involvement, such as a video call in which the parent presents ID and confirms consent, or a unique code sent to a pre-registered parent for approval.
Conclusion
The challenges posed by the DPDP Act, 2023 to the online video gaming industry for processing children’s data call for a delicate balance between safeguarding minors and ensuring the industry’s growth. The ‘verifiable parental consent’ and ‘behavioural monitoring’ restrictions mandated by the DPDP Act pose significant hurdles for gaming platforms. The government must explore innovative solutions such as flexible verification mechanisms and age-gating, which would help video gaming companies sustain their position in the market. Drawing insights from implementations in the United Kingdom and the European Union, including age verification during sign-up under the UK Children’s Code and member-state age verification laws within the GDPR framework, Indian authorities can refine regulations to align with the evolving digital landscape while prioritizing the protection of children’s privacy and well-being. Verifiable parental consent itself remains an evolving area, with approaches such as Germany’s two-step verification process, including parental video calls, offering possibilities for adaptation. Striking this balance is crucial for fostering a responsible and legally compliant online gaming ecosystem.
[1] Section 9(1), Digital Personal Data Protection Act, 2023.
[2] Section 9(3), Digital Personal Data Protection Act, 2023.