When we think about children’s data protection, our minds usually jump first to topics like social media. That makes sense – online social networks account for a large portion of kids’ internet usage and therefore pose a proportionally high risk.
But what is often overlooked is the fact that many children are also spending their time playing online video games. A recent report found that 76% of kids younger than 18 in the United States play video games regularly.
This is a problem because, like social media, online gaming platforms collect a large amount of data from their users. This includes personal information like names, addresses, and birthdays, as well as more sensitive data like GPS location and biometric data. And, due to the nature of gaming, this data is often collected without the user’s knowledge or consent.
This raises a number of concerns about children’s privacy and data security, as well as the potential for misuse of this information. In this article, we’ll explore some of the key issues related to children’s online gaming and data protection, as well as what measures can be taken to mitigate these risks.
The Safety and Data Risks Faced by Children in Online Gaming
Children and youth are uniquely vulnerable to the dangers posed by the internet. They are still in the process of developing both physically and mentally, which can make them more susceptible to harm. This is especially true in the case of video games, where a slew of potential risks exist.
Addiction
Children’s data can be used to exploit their vulnerabilities and hook them into playing video games for long periods of time. This can lead to addiction, which in turn can have several negative consequences. These include social isolation, sleep deprivation, and even poor academic performance. In severe cases, it can lead to mental health problems.
Manipulation
One of the biggest dangers children face when gaming online is manipulation. Game developers and companies have a vested interest in keeping players engaged, and they often do this by using personal information to curate highly targeted in-game advertisements and content. This can be extremely persuasive, and children may be nudged into making social connections or purchases that they wouldn’t otherwise make.
Contact Risks
Another potential danger of online gaming is the possibility of contact risks. When players reveal their personal information, such as their email address or home address, they open themselves up to the possibility of being contacted by someone they don’t know. This can be especially dangerous for young children, who may not yet have the ability to distinguish between safe and unsafe people.
Gambling-Like Mechanisms
Many online games make use of gambling-like mechanisms, such as loot boxes, that can entice players to spend more money. These mechanisms are particularly risky for children, who may not have a full understanding of how they work or the potential financial consequences.
International Examples of Legislative Age Assurance Requirements
As experts have sounded the alarm over children’s data security in the scope of online play, governments have responded through the proposal and institution of several regulatory frameworks aimed at addressing the problem. A number of noteworthy pieces of legislation have come into force around the world over the past few years, and while each differs slightly in content, they all have one common goal: doubling down on companies’ responsibility to protect their youngest users.
Here are just a few examples of prominent regulatory frameworks to have been rolled out in major countries and regions:
U.K. Information Commissioner’s Office Age-Appropriate Design Code
The Age-Appropriate Design Code, informally known as the Children’s Code, was prepared by the ICO under the UK’s Data Protection Act 2018 and came into force in September 2020 in an effort to codify the rules and enforcement procedures surrounding online services that process children’s data. It applies to any company that offers online services – such as social media platforms, apps, websites, or gaming services – that are likely to be accessed by children under the age of 18.
The AADC outlines standards on 15 different topics:
● Best interests of the child
● Data protection impact assessments (“DPIA”)
● Age-appropriate application
● Transparency
● Detrimental use of data
● Policies and community standards
● Default settings
● Data minimization
● Data sharing
● Geolocation
● Parental controls
● Profiling
● Nudge techniques
● Connected toys and devices
● Online tools
Each of these covers a unique facet of online service design, but all work together to create a robust sense of protection for minors. Companies are expected to take a risk-based approach to their compliance for each, meaning that the solutions they implement should be appropriate for the risks posed by their products.
While failure to comply with the Age-Appropriate Design Code does not itself make a person or business liable to legal proceedings, it does increase their risk of enforcement action for violation of the UK GDPR and/or PECR.
OECD Recommendation on Children in the Digital Environment
Adopted in 2021, the OECD Recommendation on Children in the Digital Environment is a formal set of guidelines aimed at promoting children’s data safety online. It sits in tandem with the OECD’s Digital Service Provider Guidelines to outline the organization’s position on data governance for digital economy actors.
The Recommendation is unique in that it is non-binding, meaning that countries are not held to its standards in a legal sense. However, it does provide a sort of international benchmark for how different nations might approach regulation in this area.
The main tenet of the OECD’s recommendation is to create online environments in which online providers take the “steps necessary to prevent children from accessing services and content that should not be accessible to them, and that could be detrimental to their health and well-being or undermine any of their rights.”
EU Digital Services Act
The EU Digital Services Act is a newer piece of legislation on which the European Parliament and Council reached political agreement in April 2022. It’s set to be the Union’s main ‘rulebook’ when it comes to protecting citizens’ online privacy both now and in the future as big tech continues to redefine the way we interact with the internet.
Under the DSA, online service providers will be held to higher standards when it comes to the way they process the personal information of both child and adult EU citizens. The Act includes several provisions specifically aimed at protecting minors, including a ban on targeted advertising aimed at children and restrictions on algorithmically promoting content that could harm them, such as content depicting violence or self-harm.
Once formally adopted by the EU co-legislators, the Digital Services Act will apply 15 months after its entry into force or from January 1, 2024, whichever is later. It’s being lauded as a major first step in the effort to protect children’s (and all users’) privacy online and has set the standard for future frameworks of its kind.
UK Online Safety Bill
While still before the UK’s House of Commons, the Online Safety Bill is another potential change to come in the data privacy landscape. It addresses the rights of both adults and children when it comes to their data online, with a special focus on the latter.
If passed, the bill would impose a safety duty upon organizations that process minors’ data to implement proportionate measures to mitigate risks to their online safety. While the legislation has had a few bumps in the road since its original proposal, new UK Prime Minister Liz Truss says she plans to adapt and move forward with it in the coming months.
California Age-Appropriate Design Code Act
California is no stranger to data privacy laws. Home to one of the most comprehensive sets of state privacy regulations in North America, the CCPA, the state has clearly set its priorities on protecting citizens’ rights and personal information online. In our “California Consumer Privacy Act (CCPA) Fines” blog post we discuss which companies the act applies to, the basics of the CCPA, the penalties for violating the law, and the proposed changes that could affect it in the future.
The state’s government has just taken another step in that direction with the Age-Appropriate Design Code Act, which unanimously passed a Senate vote on August 29, 2022.
If enacted by Governor Newsom, it will require businesses to take extra measures to ensure their online platforms are safe for young users. This entails regulating things like the use of algorithms and targeted ads, as well as considering how product design may pose risks to minors.
An August 2022 article on the legislation in The New York Times suggested that, when signed, the CAADCA “could herald a shift in the way lawmakers regulate the tech industry” on a broad level in the United States. It pointed to the fact that both regional and national laws in the country have a proven ability to affect the way tech companies operate across the board, and a change in California could very well mean a change for the rest of the US.
Emergent Solutions
Recent regulatory frameworks in data privacy have marked a massive shift in the way companies are required to handle and protect the personal information of their users, with a specific focus on children. In response, many online platforms and service providers have made changes to their terms of service and product design in order to adhere to these new standards.
Some of the biggest emerging solutions include:
Privacy by Design
Privacy by design is an engineering methodology that refers to the incorporation of data privacy into the design of products, services, and systems. The goal is to ensure that privacy is considered from the very beginning of the development process, rather than being an afterthought.
There are seven principles of privacy by design:
1. Action that is proactive not reactive, preventive not remedial
2. Privacy as a default setting and assumption
3. Privacy embedded into design
4. Full functionality – positive-sum, not zero-sum
5. End-to-end security and full lifecycle protection
6. Visibility and transparency
7. Respect for user privacy
The privacy by design methodology was first introduced in the 1990s by Ontario Information and Privacy Commissioner Ann Cavoukian. It’s considered one of the most important data privacy frameworks in the world, and its principles are being promoted as a basis upon which online video games and other digital platforms can better protect children’s privacy.
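To make the principles concrete, here is a minimal sketch of what “privacy as a default setting” and “data minimization” might look like in a children’s game. All class and field names are hypothetical, invented for illustration – they are not drawn from any real platform or regulation.

```python
from dataclasses import dataclass, field

@dataclass
class PlayerPrivacySettings:
    # Protective defaults: every data-sharing feature starts OFF, so
    # privacy holds even if the user never opens a settings menu.
    share_location: bool = False
    profile_public: bool = False
    allow_targeted_ads: bool = False
    allow_stranger_messages: bool = False

@dataclass
class PlayerProfile:
    display_name: str  # a pseudonym, not a real name
    settings: PlayerPrivacySettings = field(default_factory=PlayerPrivacySettings)

    def collected_fields(self) -> list[str]:
        # Data minimization: only the fields gameplay actually needs
        # exist at all; there is deliberately no slot for an address,
        # birthday, GPS trail, or biometrics.
        return ["display_name"]

profile = PlayerProfile(display_name="StarFox_99")
print(profile.settings.share_location)  # False – protective by default
print(profile.collected_fields())       # ['display_name']
```

The design choice here mirrors principles 2 and 5: a child who never touches the settings still gets the most protective configuration, and data that is never collected cannot later be leaked or misused.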
Risk-Based Treatment
As has been seen in recent years, data protection legislation is moving away from a one-size-fits-all approach and towards a more risk-based treatment of personal information. This refers to the idea that data controllers should consider the risks posed by their processing activities when determining what measures to put in place to protect the rights and freedoms of data subjects. For children, this means taking into account the fact that they are a vulnerable population and tailoring data protection measures accordingly.
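The idea can be sketched as a simple decision function in which the safeguards applied scale with the risk of the processing activity. The risk tiers, data categories, and measure names below are illustrative assumptions, not drawn from any specific regulation.

```python
def required_safeguards(is_child_user: bool, data_categories: set[str]) -> list[str]:
    """Hypothetical mapping from processing risk to protective measures."""
    sensitive = {"geolocation", "biometrics", "contact_details"}
    measures = ["transparency_notice"]  # baseline owed for all processing

    # Higher-risk data categories trigger stronger safeguards.
    if data_categories & sensitive:
        measures += ["dpia", "encryption_at_rest"]

    # Children are treated as a vulnerable population: stricter defaults
    # and parental involvement apply regardless of the data involved.
    if is_child_user:
        measures += ["high_privacy_defaults", "parental_controls"]
        if data_categories & sensitive:
            measures.append("disable_by_default")

    return measures

print(required_safeguards(True, {"geolocation"}))
```

Processing a child’s geolocation lands in the highest tier (impact assessment, encryption, protective defaults, parental controls, and the feature disabled by default), while low-risk processing of an adult’s data requires only the baseline transparency notice.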
Responsible Governance
Responsible governance refers to the ethical and transparent management of data by organizations. It’s based on the principle that data should only be collected, used, and shared in a way that is transparent to the individual and serves their best interests.
There are four main pillars of responsible governance:
● Transparency: individuals should be aware of how their data is being used and why
● Choice: individuals should have the ability to choose whether or not to share their data
● Responsibility: organizations should be held accountable for their use of data
● Security: data should be protected against unauthorized access, use, or disclosure
The concept of responsible governance is gaining traction as a way to protect children’s privacy online. It’s being promoted as a means of ensuring that data collected from children is only used in ways that are beneficial to them, and not for commercial or other ulterior purposes.
Parental Controls
In the face of ever-growing concerns about children’s privacy online, many parents are taking matters into their own hands by implementing parental controls on their devices and home internet networks. There are several different ways to go about this, but some of the most popular methods include setting up child-friendly browsers and content filters, as well as using apps that track screen time and limit app usage. While parental controls are not a perfect solution, they can be a helpful way to give parents some peace of mind when it comes to their kids’ online activity.
Video games can help children develop their creativity, social skills, and knowledge. However, as digital technologies become more sophisticated and firmly entrenched in our daily lives, it is increasingly important that we begin to structure them in a way that considers and respects children’s privacy rights. By understanding the trends in data protection, and by implementing responsible governance practices, we can help create a safer and more secure online environment for children to play and learn in.