Data Protection Trends in Children’s Online Gaming

When we think about children’s data protection, the first topic our minds usually jump to is social media. And that makes sense – online social networks account for a large portion of kids’ internet usage and therefore pose a proportionally high risk.

But what is often overlooked is the fact that many children are also spending their time playing online video games. A recent report found that 76% of kids younger than 18 in the United States play video games regularly.

This is a problem because, like social media, online gaming platforms collect a large amount of data from their users. This includes personal information like names, addresses, and birthdays, as well as more sensitive information like GPS location and biometric data. And, due to the nature of gaming, this data is often collected without the user’s knowledge or consent.

This raises a number of concerns about children’s privacy and data security, as well as the potential for misuse of this information. In this article, we’ll explore some of the key issues related to children’s online gaming and data protection, as well as what measures can be taken to mitigate these risks.

The Safety and Data Risks Faced by Children in Online Gaming

Children and youth are uniquely vulnerable to the dangers posed by the internet. They are still in the process of developing both physically and mentally, which can make them more susceptible to harm. This is especially true in the case of video games, where a slew of potential risks exist.

Addiction

Children’s data can be used to exploit their vulnerabilities and hook them into playing video games for long periods of time. This can lead to addiction, which in turn can have several negative consequences. These include social isolation, sleep deprivation, and even poor academic performance. In severe cases, it can lead to mental health problems.

Manipulation

One of the biggest dangers children face when gaming online is manipulation. Game developers and companies have a vested interest in keeping players engaged, and they often do this by using personal information to curate highly targeted in-game advertisements and content. This content can be extremely persuasive, and children may be pressured into making social connections or purchases that they wouldn’t otherwise make.

Contact Risks

Another potential danger of online gaming is the possibility of contact risks. When players reveal their personal information, such as their email address or home address, they open themselves up to the possibility of being contacted by someone they don’t know. This can be especially dangerous for young children, who may not yet have the ability to distinguish between safe and unsafe people.

Gambling-Like Mechanisms

Many online games make use of gambling-like mechanisms, such as loot boxes, that can entice players to spend more money. These mechanisms are particularly risky for children, who may not have a full understanding of how they work or the potential financial consequences.

International Examples of Legislative Age Assurance Requirements

As experts have sounded the alarm over children’s data security in the scope of online play, governments have responded through the proposal and institution of several regulatory frameworks aimed at addressing the problem. A number of noteworthy pieces of legislation have come into force around the world over the past few years, and while each differs slightly in content, they all have one common goal: doubling down on companies’ responsibility to protect their youngest users.

Here are just a few examples of prominent regulatory frameworks to have been rolled out in major countries and regions:

U.K. Information Commissioner’s Office Age-Appropriate Design Code

The Age-Appropriate Design Code, informally known as the Children’s Code, was issued by the UK Information Commissioner’s Office (ICO) and came into force in September 2020 in an effort to codify the rules and enforcement procedures surrounding online services that process children’s data. It applies to any company that offers online services – such as social media platforms, apps, websites, or gaming services – that are likely to be accessed by children under the age of 18.

The AADC outlines standards on 15 different topics:

●          Best interests of the child

●          Data protection impact assessments (“DPIA”)

●          Age-appropriate application

●          Transparency

●          Detrimental use of data

●          Policies and community standards

●          Default settings

●          Data minimization

●          Data sharing

●          Geolocation

●          Parental controls

●          Profiling

●          Nudge techniques

●          Connected toys and devices

●          Online tools

Each of these covers a unique facet of online service design, but all work together to create a robust framework of protection for minors. Companies are expected to take a risk-based approach to compliance with each, meaning that the solutions they implement should be appropriate for the risks posed by their products.

While failure to comply with the Age-Appropriate Design Code does not in itself make a person or business liable to legal proceedings, it does increase their risk of enforcement action under the UK GDPR and/or PECR.

OECD Recommendation on Children in the Digital Environment

Adopted in 2021, the OECD Recommendation on Children in the Digital Environment is a formal set of guidelines aimed at promoting children’s data safety online. It works in tandem with the OECD’s Guidelines for Digital Service Providers to outline the organization’s position on data governance for digital economy actors.

The Recommendation is non-binding, meaning that countries are not held to its standards in a legal sense. However, it provides an international benchmark for how different nations might approach regulation in this area.

The main tenet of the OECD’s Recommendation is to create digital environments in which providers take the “steps necessary to prevent children from accessing services and content that should not be accessible to them, and that could be detrimental to their health and well-being or undermine any of their rights.”

EU Digital Services Act

The EU Digital Services Act is a newer piece of legislation that was agreed to by EU lawmakers in April 2022. It’s set to be the Union’s main ‘rulebook’ for protecting citizens’ online safety and privacy both now and in the future as big tech continues to redefine the way we interact with the internet.

Under the DSA, online service providers will be held to higher standards in the way they process the personal information of both child and adult EU citizens. The Act includes several provisions specifically aimed at protecting minors, including a ban on targeted advertising aimed at children and obligations to limit the algorithmic promotion of content that could harm them, such as content promoting violence or self-harm.

Once formally adopted by the EU co-legislators, the Digital Services Act will apply 15 months after its entry into force or from January 1, 2024, whichever is later. It’s being lauded as a major first step in the effort to protect children’s (and all users’) privacy online and has set the standard for future frameworks of its kind.

UK Online Safety Bill

While still before the UK’s House of Commons, the Online Safety Bill is another potential change to come in the data privacy landscape. It addresses the rights of both adults and children when it comes to their data online, with a special focus on the latter.

If passed, the bill would impose a safety duty upon organizations that process minors’ data to implement proportionate measures to mitigate risks to their online safety. While the legislation has had a few bumps in the road since its original proposal, new UK Prime Minister Liz Truss says she plans to adapt and move forward with it in the coming months.

California Age-Appropriate Design Code Act

California is no stranger to data privacy laws. Home to one of the most comprehensive sets of state privacy regulations in North America, the California Consumer Privacy Act (CCPA), the state has clearly set its priorities on protecting residents’ rights and personal information online. In our “California Consumer Privacy Act (CCPA) Fines” blog post we discuss which companies the act applies to, the basics of the CCPA, the penalties for violating the law, and the proposed changes that could affect the law in the future.

The state’s government has just taken another step in that direction with the Age-Appropriate Design Code Act, which unanimously passed a Senate vote on August 29, 2022.

If signed into law by Governor Newsom, it will require businesses to take extra measures to ensure their online platforms are safe for young users. This entails regulating things like the use of algorithms and targeted ads, as well as considering how product design may pose risks to minors.

An August 2022 article on the legislation in The New York Times suggested that, once signed, the CAADCA “could herald a shift in the way lawmakers regulate the tech industry” on a broad level in the United States. It pointed to the fact that both regional and national laws in the country have a proven ability to affect the way tech companies operate across the board, and a change in California could very well mean a change for the rest of the US.

Emergent Solutions

Recent regulatory frameworks in data privacy have marked a massive shift in the way companies are required to handle and protect the personal information of their users, with a specific focus on children. In response, many online platforms and service providers have made changes to their terms of service and product design in order to adhere to these new standards.

Some of the biggest emerging solutions include:

Privacy by Design

Privacy by design is an engineering methodology that builds data privacy into the design of products, services, and systems. The goal is to ensure that privacy is considered from the very beginning of the development process, rather than being an afterthought.

There are seven principles of privacy by design:

1.         Proactive, not reactive; preventative, not remedial

2.         Privacy as the default setting

3.         Privacy embedded into design

4.         Full functionality – positive-sum, not zero-sum

5.         End-to-end security and full lifecycle protection

6.         Visibility and transparency

7.         Respect for user privacy

The privacy by design methodology was first introduced in the 1990s by Ann Cavoukian, then Ontario’s Information and Privacy Commissioner. It’s considered one of the most important data privacy frameworks in the world, and its principles are being promoted as a basis upon which online video games and other digital platforms can better protect children’s privacy.
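To make the first two principles concrete, here’s a minimal sketch of what privacy as the default setting and data minimization might look like for a hypothetical children’s game platform. The class, field, and function names are illustrative only and aren’t drawn from any real product:

```python
from dataclasses import dataclass

@dataclass
class PlayerPrivacySettings:
    """Privacy as the default setting: every option starts at its most protective value."""
    share_geolocation: bool = False         # location sharing stays off until a deliberate opt-in
    public_profile: bool = False            # profiles are private by default
    personalized_ads: bool = False          # no ad targeting without explicit consent
    share_data_with_partners: bool = False  # no third-party data sharing by default

def create_child_account(display_name: str) -> dict:
    """Data minimization: collect only what the game needs to run."""
    return {
        "display_name": display_name,        # no real name, address, or birthday requested
        "privacy": PlayerPrivacySettings(),  # protective defaults applied automatically
    }

if __name__ == "__main__":
    account = create_child_account("SpaceFox42")
    print(account["privacy"])  # every sharing flag is False until the user (or a guardian) changes it
```

The point of a design like this is that the most protective configuration is what a child gets without doing anything; sharing anything more requires a deliberate, informed change.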

Risk-Based Treatment

As has been seen in recent years, data protection legislation is moving away from a one-size-fits-all approach and towards a more risk-based treatment of personal information. This refers to the idea that data controllers should consider the risks posed by their processing activities when determining what measures to put in place to protect the rights and freedoms of data subjects. For children, this means taking into account the fact that they are a vulnerable population and tailoring data protection measures accordingly.

Responsible Governance

Responsible governance refers to the ethical and transparent management of data by organizations. It’s based on the principle that data should only be collected, used, and shared in a way that is transparent to the individual and serves their best interests.

There are four main pillars of responsible governance:

Transparency: individuals should be aware of how their data is being used and why

Choice: individuals should have the ability to choose whether or not to share their data

Responsibility: organizations should be held accountable for their use of data

Security: data should be protected against unauthorized access, use, or disclosure

The concept of responsible governance is gaining traction as a way to protect children’s privacy online. It’s being promoted as a means of ensuring that data collected from children is only used in ways that are beneficial to them, and not for commercial or other ulterior purposes.

Parental Controls

In the face of ever-growing concerns about children’s privacy online, many parents are taking matters into their own hands by implementing parental controls on their devices and home internet networks. There are several different ways to go about this, but some of the most popular methods include setting up child-friendly browsers and content filters, as well as using apps that track screen time and limit app usage. While parental controls are not a perfect solution, they can be a helpful way to give parents some peace of mind when it comes to their kids’ online activity.

Video games can help children develop their creativity, social skills, and knowledge. However, as digital technologies become more sophisticated and firmly entrenched in our daily lives, it is increasingly important that we begin to structure them in a way that considers and respects children’s privacy rights. By understanding the trends in data protection, and by implementing responsible governance practices, we can help create a safer and more secure online environment for children to play and learn in.

California Consumer Privacy Act (CCPA) Fines

Any company, organization, or marketer that does business online knows (or should know) about the California Consumer Privacy Act (CCPA). But with all the talk about the law, it can be hard to understand what it actually is and how it affects businesses. In this article, we’ll take a look at the basics of the CCPA, the penalties for violating the law, and the proposed changes that could affect the law in the future.

What Is the California Consumer Privacy Act?

The California Consumer Privacy Act (CCPA) is a set of regulations established by the California state government and imposed upon businesses that collect consumers’ personal data. It is among the strongest and most stringent privacy laws in the United States and has a far-reaching impact in terms of both the businesses to which it applies and the rights it affords consumers.

The CCPA was passed in response to the numerous high-profile data breaches that have occurred in recent years, as well as the growing concern over the use of personal data by businesses for marketing and other purposes. The law is designed to give consumers more control over their personal data, and to hold businesses accountable for the way they collect, use, and protect that data.

The Provisions of the California Consumer Privacy Act

The California Consumer Privacy Act covers four principal provisions: the right to know, the right to opt out, the right to delete, and the right to equal service. We’ll briefly explain each below.

1. The Right to Know

Under the CCPA, consumers have the right to know the personal information businesses collect and how they use it. They’re entitled to the direct disclosure of what categories of data this information falls under and are also given the ability to request further, more specific details about its use as needed. This includes inquiries about what personal information a business has sold, what types of third parties it has sold the information to, and where it got that data in the first place.

(Cal. Civ. Code § 1798.100, § 1798.110, § 1798.115)

2. The Right to Opt-Out

The California Consumer Privacy Act mandates that businesses provide individuals with an easy and direct way to opt out of the sale of their personal information. The most common way this is done is through a “Do Not Sell My Personal Information” link on a website homepage or a cookie preference banner with a similar toggle.

It’s also worth noting that businesses may not sell the personal information of anyone they have actual knowledge is under 16 years old without opt-in consent: consumers between 13 and 16 may opt in themselves, while children under 13 require the consent of a parent or guardian.

(Cal. Civ. Code § 1798.120)
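As a rough illustration of how this two-tier consent rule works, here’s a minimal sketch in Python. The function and its parameters are hypothetical, not part of the CCPA or any real compliance library:

```python
def may_sell_personal_info(age: int, teen_opted_in: bool = False,
                           guardian_opted_in: bool = False) -> bool:
    """Models the under-16 opt-in rule described above (illustrative only)."""
    if age >= 16:
        return True             # 16 and over: sale allowed unless the consumer opts out (not modeled here)
    if age >= 13:
        return teen_opted_in    # 13 to 15: the consumer must opt in themselves
    return guardian_opted_in    # under 13: a parent or guardian must opt in

print(may_sell_personal_info(14))                          # False, no opt-in given
print(may_sell_personal_info(14, teen_opted_in=True))      # True
print(may_sell_personal_info(10, guardian_opted_in=True))  # True
```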

3. The Right to Delete

Individuals protected by the California Consumer Privacy Act have the right to request the deletion of their personal information from the businesses that collect it. Businesses that receive these requests are obliged to fulfill them unless the information is necessary for things like completing a related transaction or contract.

(Cal. Civ. Code § 1798.105)

4. The Right to Receive Equal Service

The CCPA is very clear in its intolerance of discrimination against consumers who exercise their rights. The law directly prohibits businesses and entities from treating individuals unfairly because they’ve requested to know what personal information is being collected about them, or because they’ve opted out of the sale of their information. Prohibited practices include refusing service, providing a lower quality of service, and charging different prices or rates for services.

(Cal. Civ. Code § 1798.125)

Defining ‘Personal Information’

Understanding the CCPA’s definition of ‘personal information’ is key to grasping the full scope of the law and how it applies.

As directly written, it considers ‘personal information’ to be any “information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household.” (Cal. Civ. Code § 1798.140(o)(1)).

Examples of what type of data this can cover include:

●         Social Security Numbers

●         Purchase histories

●         Driver’s license numbers

●         Internet Protocol addresses

The information listed above falls into the personally identifiable information (PII) category. To learn more about PII and how legislation is trying to protect it, view our previous posts: “PIPL: What You Need to Know About Changing Cybersecurity in China”, and “A Guide to the GDPR, Europe’s Stringent Data Protection Law”. Protecting PII is our focus here at TeraDact.

It’s worth noting that some types of information are excluded from the CCPA’s rules even though they might otherwise relate to an individual. Publicly available information, for example – like someone’s name printed in a newspaper – is not included. Nor is de-identified or aggregate data, both of which are defined and further explained in the CCPA itself.

Who Does the California Consumer Privacy Act Apply To?

So, who’s subject to all of these rules and provisions? The CCPA was specifically designed to target businesses, but it can apply to any organization or person that does business in California and meets at least one of the following criteria.

Annual Revenues Of $25 Million Or Higher

This part is pretty self-explanatory. Businesses making more than $25 million in annual revenue are generally required to comply with the law.

Commercially Buying, Sharing, Receiving, Or Selling the Data of Over 50,000 Consumers Annually

Another clear-cut rule. If your business buys, receives, sells, or shares the personal information of 50,000 or more Californian consumers, households, or devices annually, you’ll have to comply with the law.

It’s important to note that this rule applies even if you don’t share or sell the information you collect – simply receiving it for commercial purposes can put you over the threshold.

Deriving Over 50 Percent of Annual Revenues from The Sale of Personal Information

This is another fairly straightforward rule, but one that’s worth unpacking a bit. The ‘sale’ of personal information under the CCPA is defined broadly: it covers selling, renting, releasing, disclosing, or otherwise making personal information available to another business or third party for monetary or other valuable consideration.

So, if more than 50 percent of your business’s annual revenue comes from activities like this, you’ll be required to comply with the law.
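Taken together, the three thresholds work as an “any one of these” test. The sketch below shows that logic in Python; the function and its inputs are illustrative, not an official compliance tool:

```python
def ccpa_applies(annual_revenue_usd: float,
                 consumer_records_handled_per_year: int,
                 revenue_share_from_selling_data: float) -> bool:
    """Returns True if at least one of the three CCPA applicability thresholds is met."""
    return (
        annual_revenue_usd >= 25_000_000                  # $25 million or more in annual revenue
        or consumer_records_handled_per_year >= 50_000    # data of 50,000+ consumers, households, or devices
        or revenue_share_from_selling_data > 0.50         # more than half of revenue from selling personal data
    )

# Example: a small studio with modest revenue but a large player database
print(ccpa_applies(3_000_000, 120_000, 0.0))  # True, the 50,000-record threshold alone triggers the law
```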

What Are the Penalties for Non-Compliance with the California Consumer Privacy Act?

Violations of the California Consumer Privacy Act don’t go unpunished; the law outlines several penalties for non-compliance with its regulations. And because it applies to businesses, service providers, and individuals, there’s a range of potential punishments that could be levied.

Civil Penalties

The most common penalties for violating the CCPA are civil penalties. Civil penalties are a type of financial remedy government entities impose for wrongdoing. In the case of the CCPA, civil penalties are assessed and enforced by the state attorney general’s office, which has the authority to investigate potential violations and file lawsuits on behalf of Californian consumers.

The California Attorney General can pursue penalties from organizations that violate any part of the California Consumer Privacy Act.

Just some examples of what these violations can look like include:

●         Failing to respond to consumers’ requests for the deletion of their personal information

●         Failing to have or uphold CCPA-compliant privacy policies

●         Selling consumers’ personal data without offering them a means to opt out

●         Discriminating against individuals who exercise their rights under the CCPA

●         Failing to give adequate notice of the collection of personal information

Service providers who retain, use, or disclose personal data for purposes outside of their contracts with businesses may also be liable for penalty under the CCPA.

Individuals can expose themselves to penalties as well, by unlawfully breaching rules on the onward transfer of personal data.

The costs of violating the CCPA are severe, with maximum fines of up to $2,500 per violation or $7,500 per intentional violation. And because the law applies to each consumer whose data is mishandled, a single incident could result in multiple penalties.
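To see how quickly per-violation penalties can compound under that reading, here’s a back-of-the-envelope calculation. It’s an upper-bound illustration only, since how violations are counted is ultimately decided case by case:

```python
def max_civil_penalty(affected_consumers: int, intentional: bool = False) -> int:
    """Worst-case exposure if each affected consumer counts as a separate violation."""
    per_violation = 7_500 if intentional else 2_500
    return affected_consumers * per_violation

# Example: a single incident affecting 1,000 consumers
print(max_civil_penalty(1_000))                    # 2,500,000 (unintentional)
print(max_civil_penalty(1_000, intentional=True))  # 7,500,000 (intentional)
```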

Waiting Period

It’s important to note that businesses that violate the California Consumer Privacy Act are given a waiting period before they can be fined. The law stipulates that businesses must receive 30 days’ notice and an opportunity to correct any violations before they can be subject to penalties.

If the business can cure the noticed violation(s) and provide an express written statement indicating so and that no further violations shall occur, then no action may be brought.

Enforcement by the California Attorney General

As noted above, the CCPA gives the state attorney general’s office broad enforcement powers, including the authority to investigate potential violations and bring lawsuits on behalf of Californian consumers.

In addition to seeking civil penalties, the attorney general can also seek injunctions or temporary restraining orders to stop businesses from violating the law.

Private Right of Action

In addition to the civil penalty route, the CCPA also gives consumers the right to take legal action on their own behalf in the case of a violation. Private action is a term that refers to the ability of an individual to bring a lawsuit against another party without the involvement of the government.

The CCPA gives Californian consumers the right to sue businesses, service providers, or any person acting on behalf of a business or service provider for data breaches that result from the unauthorized access, theft, or disclosure of their personal information.

Consumers can sue for statutory damages even if they haven’t suffered any financial loss because of the breach, and courts may weigh factors such as the nature and seriousness of the misconduct and whether it was willful when setting the amount awarded.

The financial repercussions of these cases are somewhat less severe, with statutory damages of $100 to $750 per consumer per incident. Consumers who can prove a financial loss resulting from the breach may instead recover their actual damages, whichever amount is greater.

(Cal. Civ. Code § 1798.150)

Unlike the attorney general’s civil penalty route, however, private actions seeking statutory damages do require consumers to give the business 30 days’ written notice of the alleged violations and an opportunity to cure before filing suit; no such notice is required when seeking only actual damages.

Proposed Amendments to the CCPA

Like any major piece of legislation, the California Consumer Privacy Act is poised to change with time. This is especially true given the law’s subject matter; because technology is always changing, the ways in which personal data is collected and used will likely continue to evolve.

Considering this, lawmakers have already proposed several amendments to the CCPA. These amendments range from technical corrections to substantive changes that would modify the scope or enforcement of the law.

Some potential prominent amendments to come include:

A Shift Away from Dark Patterns

Dark Patterns are a type of user interface design meant to trick people into doing things they might not want to do, such as signing up for a service they don’t need or providing personal information they might not want to share.

One recently proposed amendment to the CCPA would make it illegal for businesses to use dark patterns when collecting personal information from consumers. This would help to ensure that consumers are only providing their personal data willingly and with full knowledge of how it will be used.

The Right to Correct Personal Information

Newly proposed amendments suggest adding a ‘right to correct’ inaccurate personal information to the CCPA. This new section would give consumers the right to correct any inaccurate personal data businesses collect, as well as outline documentation requirements, methods for correction, disclosure requirements for denial, and alternative solutions.

While relatively new to the CCPA, this concept has existed for some time at the international level and is already familiar to many businesses that are subject to the GDPR. For local California businesses, though, this proposed amendment would simply be another obligation to add to their CCPA compliance checklist.

Privacy Policy Requirements

In addition to the information already required to be disclosed in a privacy policy under the CCPA, proposed amendments would add several new specific elements that businesses would need to include.

These are:

●         The date the privacy policy was last updated

●         The length of time the business plans to retain each category of personal information or, if that’s not possible, the criteria it uses to determine how long it will be retained

●         Disclosure of whether the business allows third parties to control their collection of personal data, and if so, the names and business practices of these parties

●         A description of consumers’ new rights as described in the amendment

●         Clear directions for how consumers can exercise their newly amended rights

●         A description of how the business will process opt-out requests

Organizations that process the personal data of 10 million consumers or more will also be required to include a link to certain reporting requirements in their privacy policy under this new amendment.

The CCPA’s reach and impact on businesses are significant; there’s no doubt about that. The law gives Californian consumers a number of rights with respect to their personal data, and businesses that mishandle that data can be subject to severe penalties. By educating yourself on the law and taking steps to ensure that your business complies, you can avoid potential problems down the road.