The United States wants big tech companies like Apple to protect children online by adding age verification safeguards to the App Store. It's a political push that completely ignores what protections Apple already provides to parents and children.

Lawmakers have been particularly keen to protect children from online dangers, and have repeatedly demanded big tech companies like Apple and Google do more to help. In the latest attempt to make big tech bend to its demands, the U.S. government is going after the App Store.

The App Store Accountability Act (ASAA) was introduced in May as a way for parents to get more tools to protect their children online. In late November, the ASAA was brought up in Congress as part of a raft of measures to keep kids safe online, led by the Kids Online Safety Act (KOSA).

It's also due to be discussed as part of a House Energy and Commerce Committee hearing. A report on Monday by The Verge says that the discussion will look at a nine-bill package of measures.

Under the ASAA, introduced by Sen. Mike Lee (R-UT) and Rep. John James (R-MI), app storefronts like Apple's App Store and Google Play would be required to verify the age of all users in a privacy-protecting way. The result could then be used to limit which apps are accessible to users deemed too young.

Accounts used by minors would have to be linked to a parental account, which would need to provide consent for downloading apps or making purchases. App stores would also have to meet standards such as secure age verification and accurate app age ratings.
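To make those requirements concrete, here is a minimal Swift sketch of the decision flow the bill describes. It is illustrative only: none of the names or types come from the bill's text or from any real storefront API.

```swift
import Foundation

// Hypothetical model of the checks the ASAA describes; no type here
// comes from the bill or a real SDK.

enum AgeBracket {
    case under13, teen13to15, teen16to17, adult
}

struct StoreAccount {
    let id: UUID
    let verifiedBracket: AgeBracket
    // The bill requires a minor's account to be linked to a parent account.
    var linkedParentID: UUID?
}

enum DownloadDecision {
    case allowed
    case needsParentalConsent(parentID: UUID)
    case blocked(reason: String)
}

// Decide what happens when an account tries to download an app
// carrying a given minimum-age rating.
func evaluateDownload(account: StoreAccount, appMinimumAge: Int) -> DownloadDecision {
    let isMinor = account.verifiedBracket != .adult

    // A minor's account with no linked parent can't proceed at all.
    if isMinor && account.linkedParentID == nil {
        return .blocked(reason: "Minor account has no linked parent")
    }

    // Age-gate the app itself against the verified bracket.
    if isMinor && appMinimumAge >= 18 {
        return .blocked(reason: "App is rated adult-only")
    }

    // Any remaining download or purchase by a minor needs parental consent.
    if isMinor, let parentID = account.linkedParentID {
        return .needsParentalConsent(parentID: parentID)
    }

    return .allowed
}
```

The hard part, of course, is the "privacy-protecting" verification step itself, which this sketch simply assumes has already happened.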

While some measures are already in place, such as California's age check law, the intention of the bills is to standardize the rules across the United States. Instead of companies dealing with a patchwork of laws in different states, there would be one overriding set of rules offering broadly the same protections nationwide.

It already exists

As with many attempts to legislate technology, there is a gap between intention and reality. Had lawmakers looked closer, they would have seen that Apple already has something in place that does much of what they are asking for.

Apple's Family Sharing system allows for a parent or guardian to create an Apple Account for a child under 13 years old. Child accounts can be managed by the parent account in various ways, including Screen Time limits and "Content & Privacy Restrictions" affecting what they can see.

These restrictions can limit how much mature or explicit content a child sees across apps and services, including podcasts, music videos, and even Apple Books. In the App Store specifically, parents can require the child account to ask permission before downloading an app or making a purchase.

After releasing a whitepaper on child safety in February, Apple said in July that it would change the experience for child accounts in iOS 26, including a simplified setup process. The child's age would also be shared with app developers as an age range, so content can be further tailored to them.

Apple also said it would expand its App Store age ratings to five categories, adding new tiers for 13+, 16+, and 18+.
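To illustrate how a developer might use that, here is a short Swift sketch mapping a shared age range onto the expanded rating tiers. It assumes only the general shape Apple has described; the type and function names are invented for illustration and are not Apple's actual Declared Age Range API.

```swift
// Illustrative only: Apple shares an age *range*, not a birthdate.
// These names are invented; they are not Apple's real API.

/// The five rating tiers Apple said the App Store would move to.
enum RatingTier: Int {
    case fourPlus = 4, ninePlus = 9, thirteenPlus = 13, sixteenPlus = 16, eighteenPlus = 18
}

/// A coarse age range, as a developer might receive it.
struct SharedAgeRange {
    let lowerBound: Int
    let upperBound: Int?   // nil models an open-ended "18+" range
}

/// The most mature tier of content that is safe to show.
func maxTier(for range: SharedAgeRange) -> RatingTier {
    // Gate on the lower bound, since the user could be that young.
    switch range.lowerBound {
    case ..<9:  return .fourPlus
    case ..<13: return .ninePlus
    case ..<16: return .thirteenPlus
    case ..<18: return .sixteenPlus
    default:    return .eighteenPlus
    }
}

// A declared "13 to 15" range limits the session to 13+ content.
let teen = SharedAgeRange(lowerBound: 13, upperBound: 15)
print(maxTier(for: teen))  // thirteenPlus
```

The design point worth noting is that the app never learns the child's exact age, only enough to pick a content tier.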

A lot of this covers the main thrust of the bill, though age verification is the difficult part. Unlike an adult, a child is unlikely to have much in the way of computer-readable proof of their age.


The App Store already has some robust protections for younger users.

However, as was shown when the Utah version of the bill came into force, Apple already uses a credit card request to the connected adult account as part of creating a child account.

It's not just Apple that has this in place. Google has its own parental management system, which will face the same scrutiny as the bill is discussed.

A shift in blame

One element of the bill that won't sit well with Apple is that it effectively shifts the blame for any lapses to it and Google. This is ridiculous.

Currently, the onus is on the apps and services that create and host content to protect children from it, which, frankly, is how it should be. By requiring Apple and Google to make the checks in their respective storefronts, those services have less reason to worry about being attacked by parent groups and critics.

It therefore won't be Facebook's fault when its moderation system lets explicit material slip through, as it constantly does now, nor a web browser's fault for providing access to an adult website.

Blame would instead fall on Apple or Google rather than on whoever is actually responsible, because their storefronts performed the first set of age checks.


If implemented, the App Store would get more blame for age-related content failures.

At the time of its introduction, Congressman James said the ASAA "holds Big Tech companies to the same standard as local corner stores."

By that standard, the CEO of McDonald's would be directly responsible for a broken ice cream machine at a branch. The management of 7-Eleven would be culpable if a customer bought a Snickers bar from one store, took it home, and fed it to someone allergic to peanuts.

By that logic, Apple would be to blame for the failings of app developers and services, despite abiding by the law.

Social networks are already very well aware of how beneficial this would be to them.

In a letter to James, Pinterest CEO Bill Ready said the social network endorsed the bill. Calling the need for a federal version "urgent," Ready wrote that it would "reduce fragmentation while giving families one simple place to approve the apps their teens download."

Such laws would shift blame away from Pinterest and similar services for not policing content well enough, while giving them no incentive to improve that policing in the first place.

Whose age is it anyway?

Performing age checks at the App Store level is also somewhat limited by the reality of device usage. Certainly, it can help protect children at the point of downloads and purchases, but there's no guarantee that the age of the person using the device matches the account.

This is less of a problem for locked-down child accounts than for accounts registered to adults. There have been many occasions where a parent opens an app and turns a personal iPad into a babysitter, leaving it signed in to their own account instead of going through the process of making a child account.

The bill would certainly protect the child account, but it would have no control over a child being allowed to look at content on an adult's account.

Ideally, this is a situation that would be helped by services performing their own age checks, independent of the App Store.

Hard problem, poorly considered fix

Bills like the ASAA have good intentions, but are frequently incompatible with real life. Anything that lawmakers put in place to impede a child's access to explicit material can — and does — make normal internet users unhappy.

This is partly because the laws add obstacles between a user and content they can legitimately view, and partly, as is typical of our elected officials, because no real thought is given to the unintended consequences.

As an example, take the UK's effort, the Online Safety Act. That law requires online services and websites above a certain size to use age verification tools to limit access to potentially harmful content, or face fines and legal action.


Apple released a whitepaper in February 2025 about future age checks. - Image Credit: Apple

Since it came into force in the summer, the law has led many services considered relatively pedestrian to add age checks. In the case of Imgur, the image platform blocked access for UK users entirely, including content from the service embedded on other websites.

The implementation was also astoundingly poor, as sites required either a credit card, a verifiable form of ID, or age estimation using a webcam. The latter was easily and hilariously defeated by pointing the camera at a computer-generated face, such as one from Death Stranding's Photo Mode.

There was also a spike in the number of people using VPNs to get around the restrictions.

The UK's attempt to make the internet safer for children degraded the experience for other users in the country, and, by concentrating identity documents with verification providers, created the perfect conditions for even more damaging privacy breaches.

The United States' attempt to limit children's access to damaging material is a just one. That much cannot be argued against.

However, accomplishing it requires care and thought from the content hosts, and from regulators who too often legislate without knowing what they're talking about. Get it wrong, and the remedy can easily become worse than the disease, as the UK has amply demonstrated.

At the same time, making the App Store the focus for age checks is shortsighted, since it shifts much of the blame onto the companies operating the stores rather than the creators and hosts of the content, who should be the ones held responsible.

If the bill were implemented, Facebook might get some bad press for circulating an adult image to minors, given that its moderation is terrible and always has been. The internet is basically infinite monkeys on infinite keyboards, hammering out Shakespeare or porn.

But, in this case, Apple would still ultimately get the blame, and the lawsuits, even though it followed the law.