Apple's App Store age rating system is not doing enough to protect children, a report claims, after a day's worth of research found that a high proportion of apps deemed acceptable for children actually posed a risk.
Apple has various systems and mechanisms in place to make the use of an iPhone, iPad, or Mac relatively safe for children. Parental controls can restrict which apps are usable on a child's device based on their age ratings, among other features.
However, for those restrictions to actually be useful, the apps themselves have to be rated correctly. A joint report from the Heat Initiative and ParentsTogether Action suggests the ratings aren't doing enough.
The report tasked a researcher with reviewing as many apps as possible within 24 hours, focusing on categories with a history of presenting safety risks to kids, including beauty, chat, diet and weight loss, general Internet access, and gaming apps.
In that period, about 800 apps were reviewed, and over 200 were found to have "concerning content or features" despite being rated as suitable for children. The gaps were uneven across categories: while chat apps were commonly rated 17+ and few were rated for children, other categories like weight loss were frequently rated for children aged 4+.
"This indicates Apple's rating system is missing not only individual apps within risky, adult-oriented categories, but entire categories of potential harm," the report states.
The risky apps included 25 chat apps with a total of 37 million downloads, consisting of apps that connect users with strangers, anonymous chat apps, and AI chat apps. The report adds that users of one chat app rated for kids described it as "nothing but paedophiles."
There were 40 unfiltered Internet access apps in the group, with 291 million combined downloads. Those apps included some that advertised the ability to get around school filters, granting access to banned sites.
Beauty and body-related apps made up 75 of the total risky apps, with some encouraging 20-hour fasts and starvation-level calorie goals.
A further 52 gaming apps were found to have risky elements, with some offering "XXXSpicy" features including dares such as streaking. Others were violent games in which players pretended to be murderers and drug dealers.
Questioned process
Part of the problem, according to the report, is how Apple actually handles its age rating process. Rather than relying on a third-party rating system, such as the Entertainment Software Rating Board (ESRB), Apple instead relies on developers answering a questionnaire about the app's content, which is then used as the basis for the rating.
The report states it is "unclear" whether Apple performs any other steps in the process, applies other criteria, or follows up to confirm the age rating after the app is published.
Furthermore, the report accuses Apple of shirking responsibility despite its "promises to families to create a safe App Store environment." Apple is accused of pushing legal liability onto the developer while shielding itself from accountability.
"If your app is mis-rated, customers might be surprised by what they get, or it could trigger an inquiry from government regulators," the report quotes from Apple's App Review guidelines, point 2.3.6.
Urging a solution
The two organizations believe there are a number of ways Apple should improve its processes to ensure the "safety for its most vulnerable users."
First, the report insists that Apple should "institute an independent third-party review and verification of the age ratings of apps before they are made available to children in the Apple App Store." This would entail the use of a panel of experts, similar to the ESRB process for game sales.
The proposal makes sense to a point, as it introduces external expert opinions into the process. However, it could add expense to the publishing process, a cost that would almost certainly be passed on to the developer rather than absorbed by Apple.
It is also proposed that Apple should make the age rating process more transparent than it currently is.
There also needs to be a process for checking the appropriateness of age ratings after publication, the report adds. "Apple should act quickly to correct inappropriate ratings and take action against developers that attempt to circumvent the rating system," it states.
Lastly, there is a call for "effective parental controls," and for Apple to enforce age ratings by only allowing child users to download age-appropriate apps. Apple already does this with its parental controls, but evidently the organizations believe it doesn't go far enough.
Apple response
In a statement shared with AppleInsider, Apple insists it already does a lot to protect children with its existing tools. The full statement follows:
"At Apple, we work hard to protect user privacy and security and provide a safe experience for children. We do that by giving parents a wide range of capabilities they can enable on their children's devices to restrict purchases, web searches and access to apps; prevent explicit content; flag problematic content through Report a Problem; and more.
Developers are required to provide clear age ratings consistent with App Store policies, and apps designed for kids are designated in a unique category and undergo a stricter App Review process.
In instances where an app's age rating does not match its content, we take immediate action to ensure the issue is corrected."
Responsibility push
Calls for Apple to take more responsibility aren't new, and the company has fought against them before.
In September, a report detailed how states seeking to regulate teen smartphone use felt pressure from Apple. In one case, Louisiana legislator Kim Carver was contacted by four Apple-hired lobbyists over a bill that would have forced Apple to add and enforce age restrictions in the App Store, instead of relying on the app developer questionnaire.
While the App Store requirement was pulled from the bill, Carver still wants to bring it back. "I quickly realised that Apple's parental controls aren't the panacea they're promised to be," Carver said.