Apple CEO Tim Cook has met with U.S. lawmakers to lobby against sections of an online child safety bill that would change how the App Store handles age verification.
Lawmakers in the United States are in the middle of discussions about online child safety legislation, which could force big changes in how apps, online services, and app storefronts operate. Keen to keep Apple from being saddled with problems created by the legislation, CEO Tim Cook waded into the fray on Wednesday.
The CEO took part in a closed-door meeting with members of the House Energy and Commerce Committee, reports Bloomberg, where he talked about the potential issues with the new legislation.
One of the bills is the App Store Accountability Act, which would require App Store owner Apple and Google Play Store operator Google to verify the ages of all users. The intention is to prevent minors from accessing potentially harmful apps.
Cook is keen to stop the bill from passing, as it would force Apple to verify a child's age directly. Apple's existing Family Sharing system already requires an adult to manage an Apple Account on a child's behalf, but it only verifies the age of the parent or guardian. For the child accounts themselves, Apple takes the word of the adult running the Family Sharing group.
Alongside Cook's intervention, Apple's global head of privacy, Hilary Ware, sent a letter to the committee expressing concern about the threat the bill poses to all App Store users.
The ASA is not the only bill Apple has to contend with. On Thursday, the committee is set to discuss it alongside a second bill that is closer to what Apple wants.
Cook has used this kind of personal outreach on the topic before. In May, he called Texas Governor Greg Abbott directly, urging him to veto legislation that would force the App Store to verify ages.
A risk for accountability
One of the problems with the ASA is that it shifts responsibility for those age checks onto Apple and Google, and away from the app developers and service operators themselves.
While an app or service could be misused or run in a way where a child could see age-inappropriate content, the blame would be leveled at Apple and Google instead.
For example, if Facebook's moderation team failed to block explicit images from appearing on the social network, it would be only partly culpable. Apple would still be blamed, since it performed the initial age checks and allowed the app to be downloaded in the first place.
Even if Apple worked to the letter of the law, it could still be accused of wrongdoing.
The shift in responsibility may even encourage app creators and online services to be more careless with their moderation. With blame moved up the food chain, there would be little incentive for them to improve content policing.