The American Civil Liberties Union has raised concerns over the Bluetooth-based contact tracing tool that Apple and Google are collaborating on, warning that the system could invade users' privacy, if it works at all.
The American Civil Liberties Union (ACLU) has released a statement on the joint COVID-19 contact-tracing collaboration between Apple and Google. In it, the organization outlines concerns about the efficacy and practicality of the technology, as well as the possibility that tracking apps could be used to personally identify the people who use them.
The ACLU's main concern is adoption. Experts say that 60% of people would need to adopt the technology for it to be effective. Many people, though, may not trust a tool that tracks everywhere they go, especially if the data could easily be traced back to them.
The ACLU proposes that, rather than being stored on a server, the data should be kept locally on a user's phone. Additionally, the group worries that Bluetooth tracking may not be accurate enough to determine what counts as an epidemiologically relevant contact.
Google and Apple are jointly assuring potential users that the list of people a user comes into contact with is only stored locally on a device and isn't shared unless they opt to share it, such as after a positive diagnosis. The actual identities of people who test positive for COVID-19 aren't revealed to Apple, Google, or other users, and the companies can disable the system on a regional basis when it is no longer needed.
The ACLU has proposed a list of technology principles against which users, policymakers, and developers can judge contact tracing apps. The core tenets hold that a user must have control over their data, demand the ongoing protection of a user's privacy, and require apps to obtain a user's consent at multiple stages. They also make it clear that such an app should never be used for punitive or law enforcement purposes under any circumstances.
Google and Apple assure users that the program has been built from the ground up with strong privacy protections. No location data or personally identifiable information is collected as part of the system, and each device's Bluetooth identifier will change periodically to prevent unwanted tracking.
When implemented, the technology will use a device's onboard Bluetooth hardware to keep tabs on who the owner comes into close proximity with. Specifically, Bluetooth identifiers are exchanged and saved locally. Under the current proposal, the Bluetooth identifiers provide 24 hours of linkable data, which the ACLU deems unacceptable, as users cannot choose to redact location information for certain times of the day.
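For illustration, here is a minimal Python sketch of how such a scheme could work in principle, assuming a simplified design in which every identifier a phone broadcasts during a day is derived from a single on-device daily key, and in which matching against keys voluntarily shared by diagnosed users happens entirely on the device. The key sizes, the rotation interval, and all function names below are assumptions made for the example, not the actual Apple and Google specification.

```python
import os
import hmac
import hashlib
from datetime import datetime, timezone

# Simplified sketch of a rotating-identifier contact tracing scheme, assuming
# every identifier broadcast during a day derives from one on-device daily key.
# Key sizes, the 15-minute rotation interval, and the derivation below are
# illustrative assumptions, not the Apple/Google specification.

ROTATION_MINUTES = 15
SLOTS_PER_DAY = (24 * 60) // ROTATION_MINUTES


def new_daily_key() -> bytes:
    """Generate a fresh random daily key, kept only on the device."""
    return os.urandom(16)


def proximity_id(daily_key: bytes, slot: int) -> bytes:
    """Derive the identifier broadcast over Bluetooth for one time slot.

    Because every slot's identifier comes from the same daily key, anyone who
    later obtains that key can link a full day of broadcasts together, which
    is the "24 hours of linkable data" the ACLU objects to.
    """
    return hmac.new(daily_key, slot.to_bytes(4, "big"), hashlib.sha256).digest()[:16]


def current_slot(now: datetime) -> int:
    """Index of the current rotation window within the day."""
    return (now.hour * 60 + now.minute) // ROTATION_MINUTES


# Each phone keeps a local log of identifiers it has heard nearby; in this
# sketch that log never leaves the device.
observed_identifiers: set[bytes] = set()


def record_contact(identifier: bytes) -> None:
    observed_identifiers.add(identifier)


def check_exposure(shared_daily_keys: list[bytes]) -> bool:
    """Given daily keys voluntarily shared by diagnosed users, re-derive their
    identifiers locally and test for matches, so no identities are revealed."""
    for key in shared_daily_keys:
        for slot in range(SLOTS_PER_DAY):
            if proximity_id(key, slot) in observed_identifiers:
                return True
    return False


if __name__ == "__main__":
    # Phone A broadcasts an identifier; phone B records it.
    key_a = new_daily_key()
    slot = current_slot(datetime.now(timezone.utc))
    record_contact(proximity_id(key_a, slot))

    # Later, A tests positive and opts to share its daily key; B finds a match
    # locally without learning who A is.
    print(check_exposure([key_a]))  # -> True
```

In this sketch, the periodically rotating identifiers are what prevent casual tracking, while the opt-in sharing of a whole daily key after a diagnosis is what makes a full day of broadcasts linkable after the fact, which is the trade-off the ACLU highlights.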
The joint Google and Apple contact tracing partnership has been both praised and scrutinized by the Trump administration and the president himself, who called the system "amazing" but said it raises "big constitutional problems." Trump did not specify what those concerns are, however.
20 Comments
You know what else encroaches on civil liberties? Death. Death encroaches on your civil liberties.
Sounds like the Apple/Google plan ticks every one of the ACLU points, other than this one, which would be impossible to meet without compromising both the efficacy of the data and users' privacy:
"Under the current proposal, the Bluetooth identifiers provide 24 hours of linkable data, which the ACLU deems unacceptable, as users cannot choose to redact location information for certain times of the day." The ACLU truncated their thought process.
As another publication put it, "users cannot review data prior to upload. This should, [the ACLU] believes, offer a second opportunity for app users to review the contacts and delete any that did not carry any exposure risk... it [also] says that it isn't satisfied that the amount of data captured can't be used to identify people.
...These latter two points are effectively impossible to implement, however. Users can’t review the contacts recorded because the whole point of using Bluetooth codes is that individuals cannot be identified. So a user would have no way of knowing which codes to redact. And you cannot reduce the data without compromising the ability to identify exposure.
It would technically be possible to allow a user to exclude false contacts. For example, there could be a toggle that allows us to say we are alone in a room or vehicle, even if there may be people the other side of a thin wall or outside our sealed car. However, the more you rely on people manually toggling things on or off, the less reliable the apps would become."
ACLU cares about civil liberties? How quaint of them. I agree that Apple/Google appear to have created a system that safeguards individual freedoms. There are no constitutional issues if it is voluntary. The step I worry about is that there's no requirement to show proof of a positive test before declaring yourself infected.
What the hell is wrong with these people? Working on something like this is absolutely vital in the fight against this virus. As if everyone who uses smart devices, computers, and most apps doesn't already have their privacy at risk. Thankfully Apple is working on it along with Google, where the concerns would otherwise be more than warranted. Other countries have implemented this approach and there are definite privacy concerns. But I feel a lot safer with Apple on board.