The Price of Lack of Clarity

26 April 2020

As anyone reading this blog assuredly knows, the world is in the grip of a deadly pandemic. One way to contain it is contact tracing: finding those who have been near infected people, and getting them to self-quarantine. Some experts think that because newly infected individuals themselves become contagious so rapidly, we need some sort of automated scheme. That is, traditional contact tracing is labor-intensive and time-consuming, and time is precisely what we don’t have here. The only solution, they say, is to automate it, probably by using the cell phones we all carry.

Naturally, privacy advocates (and I’m one) are concerned. Others, though, point out that we’ve been sharing our location with advertisers; why would we not do it to save lives? Part of the answer, I think, is that people know they’ve been misled, so they’re more suspicious now.

As Joel Reidenberg and his colleagues have pointed out, privacy policies are ambiguous, perhaps deliberately so. One policy they analyzed said:

  1. “Depending on how you choose to interact with the Barnes & Noble enterprise, we may collect personal information from you…”
  2. “We may collect personal information and other information about you from business partners, contractors and other third parties.”
  3. “We collect your personal information in an effort to provide you with a superior customer experience and, as necessary, to administer our business”
“May”? Do you collect it or not? “As necessary”? “To administer”? What do those mean?

The same lack of clarity afflicts location privacy policies. The New York Times showed that some apps that legitimately need location data are actually selling it, without making that clear:

The Weather Channel app, owned by an IBM subsidiary, told users that sharing their locations would let them get personalized local weather reports. IBM said the subsidiary, the Weather Company, discussed other uses in its privacy policy and in a separate “privacy settings” section of the app. Information on advertising was included there, but a part of the app called “location settings” made no mention of it.

Society is paying the price now. The distrust built up by 25 years of opaque web privacy policies is coming home to roost. People are suspicious of what else will be done with their data, however important the initial collection is.

Can this be salvaged? I don’t know; trust, once forfeited, is awfully hard to regain. At a minimum, there need to be strong statutory guarantees limiting what else can be done with the data, and these need to be as iron-clad as a battalion of lawyers can make them.

I don’t know if even this will suffice—as I said, it’s hard to regain trust. But passing a strong Federal privacy law might make things easier when the next pandemic hits—and from what I’ve read, that’s only a matter of time.

(There’s a lot more to be said on this topic, e.g., should a tracking app be voluntary or mandatory? The privacy advocate in me says it should be voluntary; the little knowledge I have of epidemiology makes me think that very high uptake is necessary to gain the benefits.)

Tags: privacy