This is a huge topic, so rather than attempt any kind of comprehensive 
response, I just want to point people to the definition we use in ISOC/Trust 
and Identity....

"Privacy is about retaining the ability to disclose data consensually, and with 
expectations about the context and scope of sharing."

Everything is in there for a purpose.

- "retaining the ability to disclose" : privacy depends on user control; if you 
don't have meaningful control over whether or not you disclose something, you 
don't control your own privacy.

 - "disclose data" : privacy is about disclosure, not about keeping all your 
data to yourself (that is secretiveness... and taken to extremes, it's a 
symptom of what we might even call paranoia) Privacy is a social construct, or 
set of conventions, relating to the disclosure of data, not to secrecy.

- "consensually" : orthogonal to the idea of control; not only is it important 
that I consent to reveal things about myself, it's also important that I 
respect others' wishes about disclosure of data relating to them. This is a key 
principle in two respects: (1) if Alice tells Bob something, her privacy 
depends on Bob not telling Carl. (2) If the only way Alice and tell Carl 
something is via Bob, Bob's role and obligations are a factor. (1) is the 
problems of "privacy beyond first disclosure", and (2) is especially true in 
today's online world, where all interactions are mediated via third parties.

- "expectations" ; note that, for global applicability, we don't use the word 
"rights"... but bear in mind that in some cultures/jurisdictions, our 
expectations about privacy may well stem from the fact that privacy is regarded 
as a fundamental right (not necessarily an unqualified one, but fundamental 
nonetheless). Expectations may be well- or -ill-founded; and they are often 
subject to being disappointed. 

- "context of sharing" : if I tell my doctor personal details in order to be 
treated, I don't expect those details to crop up on "Hilariously Embarrassing 
Ailment of the Month" website. The healthcare context may appear to be 
clear-cut, but it really is not. For example: justification for disclosure of 
healthcare data is often claimed on the grounds that it helps medical 
research... but the medical research benefits are speculative, while the 
financial benefit to research companies is not; genetic data about one person 
is intimately revealing about their parents, siblings and (potential) 
descendants; supposedly "anonymised" clinical trials data may prove to be 
easily de-identifiable... This is not argue one way or the other - just to 
illustrate that healthcare privacy is not a simple context.

- "scope of sharing" : as danah boyd puts it "publishing, to a wider audience, 
something that was intended for a narrow one may be just as much a violation of 
privacy as publishing something that was not meant to be published in the first 
place". Scope is important, and intimately related, of course, to example (1) 
under "consensus", above, and to the matter of "expectations" and "control".

The meta-issue we face here, IMO, is that none of the questions raised by this 
definition of privacy has a single simple answer, let alone a binary one. I 
think that's why we have such a hard time translating social concepts that we 
all understand implicitly into technical solutions that adequately reflect 
their subtlety.
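
(An aside, to make that point concrete: below is a deliberately naive sketch, 
in Python, of how one might encode "consensual disclosure, within an expected 
context and scope" as a mechanical check. All of the names are hypothetical, 
none of this is an ISOC construct, and the interesting privacy questions - 
consent beyond first disclosure, contested contexts, ill-founded expectations - 
are exactly what a check like this cannot capture.)

    # Hypothetical illustration only: a naive encoding of "consensual
    # disclosure, with expectations about the context and scope of sharing".
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Consent:
        subject: str      # whose data it is (e.g. "alice")
        recipient: str    # who may receive it (e.g. "doctor")
        attribute: str    # what kind of data (e.g. "diagnosis")
        context: str      # the expected context (e.g. "treatment")
        scope: str        # the expected audience (e.g. "care-team")

    @dataclass(frozen=True)
    class Disclosure:
        subject: str
        sender: str
        recipient: str
        attribute: str
        context: str
        scope: str

    def within_expectations(d: Disclosure, consents: list[Consent]) -> bool:
        """True only if some consent covers this recipient, attribute,
        context and scope; Bob forwarding Alice's data to Carl, or a
        'treatment' disclosure reused for 'marketing', fails."""
        return any(
            c.subject == d.subject
            and c.recipient == d.recipient
            and c.attribute == d.attribute
            and c.context == d.context
            and c.scope == d.scope
            for c in consents
        )

    # Alice consented to her doctor seeing a diagnosis, for treatment,
    # shared within the care team only.
    consents = [Consent("alice", "doctor", "diagnosis", "treatment", "care-team")]
    print(within_expectations(
        Disclosure("alice", "alice", "doctor", "diagnosis",
                   "treatment", "care-team"), consents))          # True
    print(within_expectations(
        Disclosure("alice", "doctor", "marketing-co", "diagnosis",
                   "marketing", "public"), consents))             # False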

Yrs.,
Robin





Robin Wilton
Technical Outreach Director - Identity and Privacy
Internet Society

email: wil...@isoc.org
Phone: +44 705 005 2931
Twitter: @futureidentity




On 19 Dec 2013, at 05:58, S Moonesamy wrote:

> Hi Joseph,
> At 07:00 18-12-2013, Joseph Lorenzo Hall wrote:
>> Many of us from academia (in my case, having recently jumped ship for
>> civil society) that study privacy are more persuaded by Helen
>> Nissenbaum's notion of privacy as "contextual integrity". Here's the
>> skinny in shorter-than-book-length form:
>> 
>> "I give an account of privacy in terms of expected flows of personal
>> information, modeled with the construct of context-relative
>> informational norms. The key parameters of informational norms are
>> actors (subject, sender, recipient), attributes (types of
>> information), and transmission principles (constraints under which
>> information flows). Generally, when the flow of information adheres to
>> entrenched norms, all is well; violations of these norms, however,
>> often result in protest and complaint. In a health care context, for
>> example, patients expect their physicians to keep personal medical
>> information confidential, yet they accept that it might be shared with
>> specialists as needed. Patients' expectations would be breached and
>> they would likely be shocked and dismayed if they learned that their
>> physicians had sold the information to a marketing company. In this
>> event, we would say that informational norms for the health care
>> context had been violated." [1]
>> 
>> Much of the scholarship these days in privacy thinking is increasingly
>> based on this kind of contextual definition of privacy (and in the
>> U.S., at least, the Obama administration embraced this in a recasting
>> of fair information principles in their Consumer Privacy Bill of Rights).
>> 
>> At CDT, we argue that "abuse" or "harm" is an anemic framing, and that
>> there are important privacy interests implicated after information has
>> been fixed and collected but before any use has been made. See
>> Brookman and Hans [2], if you're interested in reading more.
> 
> Thanks for the links.  I have to read the material again.
> 
> The health care context is relatively easy to argue for, as a patient would 
> expect his/her medical information to be kept secret.  There is also the 
> legal context for privacy, e.g. see the message from Nat Sakimura about torts.  
> The discussion is usually around that, as the body of work is focused on the 
> legal arguments which have been made over the years.
> 
> It could be said that there are several schools of thought about the topic.  
> For example, the above (see quoted text) looks at privacy in terms of the 
> flow of information.  It may seem a better fit in the context of technology.  
> Some people will likely look at privacy in terms of harm or abuse.  It is 
> difficult to argue against that in an IETF discussion.  Privacy 
> considerations, up to now, can be described as lacking.
> 
> Regards,
> S. Moonesamy 
> _______________________________________________
> ietf-privacy mailing list
> ietf-privacy@ietf.org
> https://www.ietf.org/mailman/listinfo/ietf-privacy

_______________________________________________
ietf-privacy mailing list
ietf-privacy@ietf.org
https://www.ietf.org/mailman/listinfo/ietf-privacy
