Really good thoughts, Bryan - and much enlivened by the appearance of the word 
"hallucinate" in your text. (I'd love to think it was caused by hallucinogens, 
but it was probably the more prosaic auto-correct… ;^(  )

Yes, there's a whole set of knotty issues around how to ensure that users are 
adequately informed, how to capture consent appropriately without inducing 
'consent fatigue', and so on.

One suggestion I picked up recently from a design workshop was this (by analogy 
with other 'user behaviour' challenges such as healthy eating): "what can we do 
to help users adjust their privacy-related values, so that when presented with 
a privacy choice, they make better decisions about online behaviour?". (If you 
replace "privacy" with "dietary" and "online" with "eating", you'll get where 
this came from…).

What I think is useful about this approach is that it assumes that neither 
better information (food labelling/privacy icons) nor consent mechanisms (diet 
is, for these purposes, an entirely discretionary self-motivated activity) are 
enough to change user behaviour; the vital step is to change the value the user 
places on their own privacy. 

R

Robin Wilton
Technical Outreach Director - Identity and Privacy
Internet Society

email: wil...@isoc.org
Phone: +44 705 005 2931
Twitter: @futureidentity




On 4 Sep 2012, at 11:32, Bryan McLaughlin (brmclaug) wrote:

> A few thoughts were stirred by this exchange.
> 
> Privacy is not a physical thing, and so it can/will be challenging to 
> educate consumers about what their choices are and what they mean to them. 
> 
> I think the cognitive load on consumers will present a barrier to gaining 
> consent that represents what it purports to be - 'informed'. Therefore a 
> position that starts from least harm to a user's privacy seems appropriate: 
> without informed consent, the default action would be not to process the 
> relevant data.
> 
> Reducing the cognitive load could be approached initially by reducing the 
> frequency with which a user has to go through privacy agreements. I 
> hallucinate this would be helpful in that regard.
> 
> How one creates a personal policy that can be utilised for such 
> interactions, and a framework for applying it widely, may in itself be a 
> challenge.
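> 
> To make the idea concrete, here is a purely hypothetical sketch (the names 
> and structure below are invented for illustration, not a proposal):
> 
>     # The user records privacy preferences once; the client then applies
>     # them to each site's consent request instead of prompting every
>     # time, reducing the frequency of privacy agreements.
>     PERSONAL_POLICY = {
>         "share_ip_address": False,
>         "share_location": False,
>         "allow_tracking_cookies": False,
>     }
> 
>     def answer_consent_request(requested_items):
>         """Grant only what the stored policy already permits; anything
>         unknown defaults to 'no processing', i.e. least harm."""
>         return {item: PERSONAL_POLICY.get(item, False)
>                 for item in requested_items}
> 
> Anything the policy does not cover falls back to the least-harm default of 
> not processing the relevant data.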
> 
> Bryan
> 
> 
> 
> -----Original Message-----
> From: ietf-privacy-boun...@ietf.org [mailto:ietf-privacy-boun...@ietf.org] On 
> Behalf Of Robin Wilton
> Sent: 04 September 2012 10:33
> To: S Moonesamy
> Cc: ietf-privacy@ietf.org
> Subject: Re: [ietf-privacy] draft-moonesamy-privacy-identifiers-00
> 
> This is (again) an excellent airing of the issues, I think. One theme it 
> exposes is the difficulty of balancing two factors:
> 
> 1 - achieving informed consent, when the target audience doesn't have a 
> mature understanding of the problem, or isn't motivated to act on such 
> understanding as they have;
> 
> 2 - dealing with stakeholders who react as some did to Microsoft's "DNT by 
> default" decision... i.e. by saying 'if you set a privacy feature to 'on' by 
> default, it is not reliable, because it can't be interpreted as an explicit 
> user choice (and hence as an indication of consent)'.
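> 
> (For reference, the signal at issue is a single HTTP request header, as 
> defined in the W3C Tracking Preference Expression draft:
> 
>     DNT: 1    user prefers not to be tracked
>     DNT: 0    user grants permission to be tracked
> 
> with the header omitted when no preference has been expressed. The 
> objection above is that a browser-set default makes "DNT: 1" ambiguous as 
> an expression of user choice.)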
> 
> I like your point about design never being value-neutral... Wondering if 
> there's a sense in which designers can acknowledge that and say "of course 
> not; and these privacy-enhancing design values are legitimately preferable to 
> those privacy-eroding ones"...
> 
> Yrs.,
> Robin
> 
> Sent from my iPod
> 
> On 3 Sep 2012, at 18:55, S Moonesamy <sm+i...@elandsys.com> wrote:
> 
>> Hi Hannes,
>> At 08:17 03-09-2012, Hannes Tschofenig wrote:
>>> There are more documents that exist making this statement.
>> 
>> Yes.
>> 
>>> In the technical community there is no need to convince anyone that an IP 
>>> address can be used to indirectly identify a person (typically through 
>>> various database resolution steps).
>> 
>> Agreed.
>> 
>> The idea is also to show two different viewpoints, i.e. the technical side 
>> and the non-technical side.
>> 
>>> As such, I don't think there is a need to cite anything here.
>> 
>> Ok.
>> 
>>> Interesting. I thought that this is already fairly good.
>> 
>> I would rate it as good, as I don't see any other way.
>> 
>>> The typical reader of an IETF draft is certainly not an average person. In 
>>> fact it is probably a good thing that they do not read them...
>> 
>> :-)
>> 
>>> Well. You can hide your IP address to a certain extent. What gets hidden 
>>> with the IP address depends on whether you use systems like Tor, VPNs, 
>>> NATs, IPv4 / IPv6, etc.
>> 
>> See my comment to Robin Wilton.
>> 
>>> You may not ask "Do you want to reveal your IP address? YES / NO". 
>>> However, what a user may want to do is get software they have confidence 
>>> in to preserve a certain degree of anonymity. They may, for example, 
>>> download Tor (or similar software from other groups).
>> 
>> There is a somewhat related message, about a question posed to a user, at 
>> http://lists.w3.org/Archives/Public/public-tracking/2012Jul/0152.html  
>> Which question to ask is a question in itself.
>> 
>>> Then, these software developers should still give those users who care 
>>> about their privacy the option to set things correctly so that they get 
>>> what they want (potentially with an impact on the service quality).
>> 
>> That gets us into an "inevitable choice" discussion.  One of the general 
>> design questions is about considering performance and cost as well as 
>> functionality.  If the difference in service quality is negligible, I'd 
>> say that it is a 50/50 choice.  If the difference is noticeable, I'd pick 
>> the 50/50 choice too, as nothing comes for free.  If the resulting service 
>> quality amounts to a degraded mode, the software developer does not have 
>> to worry too much about privacy being part of the design.  It has been 
>> argued that there is no such thing as a value-neutral design.  Decisions, 
>> whether about security or anything else, inject a bias into the equation.
>> 
>> 
>>> I don't think that we are questioning the Internet architecture as such.
>>> 
>>> For example, with onion routing you essentially have various forms of 
>>> trusted intermediaries. With IPv6 and MAC-address-generated interface 
>>> identifiers, you have a much stronger form of re-identification in IETF 
>>> protocols than you had before, and so it is OK to think about the privacy 
>>> implications.
>> 
>> Ok.
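>> 
>> To make the re-identification point concrete, here is a minimal sketch of 
>> the modified EUI-64 derivation from RFC 4291 (the MAC address in the 
>> comment below is made up):
>> 
>>     def mac_to_modified_eui64(mac: str) -> str:
>>         """Derive an IPv6 interface identifier (modified EUI-64) from a
>>         48-bit MAC address.  The mapping is deterministic, so the same
>>         hardware yields the same identifier on every network it visits -
>>         the re-identification property mentioned above."""
>>         octets = [int(b, 16) for b in mac.split(":")]
>>         octets[0] ^= 0x02                  # flip the universal/local bit
>>         eui64 = octets[:3] + [0xFF, 0xFE] + octets[3:]  # insert ff:fe
>>         return ":".join(f"{(eui64[i] << 8) | eui64[i + 1]:x}"
>>                         for i in range(0, 8, 2))
>> 
>>     # mac_to_modified_eui64("00:1a:2b:3c:4d:5e") -> "21a:2bff:fe3c:4d5e"
>>     # The host part of the IPv6 address follows the device across
>>     # networks, unlike a NATted IPv4 address.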
>> 
>>> That's true. The idea is that the open standardization process in the 
>>> IETF leads to technologies that exercise data minimization, because 
>>> different stakeholders with different interests participate in the 
>>> standardization activity and thereby avoid excessive and unnecessary 
>>> sharing of data. This of course assumes that there is some understanding 
>>> of what privacy is and what goals we may want to accomplish.
>> 
>> The above looks at this from a different angle.  I am not disagreeing, by 
>> the way.  The wall is the asymmetry of power.  We can go around that wall 
>> with a process where the various interests work together on finding a 
>> balance.  We could also move that wall around, or rather, move the 
>> boundaries so that the problem becomes less complex.
>> 
>>> Definitely. The contextual nature makes it very difficult for certain 
>>> protocols to draw a clear line. Then, there are other ways to deal with 
>>> this situation - as we had been trying to do in Geopriv with user 
>>> preference indications.
>> 
>> There has been some nice work coming out of Geopriv.
>> 
>>> Many of the privacy laws are built on the premise that someone is able to 
>>> make that decision - either the end user or someone acting on his / her 
>>> behalf. Think about children: typically, you would assume that their 
>>> parents make that decision. Then, in many cases we (you and me) also use 
>>> tools to outsource some of the decision making and to delegate it to 
>>> people we trust. For example, if you download Ghostery or Adblocker for 
>>> use with your browser, you decide to rely on specific companies to decide 
>>> what to block and what not.
>> 
>> Privacy laws vary, i.e. they depend on the culture of the region.  In some 
>> regions it is left to the individual to make the decision.  In others, 
>> society is viewed as having a responsibility to help the individual make 
>> the decision.  In some respects, it could be about delegation of trust.
>> 
>>> That's a good point. We had been discussing this issue while working on 
>>> the privacy considerations document, and we said that we wouldn't mandate 
>>> anything (particularly since the enforcement would only come through the 
>>> community and through the IESG - not the IAB). Particularly at the early 
>>> phase of our work on the privacy guidelines, we weren't quite sure 
>>> whether the level of privacy understanding was actually mature enough.
>> 
>> Agreed.
>> 
>>> However, some time has passed and the privacy guidelines document is stable 
>>> now. So, it might be interesting to hear what the community thinks.
>> 
>> Yes.
>> 
>> Regards,
>> S. Moonesamy 


_______________________________________________
ietf-privacy mailing list
ietf-privacy@ietf.org
https://www.ietf.org/mailman/listinfo/ietf-privacy
