Re: [ietf-privacy] Fwd: draft-huitema-dnssd-privacy-01.txt
> about the device publishing the services. There are use cases where
> devices want to communicate without disclosing their identity, for
> example two mobile devices visiting the same hotspot.
>
> We propose to solve this problem with a two-stage approach. In the
> first stage, hosts discover Private Discovery Service instances via
> DNS-SD, using special formats to protect their privacy. These service
> instances correspond to Private Discovery Servers running on peers.
> In the second stage, hosts directly query these Private Discovery
> Servers via DNS-SD over TLS. The pairwise shared secret needed to
> establish these connections is known only to hosts authorized by a
> pairing system.
>
> Please note that it may take a couple of minutes from the time of
> submission until the htmlized version and diff are available at
> tools.ietf.org.
>
> The IETF Secretariat
>
> ___
> dnssd mailing list
> dn...@ietf.org
> https://www.ietf.org/mailman/listinfo/dnssd

--
Joseph Lorenzo Hall
Chief Technologist, Center for Democracy & Technology [https://www.cdt.org]
1401 K ST NW STE 200, Washington DC 20005-3497
e: j...@cdt.org, p: 202.407.8825, pgp: https://josephhall.org/gpg-key
Fingerprint: 3CA2 8D7B 9F6D DBD3 4B10 1607 5F86 6987 40A9 A871

___
ietf-privacy mailing list
ietf-privacy@ietf.org
https://www.ietf.org/mailman/listinfo/ietf-privacy
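The first stage quoted above can be sketched in code. This is a minimal, illustrative sketch only: it assumes the "special format" protecting privacy is an instance identifier derived by keying a MAC over a fresh nonce with the pairwise shared secret, so that only paired peers can recognize the advertisement. Function names, field sizes, and the hash choice are my assumptions, not the draft's exact wire format.

```python
# Sketch of privacy-preserving instance advertisement and recognition.
# Assumption: instance identifier = (nonce, HMAC(pairwise_key, nonce)).
# A fresh random nonce per advertisement prevents linking a device
# across sessions; only a peer holding the same pairwise key can
# recompute the HMAC and recognize the service instance.
import hashlib
import hmac
import os


def advertise_instance(pairwise_key: bytes) -> tuple[bytes, bytes]:
    """Build an unlinkable instance identifier for DNS-SD publication."""
    nonce = os.urandom(16)
    proof = hmac.new(pairwise_key, nonce, hashlib.sha256).digest()
    return nonce, proof


def recognize_instance(pairwise_key: bytes, nonce: bytes, proof: bytes) -> bool:
    """A paired peer recomputes the HMAC; anyone else sees random bytes."""
    expected = hmac.new(pairwise_key, nonce, hashlib.sha256).digest()
    return hmac.compare_digest(expected, proof)
```

In this sketch the second stage (DNS-SD over TLS to the recognized Private Discovery Server) would proceed only after `recognize_instance` succeeds, using the same pairwise secret to authenticate the TLS connection.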
Re: [ietf-privacy] Article, toward a definition of privacy (or perhaps not).
On 12/18/13 8:17 AM, S Moonesamy wrote:
> I suppose, to avoid confusion, it probably is better to use the
> definition portion of it instead of the defined word in the usual
> conversation. There has been some discussion on other IETF mailing
> lists about the definition of the word privacy. Warren and Brandeis
> are often cited in a U.S. context. The right of personal immunity is
> broader than privacy. Within an IETF context it might be a problem if
> the right to be let alone is used. In my opinion a right is
> guaranteed by law, and that doesn't fit in with what the IETF does.

Many of us from academia (in my case, having recently jumped ship for
civil society) who study privacy are more persuaded by Helen
Nissenbaum's notion of privacy as contextual integrity. Here's the
skinny, in shorter-than-book-length form:

  I give an account of privacy in terms of expected flows of personal
  information, modeled with the construct of context-relative
  informational norms. The key parameters of informational norms are
  actors (subject, sender, recipient), attributes (types of
  information), and transmission principles (constraints under which
  information flows). Generally, when the flow of information adheres
  to entrenched norms, all is well; violations of these norms,
  however, often result in protest and complaint. In a health care
  context, for example, patients expect their physicians to keep
  personal medical information confidential, yet they accept that it
  might be shared with specialists as needed. Patients' expectations
  would be breached and they would likely be shocked and dismayed if
  they learned that their physicians had sold the information to a
  marketing company. In this event, we would say that informational
  norms for the health care context had been violated.
[1]

Much of today's privacy scholarship is increasingly based on this kind
of contextual definition of privacy (and in the U.S., at least, the
Obama administration embraced it in a recasting of fair information
principles in its Consumer Privacy Bill of Rights). At CDT, we argue
that abuse-or-harm is an anemic framing, and that there are important
privacy interests implicated after information has been fixed and
collected but before any use has been made of it. See Brookman and
Hans [2] if you're interested in reading more.

[1]: http://www.amacad.org/publications/daedalus/11_fall_nissenbaum.pdf
[2]: http://www.futureofprivacy.org/wp-content/uploads/Brookman-Why-Collection-Matters.pdf

--
Joseph Lorenzo Hall
Chief Technologist
Center for Democracy & Technology
1634 I ST NW STE 1100
Washington DC 20006-4011
(p) 202-407-8825 (f) 202-637-0968
j...@cdt.org
PGP: https://josephhall.org/gpg-key
fingerprint: 3CA2 8D7B 9F6D DBD3 4B10 1607 5F86 6987 40A9 A871
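The quoted passage gives contextual integrity a small, regular structure: actors (subject, sender, recipient), attributes, and transmission principles, with a norm either permitting or prohibiting a given flow. As an illustration only, that structure can be encoded as data; every name below, and the health-care norm itself, is my own toy paraphrase of the quoted example, not a formal model from the article.

```python
# Toy encoding of a context-relative informational norm (per the quoted
# passage): a Flow carries the actor/attribute/principle parameters, and
# a Norm says which flows adhere to entrenched expectations. The subject
# is part of the actor triple but this simplified norm constrains only
# sender, recipient, attribute, and principle.
from dataclasses import dataclass


@dataclass(frozen=True)
class Flow:
    subject: str     # whom the information is about
    sender: str
    recipient: str
    attribute: str   # type of information
    principle: str   # constraint under which the information flows


@dataclass(frozen=True)
class Norm:
    senders: frozenset
    recipients: frozenset
    attributes: frozenset
    principles: frozenset

    def permits(self, flow: Flow) -> bool:
        """A flow adheres to the norm when every parameter is expected."""
        return (flow.sender in self.senders
                and flow.recipient in self.recipients
                and flow.attribute in self.attributes
                and flow.principle in self.principles)


# Health-care context from the quoted example: sharing with a specialist
# as needed adheres to the norm; selling to a marketer violates it.
health_norm = Norm(
    senders=frozenset({"physician"}),
    recipients=frozenset({"specialist", "patient"}),
    attributes=frozenset({"medical-record"}),
    principles=frozenset({"need-to-treat"}),
)
```

The point of the encoding is the one CDT makes above: the violation is detectable at the moment of the flow itself, independent of any later abuse or harm.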
Re: [ietf-privacy] Article, toward a definition of privacy (or perhaps not).
On 12/18/13 10:15 AM, Nat Sakimura wrote:
> Indeed the context matters, especially as consent is only given in a
> context.

There are a lot of contexts where consent is problematic to obtain,
where people simply click right through informed-consent prompts,
and/or where obtaining consent is directly against the public interest
(e.g., public-health monitoring of disease would not work very well if
folks could opt out of such data sharing).

I say this because this is a big difference between the US and EU
views on privacy regulation, with the EU favoring explicit, informed
consent pretty heavily. I think the US view is less coherent, but it
would probably be characterized as: consent or opt-in is required for
especially sensitive contexts, demographics, and data types.

Nissenbaum's book talks at length about how consent is increasingly
outdated and ineffective, and about how shifting the obligation to
respect context onto the data-management side has many benefits but is
quite hard to police and enforce.

best, Joe

> Also, actors are very important, since even identification has to do
> with the observer and the domain. (Otherwise, we would not have
> notions such as "partially anonymous" and "partially unlinkable".)
>
> Also, issues around generated/inferred attributes are important.
> Acquired attributes + auxiliary knowledge may generate additional
> attributes. This is often captured as "use" or "acquisition" and left
> implicit, but it is worth noting.
>
> Nat
>
> 2013/12/19 Joseph Lorenzo Hall <j...@cdt.org>:
>> On 12/18/13 8:17 AM, S Moonesamy wrote:
>>> I suppose, to avoid confusion, it probably is better to use the
>>> definition portion of it instead of the defined word in the usual
>>> conversation. There has been some discussion on other IETF mailing
>>> lists about the definition of the word privacy. Warren and Brandeis
>>> are often cited in a U.S. context. The right of personal immunity
>>> is broader than privacy.
>> [...]
>> [1]: http://www.amacad.org/publications/daedalus/11_fall_nissenbaum.pdf
>> [2]: http://www.futureofprivacy.org/wp-content/uploads/Brookman-Why-Collection-Matters.pdf
>
> --
> Nat Sakimura (=nat)
> Chairman, OpenID Foundation
> http://nat.sakimura.org/
> @_nat_en