In Times of Distrust, Innovation and Collaboration will be Key 

By Angelique Carson, CIPP/US

The Internet has become a prison. A prison in which the warden can see all of the prisoners, but none of the prisoners can see each other, or the warden. That is because what Silicon Valley knows how to do best is collect user data, without notifying the user that it’s doing so or for what purpose, and then sell it for profit. But it shouldn’t be that way, and it doesn’t have to be.

That’s according to Michael Fertik, founder and CEO of Reputation.com, speaking in San Francisco’s sun-cloaked Moscone Center—somewhat appropriately located directly across from a Target store, as one RSA Conference attendee tweeted—where information security professionals packed into a conference room to engage with expert panelists at an IAPP privacy training session on the myriad threats to privacy facing anyone who touches data today.

It’s an incredibly relevant time to have such a conversation, Fertik said, because we’re in a “Michelangelo” moment in privacy. He was referring to the computer virus discovered in 1991 that sent waves of panic across the infosec world and eventually hit even mainstream media, drama that perhaps outstripped the virus’s actual impact but that set in motion an ecosystem of antivirus software and, eventually, policies that bolstered the shift.

Fast-forward to 2014 and, no matter what one’s politics, it’s hard to deny the worldwide hype surrounding privacy and surveillance, thanks to Edward Snowden. “We are finally getting awareness to the general public,” Fertik said.

However, while the masses are finally getting up to speed on data collection and how the Internet actually works, one of the great remaining myths of the Internet is that “it’s about you,” Fertik said. “It’s not. You are not the subject or the verb in this sentence; you are the object.”

But that can change, and infosec professionals can play a key role, Fertik said. And maybe in this scenario, as with the rise of antivirus software in the ’90s, we’ve landed a net positive. After all, innovative technology is rising to the occasion; just look at the number of start-ups responding to the public disdain over government spying: Silent Circle and Snapchat as alternatives to traditional e-mail and photo-messaging, for example.

Or perhaps, even better, the answer may lie in a shift by companies already up and running to include users in decisions about monetizing their data.

Panelist Jules Polonetsky, CIPP/US, co-founder of the Future of Privacy Forum, 
said the ambiguity lies in drawing lines in the sand. Where are the boundaries 
when it comes to user expectations? 

It’s hard to say.

“The majority [of users are] not looking for privacy,” Polonetsky said. “They’re looking to not get screwed. They are not even thinking of it as privacy. They want to know, ‘Is there a security concern? What are you going to do with [my data]?’ … But there’s a lot of wiggle room in that middle ground for us to give people a fair deal and treat them well,” he said.

For many brands, though, it’s hard to even define what would be a fair deal to the consumer, because it’s never one-size-fits-all when it comes to privacy, said panelist Anne Toth, co-founder at Cover Media Labs and former head of privacy and policy at Google and chief trust officer at Yahoo.

“Privacy is highly elastic,” Toth said. “It means different things to different people. As a business, what you’re trying to do is manage consumer expectations. Because a ‘privacy violation’ is when my expectations have been violated: ‘I thought you did this, but in fact you did that.’”

Adding to the problem is that there’s no efficient way to tell consumers exactly what your brand is doing with all of their data, and, compounding that, government is persistently asking brands for huge data troves—as we now know. But that’s nothing new, she said. Such data aggregation has been happening for a long time.

How Do Brands Establish Trust? 

Attorney Stan Crosley, CIPP/US, CIPM, currently of Drinker Biddle & Reath, agreed, adding that expectations and definitions come from consumers’ life experiences, and privacy harm is very hard to nail down as a concept.

“People understand if someone takes their wallet and walks away, that’s a tangible harm; they’ve just lost their money,” he said. “But what is really at risk if they’re on a website? So someone sends them annoying e-mails. There is opt-out,” he said of lax consumer attitudes.

All of this is to say that the current model for communicating with consumers is clearly broken, the panel agreed. Crosley noted that there’s a complete disconnect between the law and consumer expectations: even the “awe-inspiring” privacy notices he writes never get read. “Unless my client gets sued,” he noted, which means privacy folks, in the end, are essentially writing privacy policies for the regulators, not the consumers.

“Now, how do you get to transparency?” he asked the crowd. “How do you get to 
helping people understand what’s being done so they can make choices that are 
worthwhile?” 

Toth agreed that it’s complicated but said it comes down to trust. That means consumers typically make choices based on brands they’re comfortable with. But that can get tricky, because in this era of mergers and acquisitions, the company a consumer makes an agreement with one day may not remain the same company in the end. She noted Facebook’s recent acquisition of WhatsApp: Did the consumer who agreed to WhatsApp’s policies know that one day that would mean agreeing to Facebook’s policies?

Compounding the issue, she said, is that trust relationships aren’t built in a day.

“They are built over a series of interactions over a long period of time. You can’t ask users to fill out a form three pages long to get to free web-based e-mail. Why do you need to know my age, my profession? You don’t,” Toth said. “You need to contextualize the information. Having a privacy policy is a legal thing. You need to have one, but don’t rely on that to build relationships on trust. It’s about all of those interactions when you say, ‘I need that information so I can do this for you.’ You need to give indications as to why you’re doing it and make them understand it in a way that is relevant and contextual to them.”

The Future? Collaboration, Innovation 

So how do we solve these problems? 

Polonetsky said it’s up to the IT professionals in the room and beyond. 

“I’m optimistic that smart technologists are going to help design systems that users want,” he said, adding that it’s going to come down to UI design. If technology is capable of driving a car or flying a plane on its own, “we need to at least make a browser that lets me understand what’s happening with my data,” he said.

But Crosley was quick to say it’s important not to confuse “control” with data 
privacy. 

“It’s incredibly unethical for companies to rely on you to decide what you think is the best or most harmful use of your data, when many entities have a far greater understanding of how that data can be used for good or harm,” he said. “Privacy shouldn’t just be about control.”

Toth said regulating the use of data is the only answer in a world where data is ubiquitous.

“We have to change our model to appreciate that. We are quickly approaching a 
world where we are going to have to be making some really hard choices about 
the benefits we want and exactly how we’re going to regulate and manage that,” 
she said. To do that, infosec professionals and privacy professionals will need 
to get on the same page. 
