On 2019-01-28 19:46, billol...@gmail.com wrote:
> On Monday, January 28, 2019 at 10:27:32 AM UTC-5, gold...@riseup.net wrote:
>> On 2019-01-27 19:15, billol...@gmail.com wrote:
>> > On Sunday, January 27, 2019 at 12:22:03 PM UTC-5, unman wrote:
>> >>[snip]
>> >> Qubes provides a framework for using software - it doesn't take away the
>> >> onus on users to use that software properly, and to ensure they are aware
>> >> of good practice.  (As an aside I'm always baffled by people querying
>> >> how they can use Facebook under Tor or Whonix. What are they thinking?)
>> >> I regularly audit templates with tripwire, running from an
>> >> offline OpenBSD qube, and do standards checks with debsums. I do a
>> >> good deal of my work offline in OpenBSD and then transfer it,
>> >> encrypted, into other qubes for transmission. That seems like
>> >> overkill, and isn't for everyone: it might be for you.
>> >>
>> >> unman
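[As a minimal sketch of the debsums "standards check" idea mentioned above - assuming, purely for illustration, a Debian template named "debian-10" with debsums installed, and qvm-run available in dom0; this is not necessarily unman's actual setup:]

    #!/usr/bin/env python3
    # Illustrative only: from dom0, run debsums inside a template and
    # list any packaged files whose checksums no longer match the
    # package metadata. The template name is an assumption; substitute
    # your own.
    import subprocess

    TEMPLATE = "debian-10"  # hypothetical template name

    # qvm-run --pass-io relays the command's stdout back to dom0.
    result = subprocess.run(
        ["qvm-run", "--pass-io", TEMPLATE, "debsums -c"],
        capture_output=True, text=True,
    )

    changed = [line for line in result.stdout.splitlines() if line.strip()]
    if changed:
        print("Files differing from their packaged checksums:")
        for path in changed:
            print("  " + path)
    else:
        print("debsums reports no changed files in " + TEMPLATE)

[The point of driving such checks from a separate, offline qube is that a compromised template cannot easily tamper with the tool doing the checking.]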
>> >
>> > I think this is the most important thing you wrote. I used to do
>> > network security for a small scientific government network back in the
>> > 1990s, and I constantly ran into the problem that there is an inverse
>> > relationship between security and usability.  The scientists on my
>> > network were constantly finding ways of going around whatever security
>> > measures I put in place precisely because they didn't want to deal
>> > with the "hassle."
>> >
>> > But I'm no different, really.  Not too many years ago, I routinely
>> > disabled SELinux when I installed a new OS simply because I considered
>> > it too much of a hassle to learn how to use it effectively.  It made
>> > it difficult for me to do stuff.  Everybody yelled at me, but it just
>> > wasn't worth the effort to me. Now, I've learned it a bit and it's not
>> > such a hassle.
>> >
>> > There's this balance between the inconvenience/damage associated with
>> > an intrusion versus the inconvenience associated with the security
>> > configuration.  For me on the computer I'm using as I write this, it
>> > wouldn't be the end of the world if *everything* on my computer were
>> > owned by someone else.  It would be a hassle, but not fatal -- I have
>> > insurance, etc. for the financial information I have here, and I don't
>> > really care if someone sees the email conversations I have on this
>> > machine.
>> >
>> > So, consider the financial stuff, for instance.  There's a hassle
>> > with someone getting my credit card information.  It's happened
>> > (though not because of a computer glitch).  My card gets frozen, I
>> > can't use it for a week or two, I have to make a bunch of phone calls,
>> > etc.  But I'm financially protected and eventually I'll be fine.  The
>> > problem is the hassle factor, not financial ruin. My biggest security
>> > concern is someone using up all my bandwidth; I live in a rural area
>> > and have metered service.  Someone using up 5 gigs of bandwidth is more
>> > concerning to me than them owning 5 gigs of data from my machine. So,
>> > I have to ask myself, which is more hassle -- dealing with the
>> > intrusion, or dealing with the security hassle?
>> >
>> > It's my responsibility to determine where that balance is, and nobody
>> > else's.  And it's likely different for everybody.  For instance, I
>> > used to have a blog, but I'm a litigation consultant and I started
>> > seeing my blog posts turning up in court.  So I don't blog any more. I
>> > can't be on Facebook, or LinkedIn, or Doximity, or ResearchGate.
>> > That's not a problem for me, but it would drive my wife crazy.  I use
>> > one laptop for some stuff, and I use a different laptop, differently
>> > configured, for other stuff.
>> >
>> > And, the higher up the food chain you go with respect to people
>> > interested in surveilling you, the less chance you have of keeping
>> > them out.  I'm out of the business now, but back in the day I
>> > occasionally did some classified work. I remember some years ago, I
>> > called a friend of mine who worked for the government.  I called him
>> > using the work phone of an acquaintance to ask him a technical
>> > question.  He picked up the phone and immediately said "Hey, Bill, how
>> > you doing?"
>> >
>> > I was stunned. I asked him how the hell he knew it was me.  He said
>> > "Bill, I'm with the <government agency>.  We always know where you
>> > are."
>> >
>> > I have another friend who spent his early career working for a
>> > government contractor.  His job was to break into people's houses at
>> > night and install keyloggers on their computers. With a subpoena, of
>> > course.  All the security software in the world won't help you with
>> > that.
>> >
>> > The key, for me, is to achieve the maximum security that I can achieve
>> > and stay below my maximum hassle tolerance.  Qubes is nice because it
>> > adds a big uptick in transparent security with only a small uptick in
>> > hassle -- at least for someone who is fairly conversant with sysadmin
>> > stuff.  So for me it's a big win. But it's not all there is.
>> >
>> > There's no such thing as perfect security.  There's only finding the
>> > balance between one's perceived risk, one's actual vulnerability, and
>> > one's tolerance for hassle.  And any security configuration is
>> > self-defeating if:
>> >
>> > 1) People take it for granted and think that it's all they have to
>> > think about, and/or
>> > 2) It's enough of a hassle that you start going around it to do your work.
>> >
>> >
>> > billo
>>
>>
>> To Billollib.
>>
>> To summarise your post: 1/ It's the users of software that subvert
>> OSes and/or software. 2/ I've nothing to hide, so why bother.
>> Both topics, I feel, are hijacking this thread, which concerns the APT
>> vulnerability within Debian. That's not to say your points aren't
>> important; they are. And for that reason I'll provide a response,
>> which is based largely on the opinions of recognised experts in their
>> fields:
>>
>> 1/ Users are to blame. It's that classic argument put forward by a
>> minority of software developers: "I've created a wonderful system and
>> it's the users who subverted it." We've seen this approach from the
>> likes of Mr Zuckerberg, who blames the users of Facebook for allowing
>> their privacy to be invaded. In reality, of course, his software is
>> carefully crafted to covertly exploit users' privacy and then maximise
>> revenue streams by covertly selling their data to advertisers and
>> political lobbyists; e.g. Cambridge Analytica. I and many others
>> believe that Twitter, Microsoft, Apple et al. operate similar business
>> models based on stealing data and up-selling it. Of course, when a
>> data breach is made public, their highly paid "spin doctors" will
>> invariably concoct a yarn blaming users or anyone else they can think
>> of and then send it to their buddies in the mainstream media for
>> publication.
>>
>> 2/ I've nothing to hide, so why bother. It's this submissive, cavalier
>> and defeatist approach to online privacy that's regularly promoted, in
>> the form of propaganda, by virtually all mainstream media outlets on
>> behalf of their owners: a small clan of very rich "global elites" who
>> attempt, and largely succeed in, exerting control and influence over
>> the masses, i.e. us peasants.
>>
>> I think the most appropriate and succinct reply I've seen is: "If you
>> think privacy is unimportant for you because you have nothing to hide,
>> you might as well say free speech is unimportant for you because you
>> have nothing useful to say".
>>
>> You might also wish to read this in-depth response:
>> https://www.aclu.org/blog/national-security/secrecy/you-may-have-nothing-hide-you-still-have-something-fear.
>>
>> Last but certainly not least, here's a quote from that renowned
>> privacy-supporting journalist Glenn Greenwald, the guy who broke the
>> Ed Snowden revelations: "Over the last 16 months, as I've debated this
>> issue around
>> the world, every single time somebody has said to me, "I don't really
>> worry about invasions of privacy because I don't have anything to hide."
>> I always say the same thing to them. I get out a pen, I write down my
>> email address. I say, "Here's my email address. What I want you to do
>> when you get home is email me the passwords to all of your email
>> accounts, not just the nice, respectable work one in your name, but all
>> of them, because I want to be able to just troll through what it is
>> you're doing online, read what I want to read and publish whatever I
>> find interesting. After all, if you're not a bad person, if you're doing
>> nothing wrong, you should have nothing to hide." Not a single person
>> has taken me up on that offer." (Glenn Greenwald, "Why privacy
>> matters", TED Talk)
> 
> Your summary is incorrect, and you do me a disservice by misstating
> what I wrote and then arguing against your misstatements.
> 
> I did not say "I don't have anything to hide."  What I wrote, if you
> reread it carefully, is that I modulate my security demands according
> to the balance of usability and security.  You will note that I said
> that I use multiple computers, each with different security
> configurations.  The stuff I "have to hide" I put on a different
> computer than this one.
> 
> It may be your choice to put all your eggs in one basket.  I do not. 
> That doesn't mean my position is that "I don't have anything to hide."
>  What I do is I recognize that any outward facing box is a threat. 
> Any outward facing box that you use to surf the web, try out new
> software, and generally play around with is a greater threat.   What I
> wrote was that I don't put my crown jewels on the box that is likely
> to get compromised.  That is *very* different from saying "I don't
> have anything to hide" or "I don't need security."
> 
> So you don't need to lecture me on whether or not I need security. 
> You need to realize that all threats are not the same, and all
> security requirements for all places are not the same.  That's why
> security is different when you enter your average grocery store vs
> your average courthouse vs your average military base vs the FBI
> headquarters vs Langley.  And my security requirements are different
> for this machine than they are for the machine, for instance, that I
> keep medical or legal information on, or my Android phone.  I don't do
> things using my smartphone that I don't want someone surveilling. 
> It's not that "I don't have anything to hide, so why bother."  It's "I
> know there's nothing I can do to make it secure, so I won't do things
> on it that I don't want surveilled."
> 
> And I'm not "blaming" anybody.  What I'm saying is that ultimately
> every person has responsibility for their own security.  And if you
> don't believe that, you will never *have* good security, because you
> will always be standing in the ashes of your privacy complaining that
> someone else didn't take care of it for you.
To Billollib,

First, it's disappointing you didn't apologise for hijacking my thread.

Second, you complain I misrepresented you in my summary. Perhaps you
forget writing the following: "I used to do
> > network security for a small scientific government network back in the
> > 1990s, and I constantly ran into the problem that there is an inverse
> > relationship between security and usability.  The scientists on my
> > network were constantly finding ways of going around whatever security
> > measures I put in place precisely because they didn't want to deal
> > with the "hassle."
My summary of the above was: "It's the users of software that subvert
OSes...". I think that's a fair summary of what you said about users -
in this case, scientists.

Finally: please, no further hijacking of other people's posts.

