Adam,

There is a difference between telling someone "you should *trust* this
software" and telling them "this software is probably going to work for you
because of X Y & Z."

I feel like you are conflating two different issues. I firmly believe you
should *never* just *trust* encryption software that is not open to
independent auditing at *any time.*

However, we don't live in an open source utopia yet, so yes, we make
judgement calls based on what information *is* available to the public. But
I think you're making a bit of a tempest in a teapot here.

(Yes I realize I am possibly the last person who should be making such
comments, though I'm trying to be better about it.)

Whether or not code *IS* secure is not the issue. It is whether or not you
should *TRUST* code that cannot be *VERIFIED SECURE* and verified
*INDEPENDENTLY AT ANY TIME*.

You might believe Apple or Google are secure; in fact, I would be willing to
believe Facebook is doing its damnedest to keep its servers and users' data
secure, **within their closed paradigms**, which may or may not line up
with my needs as an individual user at any given time. And I can't engage
in informed consent in that process, except insofar as I consent that I do
not get to know Corporation X's paradigm.

regards

Brian

PS Even crypto-gods are fallible, and that's not a bad thing; it's just
human nature.

On Tue, Feb 19, 2013 at 10:00 AM, Adam Fisk <a...@littleshoot.org> wrote:

> On Fri, Feb 15, 2013 at 2:01 PM, Nadim Kobeissi <na...@nadim.cc> wrote:
> > On Fri, Feb 15, 2013 at 4:35 PM, Adam Fisk <af...@bravenewsoftware.org>
> > wrote:
> >>
> >> I'm certainly more confident in the overall security of silent circle in
> >> its first release than I was in the overall security of cryptocat.
> >
> >
> > Of course this is true. The first release of Cryptocat was made in early
> > 2011 by me back when I was in my second year of university and only barely
> > beginning to understand proper programming and security practice. It was an
> > experimental product full of holes and by no means secure. The first release
> > of Silent Circle was by a team of superheroes with 25 years of experience in
> > being totally badass. Big difference!
>
> That's really my point exactly -- there are many things that determine
> the security of a piece of software.
>
> >
> > But when your model is closed-source, you're not participating in
> > reviewable, verifiable security practice and you're negatively affecting the
> > practical cryptography industry as a whole. Look at Cryptocat — it
> > progressed from a toy into a real product that I'm proud of, and one that
> > fully passed a security audit with a 100/100 score just last week
> > (https://blog.crypto.cat/2013/02/cryptocat-passes-security-audit-with-flying-colors/)
> > after two years of hard work, restructuring and redesigning the whole thing,
> > and getting alternately beaten up and helped by experts in the field. This
> > would have *never* happened had we not been open source from the beginning.
>
> Sure. Again, I believe that open source is a beneficial license for
> security, but we have to keep in mind that it's a means to an end --
> secure code -- and that it's not the only means. I think you were
> beaten up unfairly under the circumstances for cryptocat 1, and I
> similarly think we're beating up Silent Circle unfairly.
>
> >
> > Being open source is a painful but necessary process. It invites criticism,
> > bone-breaking, and having to admit bad design, apologize for your mistakes,
> > and work hard on fixing them. But only through that process do you create
> > something great that benefits the security community by offering
> > opportunities to learn. Sure, Silent Circle started off as a good product,
> > but by being closed-source they disregard the proper practice of what makes
> > this industry progress in terms of engineering, and they cast a shadow of
> > uncertainty and closed progress upon themselves, too.
> >
>
> There are so many aspects that go into software licensing that I
> just don't draw that same line. If the goal is secure code, I again
> think the key is having an adequate number of capable people analyzing
> and dissecting that code on a constant basis. That can mean closed-source
> code audits, and it can mean having a full-time security team
> analyzing and improving the code at all times (Google, Facebook, many
> others), regardless of the software license. Open source is awesome,
> and I believe in it wholeheartedly, but I don't think an organization
> that doesn't open-source its code is automatically crazy and kicked
> out of the club.
>
> -a
> --
> Unsubscribe, change to digest, or change password at:
> https://mailman.stanford.edu/mailman/listinfo/liberationtech
>



-- 



Brian Conley

Director, Small World News

http://smallworldnews.tv

m: 646.285.2046

Skype: brianjoelconley
