[Wikitech-l] New password hashing proposal

2010-08-19 Thread Tim Starling
It's been said (e.g. [1]) that hashing passwords with two rounds of
MD5 is basically a waste of time these days, because brute-forcing
even relatively long passwords is now feasible with cheap hardware.
Indeed, you can buy software [2] which claims to be able to check 90
million MediaWiki passwords per second on an ordinary GPU. That would
let you crack a random 8-letter password in 20 minutes.
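
(For reference: 26^8 is about 2.1e11 candidates, and at 9e7 checks per
second that is roughly 2,300 seconds, i.e. about 40 minutes to exhaust
the space, or 20 minutes on average.)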

So the time has probably come for us to come up with a C-type
password hashing scheme, to replace the B-type hashes that we use at
the moment. I've been thinking along the lines of the following goals:

1. Future-proof: should be adaptable to faster hardware.
2. Upgradeable: it should be possible to compute the C-type hash from
the B-type hash, to allow upgrades without bothering users.
3. Efficient in PHP, with default configure options.
4. MediaWiki-specific, so that generic software can't be used to crack
our hashes.

The problem with the standard key strengthening algorithms, e.g.
PBKDF1, is that they are not efficient in PHP. We don't want a C
implementation of our scheme to be orders of magnitude faster than our
PHP implementation, because that would allow brute-forcing to be more
feasible than is necessary.

The idea I came up with is to hash the output of str_repeat(). This
increases the number of rounds of the compression function, while
avoiding tight loops in PHP code.
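
For a single round, that amounts to something like this (a minimal
sketch; the full scheme and the actual demo code are further down):

  // One strengthening step: the hash extension's C code chews through
  // 100 copies of the previous value, so the per-round cost sits in the
  // compression function rather than in a PHP-level loop.
  $x = hash( 'whirlpool', str_repeat( $x, 100 ) );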

PHP's hash extension has been available by default since PHP 5.1.2,
and we can always fall back to using B-type hashes if it's explicitly
disabled. The WHIRLPOOL hash is supported. It has no patent or
copyright restrictions so it's not going to be yanked out of Debian or
PHP for legal reasons. It has a 512-bit block size, the largest of any
hash function available in PHP, and its security goals state that it
can be truncated without compromising its properties.

My proposed hash function is a B-type MD5 salted hash, which is then
further hashed with a configurable number of invocations of WHIRLPOOL,
with a 256-bit substring taken from a MediaWiki-specific location. The
input to each WHIRLPOOL operation is expanded by a factor of 100 with
str_repeat().

The number of WHIRLPOOL iterations is specified in the output string
as a base-2 logarithm (whimsically padded out to 3 decimal digits to
allow for future universe-sized computers). This number can be
upgraded by taking the hash part of the output and applying more
rounds to it. A count of 2^7 = 128 gives a time of 55ms on my laptop,
and 12ms on one of our servers, so a reasonable default is probably
2^6 or 2^7.
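
Roughly, in code (a sketch only, not the actual demo code linked below;
the function name and the exact substring offset are just illustrative):

  function wfPasswordHashC( $password, $salt, $log2 = 7 ) {
      // Start from the existing B-type salted MD5 hash.
      $hash = md5( $salt . '-' . md5( $password ) );

      // 2^$log2 WHIRLPOOL rounds, each over a 100x-expanded input. The
      // result is truncated to 256 bits (64 hex chars) every round, so
      // that further rounds can later be applied to a stored hash.
      for ( $i = 0, $n = 1 << $log2; $i < $n; $i++ ) {
          $full = hash( 'whirlpool', str_repeat( $hash, 100 ) ); // 128 hex chars
          $hash = substr( $full, 32, 64 ); // 256-bit slice; offset is illustrative
      }

      return sprintf( ':C:%03d:%s:%s', $log2, $salt, $hash );
  }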

Demo code: http://p.defau.lt/?udYa5CYhHFrgk4SBFiTpGA

Typical output:
:C:007:187aabf399e25aa1:9441ccffe8f1afd8c277f4d914ce03c6fcfe157457596709d846ff832022b037

-- Tim Starling

[1] http://www.theregister.co.uk/2010/08/16/password_security_analysis/

[2] http://www.insidepro.com/eng/egb.shtml




Re: [Wikitech-l] New password hashing proposal

2010-08-19 Thread Daniel Kinzler
Tim Starling wrote:
 It's been said (e.g. [1]) that hashing passwords with two rounds of
 MD5 is basically a waste of time these days, because brute-forcing
 even relatively long passwords is now feasible with cheap hardware.
 Indeed, you can buy software [2] which claims to be able to check 90
 million MediaWiki passwords per second on an ordinary GPU. That would
 let you crack a random 8-letter password in 20 minutes.

I don't know that much about the mathematical details of hashing, but I'd like
to drop a pointer to an article I found interesting in this context:

Stop using unsafe keyed hashes, use HMAC
http://rdist.root.org/2009/10/29/stop-using-unsafe-keyed-hashes-use-hmac/

So, how does your proposal relate to HMAC?

-- daniel



Re: [Wikitech-l] New password hashing proposal

2010-08-19 Thread Robert Rohde
On Wed, Aug 18, 2010 at 11:37 PM, Tim Starling tstarl...@wikimedia.org wrote:
snip

 The idea I came up with is to hash the output of str_repeat(). This
 increases the number of rounds of the compression function, while
 avoiding tight loops in PHP code.
snip
 My proposed hash function is a B-type MD5 salted hash, which is then
 further hashed with a configurable number of invocations of WHIRLPOOL,
 with a 256-bit substring taken from a MediaWiki-specific location. The
 input to each WHIRLPOOL operation is expanded by a factor of 100 with
 str_repeat().

snip

Let me preface my comment by saying that I haven't studied WHIRLPOOL,
and the following may not apply to it at all.

However, it is known that some block cipher based hashes behave poorly
when fed repeated copies of the same block.  In the worst cases the
hash space is substantially truncated from its full size (which
probably is not the case for any serious cryptographic hash function).
 Under less severe cases, cryptanalysis can find a new block cipher W'
such that N applications of block cipher W is the same as one
application of W'.  If WHIRLPOOL is vulnerable to that kind of attack
then it would negate the effect of using str_repeat in your code.

Like I said, I don't know if either concern applies to WHIRLPOOL.
However, these concerns only occur because the 256-bit string you are
repeating is a fundamental divisor of the 512-bit block size used by
WHIRLPOOL.  So, it is trivial to avoid the whole issue simply by using
a different repeated block size.  For example 97 copies of a 33 byte
string should have essentially the same computational performance,
while making any associated cryptanalysis threat impossible (or at
least less likely).
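
Concretely, something like the following, where $unit32 and $unit33
stand for the repeated value (illustrative variable names; the only
point is that 33 bytes does not divide WHIRLPOOL's 64-byte block):

  // Repeating a 32-byte (256-bit) unit fills every 64-byte WHIRLPOOL
  // block with the same content:
  $expanded = str_repeat( $unit32, 100 );
  // ... repeating a 33-byte unit instead makes consecutive 64-byte
  // blocks differ, at essentially the same cost:
  $expanded = str_repeat( $unit33, 97 );   // 97 * 33 = 3201 bytes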


My only other comment is something you presumably already know.  Your
proposal is still nothing but an arms race.  It makes hashes harder to
crack by making the hash function itself much more computationally
expensive.  However, you'd still have to periodically boost the rep
rate with the intention of staying far in front of the hackers.

As a complementary approach it would be nice if there were something in
MediaWiki to aid in the selection of strong passwords.  Regardless of
hash function, it will still take about two billion times longer to
find one 10-character password in [A-Za-z0-9] than it does to find a
6-character password in [a-z].  Even if password strength testing
algorithms were disabled on Wikipedia sites, it would still be a nice
addition to have in the MediaWiki codebase in general.
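
(For the record, 62^10 / 26^6 is roughly 8.4e17 / 3.1e8, i.e. about 2.7
billion, which is where the "about two billion" figure comes from.)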

-Robert Rohde



Re: [Wikitech-l] New password hashing proposal

2010-08-19 Thread Tim Starling
On 19/08/10 18:45, Daniel Kinzler wrote:
 Tim Starling wrote:
 It's been said (e.g. [1]) that hashing passwords with two rounds of
 MD5 is basically a waste of time these days, because brute-forcing
 even relatively long passwords is now feasible with cheap hardware.
 Indeed, you can buy software [2] which claims to be able to check 90
 million MediaWiki passwords per second on an ordinary GPU. That would
 let you crack a random 8-letter password in 20 minutes.
 
 I don't know that much about the mathematical details of hashing, but I'd like
 to drop a pointer to an article I found interesting in this context:
 
 Stop using unsafe keyed hashes, use HMAC
 http://rdist.root.org/2009/10/29/stop-using-unsafe-keyed-hashes-use-hmac/
 
 So, how does your proposal relate to HMAC?

HMAC is for secret keys; there's no secret key in this scheme.

That article mentions collision and second-preimage attacks. As far as
I can determine, neither is relevant to a password hashing scheme.

Say you knew someone's password. Then a second-preimage attack
would allow you to construct a new, longer password which also allowed
you to log in as them. This would be a waste of time, though, since you
could have just logged in with the original password.

Similarly, nobody really cares if you can construct two long
passwords, set one in your preferences, and use the other to log in.
That's all a collision lets you do.

The security goals for password hashing are quite different to those
for message authentication, and less well-studied. Key-strengthening
algorithms use hashing as a proof of work, so a break would simply be a
faster way to evaluate the function. Hash designers usually consider
such optimisations useful, not a break.

-- Tim Starling




Re: [Wikitech-l] New password hashing proposal

2010-08-19 Thread Tim Starling
On 19/08/10 19:02, Robert Rohde wrote:
 Let me preface my comment by saying that I haven't studied WHIRLPOOL,
 and the following may not apply to it at all.
 
 However, it is known that some block cipher based hashes behave poorly
 when fed repeated copies of the same block.  In the worst cases the
 hash space is substantially truncated from its full size (which
 probably is not the case for any serious cryptographic hash function).
  Under less severe cases, cryptanalysis can find a new block cipher W'
 such that N applications of block cipher W is the same as one
 application of W'.  If WHIRLPOOL is vulnerable to that kind of attack
 then it would negate the effect of using str_repeat in your code.
 
 Like I said, I don't know if either concern applies to WHIRLPOOL.
 However, these concerns only occur because the 256-bit string you are
 repeating is a fundamental divisor of the 512-bit block size used by
 WHIRLPOOL.  So, it is trivial to avoid the whole issue simply by using
 a different repeated block size.  For example 97 copies of a 33 byte
 string should have essentially the same computational performance,
 while making any associated cryptanalysis threat impossible (or at
 least less likely).

I think it's unlikely that repeated input would expose any structure
in a modern cryptographic hash function. But the solution you suggest
is simple enough, so we may as well add it.

 My only other comment is something you presumably already know.  Your
 proposal is still nothing but an arms race.  It makes hashes harder to
 crack by making the hash function itself much more computationally
 expensive.  However, you'd still have to periodically boost the rep
 rate with the intention of staying far in front of the hackers.

Yes, I mentioned that I included the ability to add more iterations
without knowing the original password. That's why the hash has to be
truncated to the output size at each iteration, instead of using the
full hash as an internal state and truncating once before output.
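
In code, the upgrade is then just more of the same loop applied to the
stored hash part (a sketch consistent with the one earlier in the
thread; the function name is illustrative):

  function wfUpgradePasswordHashC( $cHash, $newLog2 ) {
      // ':C:007:<salt>:<hash>' -> apply the missing 2^new - 2^old rounds.
      list( , , $oldLog2, $salt, $hash ) = explode( ':', $cHash );
      for ( $i = 1 << (int)$oldLog2, $n = 1 << $newLog2; $i < $n; $i++ ) {
          $full = hash( 'whirlpool', str_repeat( $hash, 100 ) );
          $hash = substr( $full, 32, 64 ); // same slice as before
      }
      return sprintf( ':C:%03d:%s:%s', $newLog2, $salt, $hash );
  }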

 As a complementary approach it would be nice if there were something in
 MediaWiki to aid in the selection of strong passwords.  Regardless of
 hash function, it will still take about two billion times longer to
 find one 10-character password in [A-Za-z0-9] than it does to find a
 6-character password in [a-z].  Even if password strength testing
 algorithms were disabled on Wikipedia sites, it would still be a nice
 addition to have in the MediaWiki codebase in general.

I believe a JavaScript password strength meter was recently added to
the core.

-- Tim Starling





Re: [Wikitech-l] New password hashing proposal

2010-08-19 Thread Jonathan Leybovich
  Tim Starling wrote:
 
 So the time has probably come for us to come up with a C type
 password hashing scheme, to replace the B-type hashes that we use at
 the moment. 

What about using public key cryptography?  Generate a key-pair and use the 
public key to produce your password hashes. Store the private key offline in 
an underground vault just in case someday you'll need to recover the original 
passwords in order to rehash them.  Needless to say the key-pair must be 
entirely for internal use and not already part of some PKI system (i.e. the 
basis for one of Wikimedia's signed SSL certificates).


Re: [Wikitech-l] New password hashing proposal

2010-08-19 Thread Tim Starling
On 20/08/10 00:12, Jonathan Leybovich wrote:
 Tim Starling wrote:
 
 So the time has probably come for us to come up with a C type 
 password hashing scheme, to replace the B-type hashes that we use
 at the moment.
 
 What about using public key cryptography?  Generate a key-pair and
 use the public key to produce your password hashes. Store the
 private key offline in an underground vault just in case someday
 you'll need to recover the original passwords in order to rehash
 them.  Needless to say the key-pair must be entirely for internal
 use and not already part of some PKI system (i.e. the basis for one
 of Wikimedia's signed SSL certificates).

You don't need to store the original passwords in a recoverable form
in order to rehash them. You can just apply extra hashing to the old
hash. This is how the A-B transition worked, and it's how the B-C
transition should work too, unless someone knows of some kind of
cryptographic problem with it. It's a convenient method because it
saves the cost of underground vaults, with no loss in security.
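
For reference, the B-type hash is essentially an extra MD5 layer over
the old A-type hash, which is what made that upgrade possible without
knowing any passwords:

  // A-type: md5( $password )
  // B-type: md5( $salt . '-' . md5( $password ) )
  //       = md5( $salt . '-' . $aTypeHash )
  $bTypeHash = md5( $salt . '-' . $aTypeHash );   // no password needed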

-- Tim Starling





Re: [Wikitech-l] New password hashing proposal

2010-08-19 Thread Ryan Lane
 http://newsarse.com/2010/08/13/if-you-can-remember-your-password-then-its-hopelessly-inadequate-warn-researchers/

 Passwords suck, and people are a problem. Now, if we could distribute
 RSA fobs to every editor ...


We could offer a less secure, but more-secure-than-passwords,
alternative: using email or SMS as a one-time password device. SMS is
obviously more secure than email, but would require us to ask people
for their phone numbers. We could also set up a PKI and allow
certificate login, which is obviously safer than passwords.

The real problem with any system stronger than passwords is that it
requires a level of complexity that would be difficult for us, and
either annoying or very confusing for users.

Respectfully,

Ryan Lane



Re: [Wikitech-l] Demo for XMPP-Based RC-Notifications

2010-08-19 Thread Artur Fijałkowski
2010/8/19 Daniel Kinzler dan...@brightbyte.de:

 2) extra channels that include full text, diffs, etc? UDP is a limiting factor
 here. Alternative transport from PHP to the bridge process?

Named pipes? Of course, only if PHP can keep a named pipe open in
persistent mode. I'm not sure if this is easy or even possible at all
;)

AJF/WarX



Re: [Wikitech-l] New password hashing proposal

2010-08-19 Thread Jonathan Leybovich
Tim Starling wrote:
 You don't need to store the original passwords in a recoverable form
 in order to rehash them. You can just apply extra hashing to the old
 hash. This is how the A-B transition worked, and it's how the B-C
 transition should work too, unless someone knows of some kind of
 cryptographic problem with it. It's a convenient method because it
 saves the cost of underground vaults, with no loss in security.

In that case you could always discard the private portion of the key-pair to
produce a strictly one-way function.  And at least with this scheme you always
do have the option of moving to 'C' regardless of whether it can accept the
end-products of B as inputs.  Plus I would wager that asymmetric ciphers will
stand up to attacks far longer than most hashing functions.



Re: [Wikitech-l] Vector skin failures on mobile phones - any timeframe for a fix?

2010-08-19 Thread Mark A. Hershberger
K. Peachey p858sn...@yahoo.com.au writes:

 I think you guys are experiencing the JavaScript load issues

Why is the mobile redirect left to JavaScript?  Wouldn't it be better
for all concerned if the redirect happened before any PHP was loaded?

Wouldn't it be better for those older phones with little memory if the
redirect happened in PHP?

Why is JS even involved in this?

http://svn.wikimedia.org/viewvc/mediawiki/trunk/extensions/WikimediaMobile/MobileRedirect.js?view=markup
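
A server-side check need not be more than a few lines (user-agent
patterns and target URL here are only illustrative, not the actual
WikimediaMobile logic):

  // Bounce obvious mobile user agents before building the full page.
  $ua = isset( $_SERVER['HTTP_USER_AGENT'] ) ? $_SERVER['HTTP_USER_AGENT'] : '';
  if ( preg_match( '/Mobile|Android|BlackBerry|SymbianOS|Opera Mini/i', $ua ) ) {
      header( 'Location: http://en.m.wikipedia.org' . $_SERVER['REQUEST_URI'] );
      exit;
  }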

Mark.

-- 
http://hexmode.com/

Embrace Ignorance.  Just don't get too attached.



Re: [Wikitech-l] Testing Framework

2010-08-19 Thread Mark A. Hershberger
Trevor Parscal tpars...@wikimedia.org writes:

 I don't know where this landed, but I wanted to point out that "system
 testing" might be a better name for our use of Selenium; "acceptance
 testing" has more of a "customer is accepting a product" connotation.

During our discussion last Friday, Marcus, Priyanka and I decided to
stick with the name of the framework (in this case, “Selenium”) since
there could be other tools used for similar sorts of testing.

In other words, if we named the directory “system” and then had system
tests performed with a tool other than Selenium, we'd be back to square
one — that is “Where do we put the tests and What do we name them?”

Mark.

-- 
http://hexmode.com/

Embrace Ignorance.  Just don't get too attached.


Re: [Wikitech-l] Demo for XMPP-Based RC-Notifications

2010-08-19 Thread Daniel Kinzler
Artur Fijałkowski wrote:
 2010/8/19 Daniel Kinzler dan...@brightbyte.de:
 
 2) extra channels that include full text, diffs, etc? UDP is a limiting 
 factor
 here. Alternative transport from PHP to the bridge process?
 
 Named pipes? Of course only if PHP can keep named pipe open in
 persistent mode. I'm not sure if this is easy or even possible at all
 ;)

Never done much with these, but wouldn't they get confused if multiple PHP
processes accessed a named pipe at once? Also, can a named pipe be connected to
a TCP stream? Hm, I guess the latter could simply be done with cat my.pipe |
socat tcp:...

OTOH, I don't think opening the pipe for every edit (not request!) would be
prohibitively expensive. Might be a possibility. Actually, XMLRC already
supports writing to a file, so no change to the extension would be needed at
all.
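
For what it's worth, the per-edit write is only a few lines (the pipe
path and $rcLine are made up, and it assumes something like socat is
already reading the other end of the FIFO):

  // Opening a FIFO for writing blocks until a reader is attached, so a
  // real implementation would want a non-blocking open or a timeout.
  $fp = @fopen( '/var/run/mediawiki/rc.pipe', 'ab' );
  if ( $fp ) {
      fwrite( $fp, $rcLine . "\n" );
      fclose( $fp );
  }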

-- daniel


Re: [Wikitech-l] New password hashing proposal

2010-08-19 Thread Aryeh Gregor
On Thu, Aug 19, 2010 at 2:37 AM, Tim Starling tstarl...@wikimedia.org wrote:
 The problem with the standard key strengthening algorithms, e.g.
 PBKDF1, is that they are not efficient in PHP. We don't want a C
 implementation of our scheme to be orders of magnitude faster than our
 PHP implementation, because that would allow brute-forcing to be more
 feasible than is necessary.

 The idea I came up with is to hash the output of str_repeat(). This
 increases the number of rounds of the compression function, while
 avoiding tight loops in PHP code.

Seems reasonable.

 PHP's hash extension has been available by default since PHP 5.1.2,
 and we can always fall back to using B-type hashes if it's explicitly
 disabled. The WHIRLPOOL hash is supported. It has no patent or
 copyright restrictions so it's not going to be yanked out of Debian or
 PHP for legal reasons. It has a 512-bit block size, the largest of any
 hash function available in PHP, and its security goals state that it
 can be truncated without compromising its properties.

 My proposed hash function is a B-type MD5 salted hash, which is then
 further hashed with a configurable number of invocations of WHIRLPOOL,
 with a 256-bit substring taken from a MediaWiki-specific location. The
 input to each WHIRLPOOL operation is expanded by a factor of 100 with
 str_repeat().

 The number of WHIRLPOOL iterations is specified in the output string
 as a base-2 logarithm (whimsically padded out to 3 decimal digits to
 allow for future universe-sized computers). This number can be
 upgraded by taking the hash part of the output and applying more
 rounds to it. A count of 2^7 = 128 gives a time of 55ms on my laptop,
 and 12ms on one of our servers, so a reasonable default is probably
 2^6 or 2^7.

That seems reasonable.  It could probably be done a lot faster on GPUs, I guess.

On Thu, Aug 19, 2010 at 4:45 AM, Daniel Kinzler dan...@brightbyte.de wrote:
 I don't know that much about the mathematical details of hashing, but I'd like
 to drop a pointer to an article I found interesting in this context:

 Stop using unsafe keyed hashes, use HMAC
 http://rdist.root.org/2009/10/29/stop-using-unsafe-keyed-hashes-use-hmac/

 So, how does your proposal relate to HMAC?

As Tim said, it doesn't -- we aren't using keyed hashes, and we're
only concerned about preimage attacks (not collision or
second-preimage).  Preimage attacks imply second-preimage attacks, and
second-preimage attacks imply collision attacks.  Thus something
that's secure against collision is also secure against preimage and
second-preimage, but a function with collision and second-preimage
attacks might have no preimage attacks.  For instance, MD5 has tons of
trivial collision attacks against it, but no preimage attacks (not
sure about second-preimage attacks offhand).  Whirlpool has no known
collision attacks, and thus no known preimage or second-preimage
attacks.

On Thu, Aug 19, 2010 at 5:02 AM, Robert Rohde raro...@gmail.com wrote:
 Let me preface my comment by saying that I haven't studied WHIRLPOOL,
 and the following may not apply to it at all.

 However, it is known that some block cipher based hashes behave poorly
 when fed repeated copies of the same block.  In the worst cases the
 hash space is substantially truncated from its full size (which
 probably is not the case for any serious cryptographic hash function).
  Under less severe cases, cryptanalysis can find a new block cipher W'
 such that N applications of block cipher W is the same as one
 application of W'.  If WHIRLPOOL is vulnerable to that kind of attack
 then it would negate the effect of using str_repeat in your code.

In principle, we could evade any concern like this by just using a
provably secure hash function.  The usual reason not to use those is
that they're slow, but that's an advantage in our case.  For instance
(from an exercise in my cryptography course), let p be a prime, q a
prime dividing p - 1, and let G be the subgroup of Z_p^* of order q.
Let g be a randomly chosen generator for G, and let a_1, ..., a_k be
randomly chosen elements of G.  If x, y_1, ..., y_k are integers in
the range 1 to q, define H(x, y_1, ..., y_k) = g^x a_1^y_1 ...
a_k^y_k.  Then under the discrete logarithm assumption for G, it's
easy to prove that H is collision-resistant.  You can make similar
functions based on other hard problems --
http://en.wikipedia.org/wiki/Provably_secure_cryptographic_hash_function
gives another example using factorization, whose correctness is
probably more obvious.
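
As a toy illustration only, using PHP's GMP extension (real parameters
would have to be enormous, and this is not meant as a drop-in password
hash):

  // H(x, y_1..y_k) = g^x * a_1^y_1 * ... * a_k^y_k  (mod p)
  function dlHash( $x, array $ys, $g, array $as, $p ) {
      $acc = gmp_powm( $g, $x, $p );
      foreach ( $ys as $i => $y ) {
          $acc = gmp_mod( gmp_mul( $acc, gmp_powm( $as[$i], $y, $p ) ), $p );
      }
      return gmp_strval( $acc, 16 );
  }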

There are two problems with using such a function.  One is that
there's probably no readily available implementation, so we'd have to
write our own, which might be vulnerable to side-channel attacks.
Another is that we're more worried about brute-forcing than about the
algorithm being broken, so we'd have to write it in C to avoid giving
one or two orders of magnitude advantage to the attacker, and then
shared hosts can't use it.  Perhaps the way to 

Re: [Wikitech-l] New password hashing proposal

2010-08-19 Thread soxred93

 On Thu, Aug 19, 2010 at 10:50 AM, Ryan Lane rlan...@gmail.com wrote:
 We could do a less secure, but more-secure-than-passwords  
 alternative,
 which is to use email or SMS as a one time password device. SMS is
 obviously more secure than email, but would require us to ask people
 for their phone numbers.

I don't do SMS, and I'm sure I'm not the only one who would rather  
not pay to get a password.

-X!



Re: [Wikitech-l] New password hashing proposal

2010-08-19 Thread Aryeh Gregor
On Thu, Aug 19, 2010 at 5:16 PM, Lane, Ryan
ryan.l...@ocean.navo.navy.mil wrote:
 Though SMS has a number of vulnerabilities, as listed in the link, in
 practical terms, it is likely to be safer than email for one time passwords.
 Remember: one time passwords are used as a form of two factor
 authentication. The SMS is sent to something they have after the user enters
 the thing they know. The thing you know in this system is often a password.
 People tend to use the same password everywhere, and a user's Wikipedia
 password is likely their email password, which would make a one time
 password sent to the email less effective.

 With SMS, an attacker would have to know the user's password, and would have
 to intercept the SMS, which isn't easy enough to be worth the trouble.

I don't get what you're suggesting here.  Every time a user logs in,
they need to both enter their password and follow a link in a text
message?  Do you think that more than a double-digit number of
Wikipedia users would actually opt into this?  Or are you talking
about support in MediaWiki to be used on high-security corporate or
government sites, not Wikipedia?

If someone is willing to do the work, I'm not objecting to supporting
this in the software, but if we're talking about Wikipedia, it doesn't
make sense to try going this far.

 But it is, if the private key is stored on a thumb drive (in a crypto
 application), or on a smart card. Even if the private key and password are
 stored on the filesystem unencrypted, an x509 key is safer than a password
 simply because it is *much* more complex, so it is very unlikely to be brute
 forced.

Yes, certificates are usually more secure than passwords.

 This is a pretty smug statement.

I don't think so.  I think it's completely reasonable, when talking
about Wikipedia.  Hackers go after money, and there's no money in
hacking Wikipedia.  We have nothing secret or valuable that's not
already readily available.  We have no black-market competitors who
want to try disrupting our service.  Any malicious action could be
easily reversed.  The worst we have to worry about is someone with a
grudge trying to frame someone else, which has happened, but it's
hardly a pressing concern.

I seriously cannot see more than a few dozen Wikimedia users actually
going to the effort of using one-time passwords over SMS just to
protect their account.  Consequently, it's not reasonable to ask
Wikimedia sysops to do whatever deployment work would be needed to send
SMS from Wikimedia servers, however small that work might be.  Nor is it
reasonable to have the code reviewed, and the option offered (we already
have too many options), when there's so little benefit.

Every feature has an inherent cost in code maintenance and complexity
of use, so features that are too marginal should not be part of the
software.

 I think it would be nice to offer more secure methods of authentication to
 users who choose to take advantage of them. One time passwords would likely
 be too confusing to force on everyone, but they aren't too confusing to
 offer as an option. It also isn't very difficult to implement on the
 authentication server's end either. Also, if we are to act as an OpenID
 provider, it would be pretty nice to offer these more secure alternatives.

There is no point in providing options that virtually no one will use.
It wastes the effort of all the people who have to maintain the
relevant code, and it's yet more distraction on our already
way-too-bloated preferences page.  And it will not be useful to anyone
when someone turns on the preference by mistake and can now no longer
log in because they gave a phone number that doesn't receive SMS, or
whatever.  When few enough people want a preference that more people
are likely to turn it on by mistake than deliberately, and when
there's significant harm or confusion from turning it on by mistake,
that's a sign that it's a bad preference.  (See also: Use external
editor.)

Do you think that more than 0.01% of Wikimedia users will enable any
such preference if provided?



Re: [Wikitech-l] New password hashing proposal

2010-08-19 Thread Lane, Ryan
 There is no point in providing options that virtually no one will use.
 It wastes the effort of all the people who have to maintain the
 relevant code, and it's yet more distraction on our already
 way-too-bloated preferences page.  And it will not be useful to anyone
 when someone turns on the preference by mistake and can now no longer
 log in because they gave a phone number that doesn't receive SMS, or
 whatever.  When few enough people want a preference that more people
 are likely to turn it on by mistake than deliberately, and when
 there's significant harm or confusion from turning it on by mistake,
 that's a sign that it's a bad preference.  (See also: Use external
 editor.)
 
 Do you think that more than 0.01% of Wikimedia users will enable any
 such preference if provided?
 

World of Warcraft provides RSA cards to their users. People use them. I
think some of the same people that use the SSL secured login would also opt
to use a more secure method of authentication. I think this is especially
the case if we were an OpenID provider, and people used us for this service.

Either way, I'd likely be the person writing this support, and it would be
as an extension, or through another means that wouldn't require much effort.

Respectfully,

Ryan Lane

Re: [Wikitech-l] New password hashing proposal

2010-08-19 Thread Lane, Ryan
 People are also going to keep thinking they're clever by using fuck
 as a password. Remember last time?
 
 http://davidgerard.co.uk/notes/2007/05/07/tubgirl-is-love/
 
 A better password algorithm will at least solve a part of the problem
 that's understood. Anyone who would choose to use SMS would, I
 suspect, have picked a good password in the first place. Can we do
 anything practical for people who can't remember passwords?
 

OpenID as a consumer somewhat helps with this problem, as people are more
likely to use more complex passwords if they have to remember fewer
passwords.

From a practical point of view, beyond enforcing complexity rules, or at
least showing a password strength indicator and encouraging strong
passwords, there isn't much to do.

Respectfully,

Ryan Lane

Re: [Wikitech-l] New password hashing proposal

2010-08-19 Thread Aryeh Gregor
On Thu, Aug 19, 2010 at 5:44 PM, David Gerard dger...@gmail.com wrote:
 People are also going to keep thinking they're clever by using fuck
 as a password. Remember last time?

 http://davidgerard.co.uk/notes/2007/05/07/tubgirl-is-love/

Admins need to be forced to use secure passwords, using some standard
intelligent password checker.  (The default one on RHEL is excellent,
if memory serves.)  Nothing more than secure passwords is needed even
for admins, and regular users should not be encouraged to use
hard-to-remember passwords.  Maybe we could ban the very most common
passwords for regular users, at most.  It wasn't too long ago that we
allowed the empty string as a password.
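
Banning the very most common passwords would be nothing more than a
lookup, e.g. (list and variable name made up):

  $wgForbiddenPasswords = array( 'password', '123456', 'qwerty', 'letmein', 'abc123' );
  if ( in_array( strtolower( $newPassword ), $wgForbiddenPasswords ) ) {
      // reject and ask for something else
  }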

On Thu, Aug 19, 2010 at 5:47 PM, Lane, Ryan
ryan.l...@ocean.navo.navy.mil wrote:
 World of Warcraft provides RSA cards to their users. People use them.

Because they have many thousands of dollars and man-hours invested in
their account.  Hackers who will try to guess their password and sell
the loot are a very credible and damaging threat.  Nothing comparable
is true of Wikipedia.  You have to tailor the security measures to the
real-world threats.

 Either way, I'd likely be the person writing this support, and it would be
 as an extension, or through another means that wouldn't require much effort.

I don't object to people writing whatever extensions interest them.
Personally, I'd be surprised if you'll get Wikimedia sysadmins
interested enough to turn it on, but that's not my decision.

This has strayed rather far from the original topic, though, so maybe
it should split to a separate thread if anyone is interested in
continuing.



Re: [Wikitech-l] New password hashing proposal

2010-08-19 Thread Tim Starling
On 20/08/10 04:18, Jonathan Leybovich wrote:
 Plus I would wager that asymmetric ciphers will stand up to attacks far 
 longer than most hashing functions.

In a past life, I was a PhD student working on a broad military-funded
project which aimed to break all known asymmetric cryptography schemes
using large, expensive machines known as quantum computers. There will
come a point, maybe even this century, when large-block symmetric
ciphers like the WHIRLPOOL compression function will be the only sort
of security we will have left, unless you don't mind the government
being able to read all your messages.

Asymmetric ciphers are the only kind of widely-used cipher that have a
known vulnerability which allows cryptanalysis exponentially faster
than brute force, i.e. in polynomial time and space with respect to
the key length. So I think your faith is misplaced.

-- Tim Starling




Re: [Wikitech-l] New password hashing proposal

2010-08-19 Thread Tim Starling
On 20/08/10 05:55, Aryeh Gregor wrote:
 On Thu, Aug 19, 2010 at 2:37 AM, Tim Starling tstarl...@wikimedia.org wrote:
 The number of WHIRLPOOL iterations is specified in the output string
 as a base-2 logarithm (whimsically padded out to 3 decimal digits to
 allow for future universe-sized computers). This number can be
 upgraded by taking the hash part of the output and applying more
 rounds to it. A count of 2^7 = 128 gives a time of 55ms on my laptop,
 and 12ms on one of our servers, so a reasonable default is probably
 2^6 or 2^7.
 
 That seems reasonable.  It could probably be done a lot faster on GPUs, I 
 guess.

Well, a GPU is fast because it is massively parallel, with hundreds of
cores. Each core is typically slower than a CPU. I chose a function
which is non-parallelisable, so you'd expect computation of a single
hash to be slower on a GPU than on a CPU. But a GPU can calculate
hundreds of them at a time.

My idle fantasy of choosing a parallelisable function and then using
GPUs to accelerate password hashing ended when I found out how much it
would cost to fit out the Wikimedia cluster with half a dozen Tesla
cards. I don't think the powers that be would be particularly
interested in spending that kind of money for a tiny improvement in
security.

[...]
 Another thing to consider is if we could pick a function that's
 particularly inconvenient to execute on GPUs.  Those are a great way
 for crackers to easily outdo any CPU implementation.

I think that would be a more useful way to go than provably secure
hash functions. The relevant Wikipedia article suggests using a
memory bound function, which is sensitive to memory access time, and
gets faster when more memory is available. Personally, I think it
would be interesting to attempt to construct a function which is
limited by branch prediction errors. They are said to be particularly
expensive for GPUs. They also get progressively more expensive for
more recent CPUs, which means that people with old hardware would have
access to more secure hashing than they would otherwise.
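
To make the memory-bound idea concrete, here is a toy (and
cryptographically unvetted) illustration, where $seed and $rounds are
placeholders:

  // Fill a table from the hash, then make each round's input depend on
  // an unpredictable table index, so memory latency rather than raw
  // compression-function speed dominates the cost.
  $table = array();
  $x = $seed;                        // e.g. the B-type hash
  for ( $i = 0; $i < 65536; $i++ ) {
      $x = hash( 'whirlpool', $x, true );
      $table[$i] = $x;
  }
  for ( $i = 0; $i < $rounds; $i++ ) {
      $j = hexdec( bin2hex( substr( $x, 0, 2 ) ) );  // data-dependent index, 0..65535
      $x = hash( 'whirlpool', $x . $table[$j], true );
  }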

-- Tim Starling

