Re: [openssl-users] Re: stateOrProvinceName field problem when signing CSR

2011-12-29 Thread Mick
On Thursday 29 Dec 2011 04:46:26 you wrote:
  From: owner-openssl-us...@openssl.org On Behalf Of Mick
  Sent: Monday, 26 December, 2011 14:01
 
 snip: CA-vs-EE DN string types
 
  I seem to have overcome the original problem.  Now both the
  cacert and signed
  client certificates are formatted in the same way.  I used -policy
  policy_anything to avoid complaints from openssl ca.
  
  Unfortunately the problem of authenticating on the VPN
  gateway remains.  :-(
  
  I would be grateful for some advice, as I am not sure if I am
  following the
  correct steps.  I have created a request for a client certificate:
  
  ==
  
   openssl req -config ./openssl_VPN.cnf -new -newkey rsa:2048 -keyout VPN_test_key.pem -days 1095 -out VPN_test_cert.req
  ==
 
 Aside: for req -new without -x509, -days is ignored and useless.

Ah!  Thanks, I didn't know this.  I thought the CLI options prevailed in any 
case.

  ==
  openssl ca -config ./openssl_VPN.cnf -extensions usr_cert -days 1095 -cert cacert_VPN.pem -keyfile VPN_CA/private/cakey_VPN.pem -policy policy_anything -infiles VPN_test_cert.req
 
 snip
 
  However, trying to verify it brings up some errors:
  ==
  openssl verify -verbose -CAfile cacert_VPN.pem -x509_strict -policy_print -issuer_checks VPN_test_cert.pem
  VPN_test_cert.pem: C = GB, O = Sundial, CN = VPN_test_XPS
  error 29 at 0 depth lookup:subject issuer mismatch
  C = GB, O = Sundial, CN = VPN_test_XPS
  error 29 at 0 depth lookup:subject issuer mismatch
  C = GB, O = Sundial, CN = VPN_test_XPS
  error 29 at 0 depth lookup:subject issuer mismatch
  OK
  ==
 
 -issuer_checks can be misleading; these errors
 are the results of internal tests for a root cert
 (i.e. issued by itself) and thus quite normal.
 Since the final result is OK, OpenSSL is happy.

OK, it's good to know that openssl is happy.


  and the asn1parser fails too:
  ==
  openssl asn1parse -in VPN_test_cert.pem
  Error in encoding
  139747192850088:error:0D07207B:asn1 encoding routines:ASN1_get_object:header too long:asn1_lib.c:150:
  ==
 
 Make sure you asn1parse a file/input containing ONLY
 valid data (here dashed-BEGIN, b64 cert, dashed-END).
 All(?) other openssl PEM functions accept and ignore
 comments or garbage before BEGIN or after END, but
 not asn1parse. And some openssl functions including ca
 PUT such comments. You can avoid editing a copy by:
   awk '/-BEGIN/,/-END/' filewithextra | openssl asn1parse
 on any *nix, and on Windows if you add an awk port.

Just tried this and all certs are parsed correctly.  So clearly the router is 
not choking on an encoding error.


  The cacert does not suffer from such verification or parsing
  errors, but certificates signed by it, do.
 
  The errors that the router authentication shows are:
 snip
 
 But as far as pleasing your router, I have no clue, sorry.

Thank you Dave.  Your comments have been most informative.  I have raised a 
support request with the router manufacturer and wait to see what they come up 
with.
-- 
Regards,
Mick
__
OpenSSL Project                             http://www.openssl.org
User Support Mailing List                   openssl-users@openssl.org
Automated List Manager                      majord...@openssl.org


Re: [openssl-users] Re: stateOrProvinceName field problem when signing CSR

2011-12-29 Thread Mick
On Thursday 29 Dec 2011 10:03:01 Mick wrote:
 On Thursday 29 Dec 2011 04:46:26 you wrote:

  PUT such comments. You can avoid editing a copy by:
awk '/-BEGIN/,/-END/' filewithextra | openssl asn1parse
  
  on any *nix, and on Windows if you add an awk port.
 
 Just tried this and all certs are parsed correctly.  So clearly the router
 is not choking on an encoding error.
 
   The cacert does not suffer from such verification or parsing
   errors, but certificates signed by it, do.
  
   The errors that the router authentication shows are:
  snip
  
  But as far as pleasing your router, I have no clue, sorry.
 
 Thank you Dave.  Your comments have been most informative.  I have raised a
 support request with the router manufacturer and wait to see what they come
 up with.

Just an idea ... could it be that the router is expecting some explicit 
keyUsage extensions on the cacert?  If so, what should I try?
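In case it is useful to anyone searching the archives, here is a sketch of what explicit keyUsage on a self-signed CA cert can look like.  The section and file names below are invented for illustration, not taken from my actual openssl_VPN.cnf:

```shell
# Minimal config; the v3_ca section carries the explicit extensions
# a picky VPN gateway may want to see in the CA certificate.
cat > ca_demo.cnf <<'EOF'
[ req ]
distinguished_name = req_dn
x509_extensions    = v3_ca
[ req_dn ]
[ v3_ca ]
basicConstraints     = critical, CA:TRUE
keyUsage             = critical, keyCertSign, cRLSign
subjectKeyIdentifier = hash
EOF

# Regenerate a self-signed CA certificate carrying those extensions.
openssl req -x509 -config ca_demo.cnf -newkey rsa:2048 -nodes \
    -subj '/C=GB/O=Sundial/CN=Sundial_VPN_CA' \
    -keyout ca_demo_key.pem -out ca_demo_cert.pem -days 1095

# Confirm they made it into the certificate.
openssl x509 -in ca_demo_cert.pem -noout -text | grep -A1 'Key Usage'
```

Whether the router actually requires keyCertSign/cRLSign is of course a question for the manufacturer.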
-- 
Regards,
Mick


RE: [openssl-users] Re: stateOrProvinceName field problem when signing CSR

2011-12-28 Thread Dave Thompson
 From: owner-openssl-us...@openssl.org On Behalf Of Mick
 Sent: Monday, 26 December, 2011 14:01

snip: CA-vs-EE DN string types

 I seem to have overcome the original problem.  Now both the 
 cacert and signed 
 client certificates are formatted in the same way.  I used -policy 
 policy_anything to avoid complaints from openssl ca.
 
 Unfortunately the problem of authenticating on the VPN 
 gateway remains.  :-(
 
 I would be grateful for some advice, as I am not sure if I am 
 following the 
 correct steps.  I have created a request for a client certificate:
 
 ==
 openssl req -config ./openssl_VPN.cnf -new -newkey rsa:2048 -keyout VPN_test_key.pem -days 1095 -out VPN_test_cert.req
 ==
 
Aside: for req -new without -x509, -days is ignored and useless.
 
 Then signed it with the cacert:
 
Nits: it isn't actually the request that's signed, and the new cert isn't 
actually signed with the CA cert (it's signed with the CA's private key), 
but we know what you mean.

 ==
 openssl ca -config ./openssl_VPN.cnf -extensions usr_cert -days 1095 -cert cacert_VPN.pem -keyfile VPN_CA/private/cakey_VPN.pem -policy policy_anything -infiles VPN_test_cert.req 
snip

 However, trying to verify it brings up some errors:
 ==
 openssl verify -verbose -CAfile cacert_VPN.pem -x509_strict -policy_print -issuer_checks VPN_test_cert.pem 
 VPN_test_cert.pem: C = GB, O = Sundial, CN = VPN_test_XPS
 error 29 at 0 depth lookup:subject issuer mismatch
 C = GB, O = Sundial, CN = VPN_test_XPS
 error 29 at 0 depth lookup:subject issuer mismatch
 C = GB, O = Sundial, CN = VPN_test_XPS
 error 29 at 0 depth lookup:subject issuer mismatch
 OK
 ==
 
-issuer_checks can be misleading; these errors 
are the results of internal tests for a root cert 
(i.e. issued by itself) and thus quite normal.
Since the final result is OK, OpenSSL is happy.
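To reproduce this with a throwaway root (file names invented; the error-29 chatter quoted above is from 1.0.x-era verify output, newer releases may print less):

```shell
# Minimal config so req works even without a system openssl.cnf.
cat > verify_demo.cnf <<'EOF'
[ req ]
distinguished_name = req_dn
[ req_dn ]
EOF

# A self-signed certificate acts as its own root...
openssl req -x509 -config verify_demo.cnf -newkey rsa:2048 -nodes \
    -subj '/CN=demo_root' -keyout verify_demo_key.pem \
    -out verify_demo_root.pem -days 1

# ...so -issuer_checks reports its internal self-issued tests
# ("error 29 ... subject issuer mismatch" on 1.0.x-era builds),
# yet the final verdict is still OK.
openssl verify -CAfile verify_demo_root.pem -issuer_checks verify_demo_root.pem
```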

 
 and the asn1parser fails too:
 ==
 openssl asn1parse -in VPN_test_cert.pem 
 Error in encoding
 139747192850088:error:0D07207B:asn1 encoding routines:ASN1_get_object:header too long:asn1_lib.c:150:
 ==
 
Make sure you asn1parse a file/input containing ONLY 
valid data (here dashed-BEGIN, b64 cert, dashed-END).
All(?) other openssl PEM functions accept and ignore 
comments or garbage before BEGIN or after END, but 
not asn1parse. And some openssl functions including ca 
PUT such comments. You can avoid editing a copy by:
  awk '/-BEGIN/,/-END/' filewithextra | openssl asn1parse 
on any *nix, and on Windows if you add an awk port.
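A worked sketch of that pipeline (file names invented).  openssl ca, for instance, writes the human-readable certificate text before the BEGIN line, which is exactly the kind of leading material that trips asn1parse:

```shell
# Minimal config so req works even without a system openssl.cnf.
cat > parse_demo.cnf <<'EOF'
[ req ]
distinguished_name = req_dn
[ req_dn ]
EOF
openssl req -x509 -config parse_demo.cnf -newkey rsa:2048 -nodes \
    -subj '/CN=demo' -keyout parse_demo_key.pem \
    -out parse_demo_cert.pem -days 1

# Simulate a file with commentary before the PEM block,
# much as 'openssl ca' leaves it.
{ echo 'Certificate details and other commentary...'
  cat parse_demo_cert.pem; } > cert_with_extra.pem

# The awk range pattern passes through only BEGIN..END,
# which asn1parse then accepts.
awk '/-BEGIN/,/-END/' cert_with_extra.pem | openssl asn1parse
```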

 The cacert does not suffer from such verification or parsing 
 errors, but 
 certificates signed by it, do.
 
 The errors that the router authentication shows are:
snip

But as far as pleasing your router, I have no clue, sorry.




Re: [openssl-users] Re: stateOrProvinceName field problem when signing CSR

2011-12-26 Thread Mick
On Friday 16 Dec 2011 18:31:01 you wrote:
 On 16/12/2011 18:45, Mick wrote:

  Since I cannot change the router firmware, what should I change the
  'string_mask =  ' on the PC to agree with the router?
 
 My understanding is that string_mask is used when producing an object
 (request or certificate), not when checking its content with the policy
 match directives.
 
 You could either regenerate your CA with string_mask set to default
 (which means: first try PrintableString, then T61String, then
 BMPString). Your router uses PrintableString for pretty much anything
 except commonName, which is encoded in T61String. That could work.

I seem to have overcome the original problem.  Now both the cacert and signed 
client certificates are formatted in the same way.  I used -policy 
policy_anything to avoid complaints from openssl ca.

Unfortunately the problem of authenticating on the VPN gateway remains.  :-(

I would be grateful for some advice, as I am not sure if I am following the 
correct steps.  I have created a request for a client certificate:

==
 openssl req -config ./openssl_VPN.cnf -new -newkey rsa:2048 -keyout VPN_test_key.pem -days 1095 -out VPN_test_cert.req
==


Then signed it with the cacert:

==
openssl ca -config ./openssl_VPN.cnf -extensions usr_cert -days 1095 -cert cacert_VPN.pem -keyfile VPN_CA/private/cakey_VPN.pem -policy policy_anything -infiles VPN_test_cert.req 
Using configuration from ./openssl_VPN.cnf
Enter pass phrase for VPN_CA/private/cakey_VPN.pem:
Check that the request matches the signature
Signature ok
Certificate Details:
Serial Number: 3 (0x3)
Validity
Not Before: Dec 26 18:13:18 2011 GMT
Not After : Dec 25 18:13:18 2014 GMT
Subject:
countryName   = GB
organizationName  = Sundial
commonName= VPN_test_XPS
X509v3 extensions:
X509v3 Basic Constraints: 
CA:FALSE
X509v3 Subject Key Identifier: 
E6:95:82:48:3D:E8:3D:9E:0C:BA:CF:3A:EC:FF:8D:0D:E0:6A:1B:2B
X509v3 Authority Key Identifier: 

keyid:CA:91:0A:ED:F9:B5:F4:F7:60:C5:A7:1C:0B:75:94:5C:33:38:F1:AB

X509v3 Key Usage: 
Digital Signature, Non Repudiation, Key Encipherment
Certificate is to be certified until Dec 25 18:13:18 2014 GMT (1095 days)
Sign the certificate? [y/n]:y


1 out of 1 certificate requests certified, commit? [y/n]y
Write out database with 1 new entries
Certificate:
Data:
Version: 3 (0x2)
Serial Number: 3 (0x3)
Signature Algorithm: sha1WithRSAEncryption
Issuer: C=GB, O=Sundial, CN=Sundial_VPN_CA
Validity
Not Before: Dec 26 18:13:18 2011 GMT
Not After : Dec 25 18:13:18 2014 GMT
Subject: C=GB, O=Sundial, CN=VPN_test_XPS
Subject Public Key Info:
Public Key Algorithm: rsaEncryption
Public-Key: (2048 bit)
Modulus:
00:df:d4:74:bc:de:21:bd:61:99:4c:88:97:0a:43:
3f:c0:40:01:73:b8:41:ce:47:46:fd:14:0f:83:6d:
75:54:bc:73:45:f2:99:24:1e:51:f1:d9:b6:8f:9b:
bf:e5:e5:93:00:79:a8:56:38:04:e2:06:69:5a:1e:
29:16:72:25:5e:bb:1a:2d:e0:82:90:b2:46:78:b5:
8d:e7:ce:6a:f7:9e:f4:6a:30:4e:da:db:09:17:ba:
78:d0:03:c5:22:ad:1d:73:61:82:81:ce:d1:15:1a:
dd:66:76:22:d0:4f:a6:23:13:f1:b7:d0:67:57:28:
e7:bb:25:87:57:04:c6:c3:4f:f1:56:c1:b4:12:05:
7d:3a:9c:14:88:5e:8c:df:49:08:69:2c:00:8a:db:
d6:20:e5:f6:4d:66:38:a3:c9:f5:9d:f4:b8:24:03:
11:67:75:3c:c7:f1:75:35:dc:86:9f:e9:98:04:c7:
ba:8f:64:b8:58:64:49:27:e4:c1:25:0f:00:4e:ad:
7c:14:3b:38:1b:4d:fc:47:de:d4:a4:48:1c:81:89:
20:f5:8e:ad:2b:e2:91:51:c1:db:b3:8f:86:17:fc:
61:49:4e:03:b1:8d:97:2d:06:b4:10:51:20:78:9e:
c2:3d:5f:a8:83:a3:8e:6b:39:64:2a:ac:7a:f7:4e:
31:11
Exponent: 65537 (0x10001)
X509v3 extensions:
X509v3 Basic Constraints: 
CA:FALSE
X509v3 Subject Key Identifier: 
E6:95:82:48:3D:E8:3D:9E:0C:BA:CF:3A:EC:FF:8D:0D:E0:6A:1B:2B
X509v3 Authority Key Identifier: 

keyid:CA:91:0A:ED:F9:B5:F4:F7:60:C5:A7:1C:0B:75:94:5C:33:38:F1:AB

X509v3 Key Usage: 
Digital Signature, Non Repudiation, Key Encipherment
Signature Algorithm: sha1WithRSAEncryption
55:3b:d7:52:91:70:a2:ec:8f:ff:db:ca:1b:8c:b5:73:34:10:
e8:18:3f:4f:5a:f5:75:88:99:86:a6:e8:3d:1b:2d:8c:d2:ae:

Re: [openssl-users] Re: stateOrProvinceName field problem when signing CSR

2011-12-18 Thread Mick
On Friday 16 Dec 2011 18:31:01 you wrote:
 On 16/12/2011 18:45, Mick wrote:
 [...]

  Since I cannot change the router firmware, what should I change the
  'string_mask =  ' on the PC to agree with the router?
 
 My understanding is that string_mask is used when producing an object
 (request or certificate), not when checking its content with the policy
 match directives.

That's fine as far as openssl usage is concerned, but when the standalone 
router compares the client certificate submitted to it, it fails with a 
malformed type error (16).  So, I'm led to believe that I should try creating 
a CA that uses a default string_mask to align that with the way the router 
parses the RDNs and sign both router and client certificates with it.


 You could either regenerate your CA with string_mask set to default
 (which means: first try PrintableString, then T61String, then
 BMPString). Your router uses PrintableString for pretty much anything
 except commonName, which is encoded in T61String. That could work.

Perhaps I am being dense ... but I can't find which section I should be 
specifying this option under in the openssl.cnf file.  I tried placing it 
under [ req ] as well as other sections, and the produced cacert Subject fields 
always get encoded in utf8 (except for Country, which stays as 
PrintableString)  :(

-- 
Regards,
Mick




Re: [openssl-users] Re: stateOrProvinceName field problem when signing CSR

2011-12-18 Thread Mick
On Sunday 18 Dec 2011 18:10:55 Mick wrote:
 On Friday 16 Dec 2011 18:31:01 you wrote:
  On 16/12/2011 18:45, Mick wrote:
  [...]
  
   Since I cannot change the router firmware, what should I change the
   'string_mask =  ' on the PC to agree with the router?
  
  My understanding is that string_mask is used when producing an object
  (request or certificate), not when checking its content with the policy
  match directives.
 
 That's fine as far as openssl usage is concerned, but when the standalone
 router compares the client certificate submitted to it, it fails with a
 malformed type error (16).  So, I'm led to believe that I should try
 creating a CA that uses a default string_mask to align that with the way
 the router parses the RDNs and sign both router and client certificates
 with it.
 
  You could either regenerate your CA with string_mask set to default
  (which means: first try PrintableString, then T61String, then
  BMPString). Your router uses PrintableString for pretty much anything
  except commonName, which is encoded in T61String. That could work.
 
 Perhaps I am being dense ... but I can't find which section I should be
 specifying this option under, in the openssl.cnf file.  I tried placing it
 under [ req ] as well as other sections, and the produced cacert Subject
 fields always get encoded in utf8 (except for Country, which stays as
 PrintableString)  :(

Oops!  Scratch that!  I had forgotten to point it to the correct openssl.cnf 
file!  O_O

OK, I'm almost there ... the only difference now between the router and my PKI 
is the commonName.  The router has T61String while my cacert comes out as 
PrintableString.  How can I change a single RDN?
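For the archives, a minimal sketch of string_mask in action (invented file names).  Note that under the 'default' mask an RDN falls back to T61String automatically when it contains a character outside the PrintableString repertoire, such as the underscore in VPN_test_XPS:

```shell
# Minimal config: string_mask lives under [ req ].
cat > mask_demo.cnf <<'EOF'
[ req ]
distinguished_name = req_dn
string_mask        = default
[ req_dn ]
EOF

# With the 'default' mask, pure-PrintableString RDNs (C and O here)
# come out as PrintableString; the underscore in the CN is outside
# the PrintableString repertoire, so that RDN becomes T61String.
openssl req -new -config mask_demo.cnf -newkey rsa:2048 -nodes \
    -subj '/C=GB/O=Sundial/CN=VPN_test_XPS' \
    -keyout mask_demo_key.pem -out mask_demo_req.pem

# Show the encodings actually chosen.
openssl asn1parse -in mask_demo_req.pem | \
    grep -E 'PRINTABLESTRING|T61STRING|UTF8STRING'
```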
-- 
Regards,
Mick




Re: [openssl-users] Re: stateOrProvinceName field problem when signing CSR

2011-12-18 Thread Mick
On Monday 19 Dec 2011 06:45:13 Mick wrote:
 On Sunday 18 Dec 2011 18:10:55 Mick wrote:
  On Friday 16 Dec 2011 18:31:01 you wrote:
   On 16/12/2011 18:45, Mick wrote:
   [...]
   
Since I cannot change the router firmware, what should I change the
'string_mask =  ' on the PC to agree with the router?
   
   My understanding is that string_mask is used when producing an object
   (request or certificate), not when checking its content with the policy
   match directives.
  
  That's fine as far as openssl usage is concerned, but when the standalone
  router compares the client certificate submitted to it, it fails with a
  malformed type error (16).  So, I'm led to believe that I should try
  creating a CA that uses a default string_mask to align that with the way
  the router parses the RDNs and sign both router and client certificates
  with it.
  
   You could either regenerate your CA with string_mask set to default
   (which means: first try PrintableString, then T61String, then
   BMPString). Your router uses PrintableString for pretty much anything
   except commonName, which is encoded in T61String. That could work.
  
  Perhaps I am being dense ... but I can't find which section I should be
  specifying this option under in the openssl.cnf file.  I tried placing
  it under [ req ] as well as other sections, and the produced cacert
  Subject fields always get encoded in utf8 (except for Country, which
  stays as PrintableString)  :(
 
 Oops!  Scratch that!  I had forgotten to point it to the correct
 openssl.cnf file!  O_O
 
 OK, I'm almost there ... the only difference now between the router and my
 PKI is the commonName.  The router has T61String while my cacert comes out
 as PrintableString.  How can I change a single RDN?

Aha!  Just tried signing the CSR and the commonName is actually created as a 
T61String!

Thank you very much for your help and sorry for the noise.  :-)
-- 
Regards,
Mick




Re: [openssl-users] Re: stateOrProvinceName field problem when signing CSR

2011-12-16 Thread Erwann Abalea

On 16/12/2011 15:07, Jakob Bohm wrote:

I think we may have a bug here; anyone from the core team
wish to comment on this?

The apparent bug:

When enforcing the match policy for a DN part, openssl reports an
error if the CSR has used a different string type for the field, but the
correct value (The naively expected behavior is to realize the strings
are identical and use the configured encoding for the resulting cert).


Do you expect the openssl ca tool to apply the complete X.520 
comparison rules before checking the policy?



3. Validating a certificate whose issuing CA certificate specifies path
constraints where the issued certificate satisfies the path constraint
except for the exact choice of string type.


NameConstraints is a set of constraints imposed on the semantic value of 
the name elements, not on their encoding (string type, double-spacing, 
case differences, etc).




Technical note:  All the defined string types have a well defined
mapping to and from 32-bit Unicode code points, with the following
one-way limitations:

- BMPStrings can only represent U+0000 to U+10FFFF
  (using UTF-16)

- UTF8Strings can only represent U+0000 to U+7FFFFFFF
  (allowing the possibility that some codepoints above U+10FFFF
  may be assigned in the future, contrary to current policy).
  (OpenSSL may or may not accept the CESU-8 and Java
  Modified UTF-8 encoding variants and may or may not normalize
  those to real UTF-8 before further processing).

- PrintableString can only represent a specific small set of Unicode
  code points

- T61String can only represent a different specific small set.


T.61 has no well defined bidirectional mapping with UTF8.
That said, T.61 was withdrawn before 1993 (IIRC) and shouldn't be used.

--
Erwann ABALEA
-
yétiscopique: relating to certain vapours of the Himalayan summits



Re: [openssl-users] Re: stateOrProvinceName field problem when signing CSR

2011-12-16 Thread Jakob Bohm

On 12/16/2011 3:22 PM, Erwann Abalea wrote:

On 16/12/2011 15:07, Jakob Bohm wrote:

I think we may have a bug here; anyone from the core team
wish to comment on this?

The apparent bug:

When enforcing the match policy for a DN part, openssl reports an
error if the CSR has used a different string type for the field, but the
correct value (The naively expected behavior is to realize the strings
are identical and use the configured encoding for the resulting cert).


Do you expect the openssl ca tool to apply the complete X.520 
comparison rules before checking the policy?

Not unless there are OpenSSL functions to do the work.

Otherwise I just expect it to apply the character set conversions it uses
for its other operations (such as reading the config file / displaying DNs).




3. Validating a certificate whose issuing CA certificate specifies path
constraints where the issued certificate satisfies the path constraint
except for the exact choice of string type.


NameConstraints is a set of constraints imposed on the semantic value 
of the name elements, not on their encoding (string type, 
double-spacing, case differences, etc).

The question was how the OpenSSL code (library and command line) handle
the scenario, your answer seems to indicate that it is indeed supposed to
compare the semantic character sequence, not the encoding.




Technical note:  All the defined string types have a well defined
mapping to and from 32-bit Unicode code points, with the following
one-way limitations:

- BMPStrings can only represent U+0000 to U+10FFFF
  (using UTF-16)

- UTF8Strings can only represent U+0000 to U+7FFFFFFF
  (allowing the possibility that some codepoints above U+10FFFF
  may be assigned in the future, contrary to current policy).
  (OpenSSL may or may not accept the CESU-8 and Java
  Modified UTF-8 encoding variants and may or may not normalize
  those to real UTF-8 before further processing).

- PrintableString can only represent a specific small set of Unicode
  code points

- T61String can only represent a different specific small set.


T.61 has no well defined bidirectional mapping with UTF8.
That said, T.61 was withdrawn before 1993 (IIRC) and shouldn't be used.


According to RFC1345, T.61 has a well defined mapping to named
characters also found in UNICODE.  Some of those are encoded
as two bytes in T.61 (using a modifier+base char scheme), the
rest as one byte.  That is what I mean by a bidirectional mapping
to a small (sub)set of UNICODE, meaning that most UNICODE
code points cannot be mapped to T.61, but the rest have a
bidirectional mapping.

According to the same source, 7-bit T.61 appears to be a proper
subset of 8-bit T.61.

Constructing a mapping table from the data in RFC1345 or other
sources is left as an exercise for the reader (cheat hint: Maybe
IBM included such a table in ICU or unicode.org included one in
its data files).



Re: [openssl-users] Re: stateOrProvinceName field problem when signing CSR

2011-12-16 Thread Erwann Abalea

man req
Then look for the -utf8 argument.

I took your example below, added -utf8 argument, and it worked.
You can display the content with openssl req -text -noout -in 
blabla.pem -nameopt multiline,utf8,-esc_msb


On 16/12/2011 16:33, Lou Picciano wrote:


openssl req -new -sha1 -nodes \
-nameopt multiline,show_type \
-keyout private/THORSTROM.key \
-out csrs/THORSTROM.csr \
-subj /O=ESBJÖRN.com/OU=Esbjörn-Thörstrom Group/CN=Áki Thörstrom


--
Erwann ABALEA
-
vésicosufflochromateur: greater than 0.5 grams



Re: [openssl-users] Re: stateOrProvinceName field problem when signing CSR

2011-12-16 Thread Mick
On Friday 16 Dec 2011 16:23:52 you wrote:
 man req
 Then look for the -utf8 argument.
 
 I took your example below, added -utf8 argument, and it worked.
 You can display the content with openssl req -text -noout -in
 blabla.pem -nameopt multiline,utf8,-esc_msb

Would using -utf8 resolve the original OP problem?  

-- 
Regards,
Mick




Re: [openssl-users] Re: stateOrProvinceName field problem when signing CSR

2011-12-16 Thread Erwann Abalea

On 16/12/2011 17:57, Mick wrote:

On Friday 16 Dec 2011 16:23:52 you wrote:

man req
Then look for the -utf8 argument.

I took your example below, added -utf8 argument, and it worked.
You can display the content with openssl req -text -noout -in
blabla.pem -nameopt multiline,utf8,-esc_msb

Would using -utf8 resolve the original OP problem?


To create the request/certificate, yes.
This is what I do to embed accented characters in UTF8.

Typing

openssl req -utf8 -new -nodes -newkey rsa:512 -keyout THORSTROM.key -out 
THORSTROM.csr -subj /O=ESBJÖRN.com/OU=Esbjörn-Thörstrom Group/CN=Áki 
Thörstrom


on an UTF8 capable terminal, with a string_mask = utf8only in the 
right openssl.cnf file, gives me a certificate request correctly encoded 
in UTF8 with the wanted characters in the DN.


--
Erwann ABALEA
-
minilactopotage: intense satisfaction



Re: [openssl-users] Re: stateOrProvinceName field problem when signing CSR

2011-12-16 Thread Jakob Bohm

On 12/16/2011 6:14 PM, Erwann Abalea wrote:

On 16/12/2011 17:57, Mick wrote:

On Friday 16 Dec 2011 16:23:52 you wrote:

man req
Then look for the -utf8 argument.

I took your example below, added -utf8 argument, and it worked.
You can display the content with openssl req -text -noout -in
blabla.pem -nameopt multiline,utf8,-esc_msb

Would using -utf8 resolve the original OP problem?


To create the request/certificate, yes.
This is what I do to embed accented characters in UTF8.

Typing

openssl req -utf8 -new -nodes -newkey rsa:512 -keyout THORSTROM.key 
-out THORSTROM.csr -subj /O=ESBJÖRN.com/OU=Esbjörn-Thörstrom 
Group/CN=Áki Thörstrom


on an UTF8 capable terminal, with a string_mask = utf8only in the 
right openssl.cnf file, gives me a certificate request correctly 
encoded in UTF8 with the wanted characters in the DN.

Sorry, but OP's problem seems to be that the CSR was created by some
software embedded in a router, which presumably would not allow him
to set OpenSSL command line options, OpenSSL config file options or
even the terminal type, even if the software in the router happened to
be built around OpenSSL.

The OP's problem is that the OpenSSL ca command is being overly strict in
its handling of policy constraints on DN name components, rejecting
alternative encodings of the same name with a meaningless error
message ("foo" does not match "foo") rather than accepting those.



Re: [openssl-users] Re: stateOrProvinceName field problem when signing CSR

2011-12-16 Thread Mick
On Friday 16 Dec 2011 17:27:42 you wrote:
 On 12/16/2011 6:14 PM, Erwann Abalea wrote:
  On 16/12/2011 17:57, Mick wrote:
  On Friday 16 Dec 2011 16:23:52 you wrote:
  man req
  Then look for the -utf8 argument.
  
  I took your example below, added -utf8 argument, and it worked.
  You can display the content with openssl req -text -noout -in
  blabla.pem -nameopt multiline,utf8,-esc_msb
  
  Would using -utf8 resolve the original OP problem?
  
  To create the request/certificate, yes.
  This is what I do to embed accented characters in UTF8.
  
  Typing
  
  openssl req -utf8 -new -nodes -newkey rsa:512 -keyout THORSTROM.key
  -out THORSTROM.csr -subj /O=ESBJÖRN.com/OU=Esbjörn-Thörstrom
  Group/CN=Áki Thörstrom
  
  on an UTF8 capable terminal, with a string_mask = utf8only in the
  right openssl.cnf file, gives me a certificate request correctly
  encoded in UTF8 with the wanted characters in the DN.
 
 Sorry, but OP's problem seems to be that the CSR was created by some
 software embedded in a router, which presumably would not allow him
 to set OpenSSL command line options, OpenSSL config file options or
 even the terminal type, even if the software in the router happened to
 be built around OpenSSL.
 
 The OP's problem is that the OpenSSL ca command is being overly strict in
 its handling of policy constraints on DN name components, rejecting
 alternative encodings of the same name with a meaningless error
 message ("foo" does not match "foo") rather than accepting those.

Indeed, the message was rather esoteric and it did not offer a way out - e.g. 
it could have advised changing 'match' to 'supplied' in openssl.cnf, or 
ensuring that the encoding between the CSR and ca is the same.

I think what confused me is that by uploading the cacert to the router I would 
expect the router to respect the cacert's encodings.  It evidently did not.
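For anyone comparing encodings by hand, -nameopt show_type prints the ASN.1 string type next to each RDN, which makes mismatches between two certs easy to spot.  A self-contained sketch (file names invented):

```shell
# Minimal config so req works even without a system openssl.cnf.
cat > cmp_demo.cnf <<'EOF'
[ req ]
distinguished_name = req_dn
[ req_dn ]
EOF
openssl req -x509 -config cmp_demo.cnf -newkey rsa:2048 -nodes \
    -subj '/C=GB/O=Sundial/CN=VPN_test_XPS' \
    -keyout cmp_demo_key.pem -out cmp_demo_cert.pem -days 1

# Prints each subject RDN tagged with its string type, e.g.
#   countryName = PRINTABLESTRING:GB
# Run the same command against the router's cert to compare.
openssl x509 -in cmp_demo_cert.pem -noout -subject \
    -nameopt multiline,show_type
```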

Since I cannot change the router firmware, what should I change 
'string_mask =  ' to on the PC to agree with the router?
-- 
Regards,
Mick




Re: [openssl-users] Re: stateOrProvinceName field problem when signing CSR

2011-12-16 Thread Erwann Abalea

On 16/12/2011 16:29, Jakob Bohm wrote:

On 12/16/2011 3:22 PM, Erwann Abalea wrote:

On 16/12/2011 15:07, Jakob Bohm wrote:

I think we may have a bug here, anyone from the core team
wish to comment on this.

The apparent bug:

When enforcing the match policy for a DN part, openssl reports an
error if the CSR has used a different string type for the field, but the
correct value (the naively expected behavior is to realize the strings
are identical and use the configured encoding for the resulting cert).

Do you expect the openssl ca tool to apply the complete X.520 
comparison rules before checking the policy?

Not unless there are OpenSSL functions to do the work.

Otherwise I just expect it to apply the character set conversions it uses
for its other operations (such as reading the config file / displaying DNs).


Fair.
I personally use the openssl command line tools to have a quick CA, not 
a full-featured one.  The API is complete and allows one to code this.

But you're right, it would be fair to have consistent behaviour.


3. Validating a certificate whose issuing CA certificate specifies path
constraints where the issued certificate satisfies the path constraint
except for the exact choice of string type.


NameConstraints is a set of constraints imposed on the semantic value 
of the name elements, not on their encoding (string type, 
double-spacing, case differences, etc).

The question was how the OpenSSL code (library and command line) handle
the scenario, your answer seems to indicate that it is indeed supposed to
compare the semantic character sequence, not the encoding.


That's what X.509 and X.520 impose. An algorithm is described in X.520 
for name comparisons.



T.61 has no well defined bidirectional mapping with UTF8.
That said, T.61 was withdrawn before 1993 (IIRC) and shouldn't be used.


According to RFC1345, T.61 has a well defined mapping to named
characters also found in UNICODE.  Some of those are encoded
as two bytes in T.61 (using a modifier+base char scheme), the
rest as one byte.  That is what I mean by a bidirectional mapping
to a small (sub)set of UNICODE, meaning that most UNICODE
code points cannot be mapped to T.61, but the rest have a
bidirectional mapping.


I'm not finished with the reading of T.61 (1988 edition), but here's 
what I found:
 - 0xA6 is the '#' character, 0xA8 is the '¤' character (generic 
currency), but those characters can also be obtained with 0x23 and 0x24, 
respectively (Figure 2, note 4). Later in the same document, 0x23 and 
0x24 are declared as not used. This is both ambiguous and not 
bidirectional.

 - 0x7F and 0xFF are not defined, and are not defined as not used.
 - 0xC9 was the umlaut diacritical mark in the 1980 edition, which is 
still tolerated in the 1988 edition, but the tables don't clearly define 
0xC9 (and again, don't define it as not used). 0xC8 is declared as 
"diaeresis or umlaut mark". As I don't have the 1980 edition, I don't 
know if it was already the case.
 - nothing is said if a diacritical mark is encoded without a base 
character.


These are ambiguities.

Annexes define control sequences (longer than 2 bytes), graphical 
characters, configurable character sets, presentation functions 
(selection of page format, character sizes and attributes 
(bold/italic/underline), line settings (vertical and horizontal 
spacing)). I doubt everything can be mapped to UTF8.



Constructing a mapping table from the data in RFC1345 or other
sources is left as an exercise for the reader (cheat hint: Maybe
IBM included such a table in ICU or unicode.org included one in
its data files).


I think only a subset of T.61 is taken into consideration. But I haven't 
looked at the hinted files.


--
Erwann ABALEA
-
First of all, we're on the web, not on that usenet that keeps being drummed
into our ears and which is only an abstraction.
-+- JP in http://neuneu.ctw.cc - Neuneu en abstract mode -+-

__
OpenSSL Project                         http://www.openssl.org
User Support Mailing List               openssl-users@openssl.org
Automated List Manager                  majord...@openssl.org


Re: [openssl-users] Re: stateOrProvinceName field problem when signing CSR

2011-12-16 Thread Lou Picciano
OK, Jakob - will try this. Thanks for the feedback. (Seems we'd tried the 'utf8' 
option inline already, but will try again.) And my 'read' of the -nameopt 
multiline config was that utf8 would be included, in the absence of its specific 
de-activation, such as with '-utf8'. 

Lou Picciano 

- Original Message -
From: Jakob Bohm jb-open...@wisemo.com 
To: openssl-users@openssl.org 
Sent: Friday, December 16, 2011 12:27:42 PM 
Subject: Re: [openssl-users] Re: stateOrProvinceName field problem when signing 
CSR 

On 12/16/2011 6:14 PM, Erwann Abalea wrote: 
 On 16/12/2011 17:57, Mick wrote: 
 On Friday 16 Dec 2011 16:23:52 you wrote: 
 man req 
 Then look for the -utf8 argument. 
 
 I took your example below, added -utf8 argument, and it worked. 
 You can display the content with openssl req -text -noout -in 
 blabla.pem -nameopt multiline,utf8,-esc_msb 
 Would using -utf8 resolve the original OP problem? 
 
 To create the request/certificate, yes. 
 This is what I do to embed accented characters in UTF8. 
 
 Typing 
 
 openssl req -utf8 -new -nodes -newkey rsa:512 -keyout THORSTROM.key 
 -out THORSTROM.csr -subj /O=ESBJÖRN.com/OU=Esbjörn-Thörstrom 
 Group/CN=Áki Thörstrom 
 
 on an UTF8 capable terminal, with a string_mask = utf8only in the 
 right openssl.cnf file, gives me a certificate request correctly 
 encoded in UTF8 with the wanted characters in the DN. 
Sorry, but OP's problem seems to be that the CSR was created by some 
software embedded in a router, which presumably would not allow him 
to set OpenSSL command line options, OpenSSL config file options or 
even the terminal type, even if the software in the router happened to 
be built around OpenSSL. 

The OP's problem is that the OpenSSL ca command is being overly strict in 
its handling of policy constraints on DN name components, rejecting 
alternative encodings of the same name with a meaningless error 
message ("foo does not match foo") rather than accepting them. 
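[Editorial sketch] The "foo does not match foo" symptom comes down to comparing DER bytes rather than decoded characters: the same text carried as a PrintableString (tag 0x13) and as a T61String (tag 0x14) differs in its first byte. A minimal illustration, hand-building the two TLVs:

```python
# Minimal DER TLV for two universal string types carrying the same text.
# PrintableString has tag 0x13, T61String has tag 0x14.  A byte-wise
# comparison (roughly what the strict policy check does) sees them as
# different even though the character data is identical -- hence the
# baffling "foo does not match foo" error.
def der_string(tag: int, text: str) -> bytes:
    body = text.encode('ascii')
    assert len(body) < 128      # short-form length only, for this sketch
    return bytes([tag, len(body)]) + body

printable = der_string(0x13, 'foo')   # b'\x13\x03foo'
t61       = der_string(0x14, 'foo')   # b'\x14\x03foo'

print(printable != t61)               # True: encodings differ
print(printable[2:] == t61[2:])       # True: same characters
```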



Re: [openssl-users] Re: stateOrProvinceName field problem when signing CSR

2011-12-16 Thread Erwann Abalea

On 16/12/2011 18:27, Jakob Bohm wrote:

On 12/16/2011 6:14 PM, Erwann Abalea wrote:

On 16/12/2011 17:57, Mick wrote:

On Friday 16 Dec 2011 16:23:52 you wrote:

man req
Then look for the -utf8 argument.

I took your example below, added -utf8 argument, and it worked.
You can display the content with openssl req -text -noout -in
blabla.pem -nameopt multiline,utf8,-esc_msb

Would using -utf8 resolve the original OP problem?


To create the request/certificate, yes.
This is what I do to embed accented characters in UTF8.

Typing

openssl req -utf8 -new -nodes -newkey rsa:512 -keyout THORSTROM.key 
-out THORSTROM.csr -subj /O=ESBJÖRN.com/OU=Esbjörn-Thörstrom 
Group/CN=Áki Thörstrom


on an UTF8 capable terminal, with a string_mask = utf8only in the 
right openssl.cnf file, gives me a certificate request correctly 
encoded in UTF8 with the wanted characters in the DN.

Sorry, but OP's problem seems to be that the CSR was created by some
software embedded in a router, 


Sorry, I replied to the problem described by Lou Picciano, and forgot 
that Mick was the OP. My fault.


--
Erwann ABALEA



Re: [openssl-users] Re: stateOrProvinceName field problem when signing CSR

2011-12-16 Thread Jakob Bohm

On 12/16/2011 6:47 PM, Erwann Abalea wrote:

On 16/12/2011 16:29, Jakob Bohm wrote:

On 12/16/2011 3:22 PM, Erwann Abalea wrote:

On 16/12/2011 15:07, Jakob Bohm wrote:

I think we may have a bug here, anyone from the core team
wish to comment on this.

The apparent bug:

When enforcing the match policy for a DN part, openssl reports an
error if the CSR has used a different string type for the field, but the
correct value (the naively expected behavior is to realize the strings
are identical and use the configured encoding for the resulting cert).


Do you expect the openssl ca tool to apply the complete X.520 
comparison rules before checking the policy?

Not unless there are OpenSSL functions to do the work.

Otherwise I just expect it to apply the character set conversions it 
uses for its

other operations (such as reading the config file/displaying DNs).


Fair.
I personally use the openssl command line tools to have a quick CA, 
not a full-featured one. The API is complete and allows one to code this.

But you're right, it would be fair to have consistent behaviour.

3. Validating a certificate whose issuing CA certificate specifies path
constraints where the issued certificate satisfies the path constraint
except for the exact choice of string type.


NameConstraints is a set of constraints imposed on the semantic 
value of the name elements, not on their encoding (string type, 
double-spacing, case differences, etc).

The question was how the OpenSSL code (library and command line) handles
the scenario; your answer seems to indicate that it is indeed supposed to
compare the semantic character sequence, not the encoding.


That's what X.509 and X.520 impose. An algorithm is described in X.520 
for name comparisons.

I understand, but does OpenSSL implement that?



T.61 has no well defined bidirectional mapping with UTF8.
That said, T.61 was withdrawn before 1993 (IIRC) and shouldn't be used.


According to RFC1345, T.61 has a well defined mapping to named
characters also found in UNICODE.  Some of those are encoded
as two bytes in T.61 (using a modifier+base char scheme), the
rest as one byte.  That is what I mean by a bidirectional mapping
to a small (sub)set of UNICODE, meaning that most UNICODE
code points cannot be mapped to T.61, but the rest have a
bidirectional mapping.


I'm not finished with the reading of T.61 (1988 edition), but here's 
what I found:
 - 0xA6 is the '#' character, 0xA8 is the '¤' character (generic 
currency), but those characters can also be obtained with 0x23 and 
0x24, respectively (Figure 2, note 4). Later in the same document, 
0x23 and 0x24 are declared as not used. This is both ambiguous and 
not bidirectional.

As you quote it (I don't have a copy), this sounds like using 0x23
and 0x24 should not be done when encoding, but should be accepted
when decoding.

 - 0x7F and 0xFF are not defined, and are not defined as not used.

RFC1345 seems to indicate that 0x7F maps to U+007F DEL
 - 0xC9 was the umlaut diacritical mark in the 1980 edition, which is 
still tolerated in the 1988 edition, but the tables don't clearly 
define 0xC9 (and again, don't declare it as "not used"). 0xC8 is 
declared as "diaeresis or umlaut mark". As I don't have the 1980 
edition, I don't know if it was already the case.
 - nothing is said if a diacritical mark is encoded without a base 
character.

RFC1345 seems to indicate that certain diacritical marks must always be
followed by a base character (which may be 0x20 space), the others never.
This is consistent with the mechanical behavior of mechanical teletypes
and typewriters: Diacritics are implemented as overtyping dead keys
that place the diacritic on the paper but do not advance the print head,
thus causing the next character to be combined with it.


These are ambiguities.

Annexes define control sequences (longer than 2 bytes), graphical 
characters, configurable character sets, presentation functions 
(selection of page format, character sizes and attributes 
(bold/italic/underline), line settings (vertical and horizontal 
spacing)). I doubt everything can be mapped to UTF8.
Most of those would be inapplicable to the encoding of X.500 strings; 
"configurable character sets" sounds like an ISO-2022-like mechanism 
useful for encoding an even larger subset of UNICODE, as do "graphical 
characters".

However, none of those features were mentioned in the still-available 
secondary references I looked at (RFC1345 and Wikipedia), so they are 
unlikely to be accepted or emitted by any current implementation of T.61.



Constructing a mapping table from the data in RFC1345 or other
sources is left as an exercise for the reader (cheat hint: Maybe
IBM included such a table in ICU or unicode.org included one in
its data files).


I think only a subset of T.61 is taken into consideration. But I 
haven't looked at the hinted files.



Sounds like it.  RFC1345 is a historic listing of character sets encountered
on the European part of the early 

Re: [openssl-users] Re: stateOrProvinceName field problem when signing CSR

2011-12-16 Thread Erwann Abalea

On 16/12/2011 18:45, Mick wrote:
[...]

Indeed, the message was rather esoteric and it did not offer a way out - e.g.
it could have advised changing 'match' to 'supplied' in openssl.cnf, or
ensuring that the encoding between the CSR and ca is the same.

I think what confused me is that by uploading the cacert to the router I would
expect the router to respect the cacert's encodings.  It evidently did not.


It doesn't need to :)


Since I cannot change the router firmware, what should I change
'string_mask =  ' to on the PC to agree with the router?


My understanding is that string_mask is used when producing an object 
(request or certificate), not when checking its content with the policy 
match directives.


You could either regenerate your CA with string_mask set to default 
(which means: first try PrintableString, then T61String, then 
BMPString). Your router uses PrintableString for pretty much anything 
except commonName, which is encoded in T61String. That could work.
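[Editorial sketch] The 'default' string_mask order described above (PrintableString if the value fits, else T61String, else BMPString) can be sketched roughly as follows. The PrintableString alphabet here is the standard one; the T61String test is simplified to "fits in one byte", which is an assumption, not OpenSSL's exact logic:

```python
import string

# Characters allowed in an ASN.1 PrintableString.
PRINTABLE = set(string.ascii_letters + string.digits + " '()+,-./:=?")

def choose_string_type(value: str) -> str:
    # Mimics the 'default' string_mask order: PrintableString when the
    # value fits, else T61String, else BMPString (rough sketch only).
    if all(c in PRINTABLE for c in value):
        return 'PrintableString'
    if all(ord(c) < 0x100 for c in value):   # crude one-byte test for T.61
        return 'T61String'
    return 'BMPString'

print(choose_string_type('GB'))         # PrintableString
print(choose_string_type('Esbjörn'))    # T61String
```

This also shows why the router and the PC can disagree: each side runs its own version of this selection, so identical text can end up under different tags.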


Or you could keep your CA certificate as is, change the policy 
directives (from match to supplied), and manually check the requests.


Or code something with openssl req -text -nameopt 
multiline,utf8,-esc_msb ..., extracting the RDNs, comparing with what 
is set in the CA certificate (the -nameopt ... argument will convert 
everything into UTF8, easing the comparison), thus performing your own 
validation.
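[Editorial sketch] Once both values have been extracted as UTF-8 (e.g. via the -nameopt options above), the comparison itself can be made closer to X.520's case-ignore matching: normalize Unicode composition, fold case, and collapse whitespace. The function name is illustrative, not from any library:

```python
import unicodedata

def dn_fields_match(a: str, b: str) -> bool:
    # Rough approximation of X.520 caseIgnoreMatch on UTF-8 values:
    # NFC-normalize, fold case, collapse internal/leading/trailing spaces.
    norm = lambda s: ' '.join(unicodedata.normalize('NFC', s).casefold().split())
    return norm(a) == norm(b)

# Combining-mark form vs precomposed form, different case, stray spaces:
print(dn_fields_match('Esbjo\u0308rn', '  esbjörn '))   # True
```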


--
Erwann ABALEA



Re: [openssl-users] Re: stateOrProvinceName field problem when signing CSR

2011-12-16 Thread Erwann Abalea

On 16/12/2011 19:07, Jakob Bohm wrote:

On 12/16/2011 6:47 PM, Erwann Abalea wrote:

On 16/12/2011 16:29, Jakob Bohm wrote:

On 12/16/2011 3:22 PM, Erwann Abalea wrote:
NameConstraints is a set of constraints imposed on the semantic 
value of the name elements, not on their encoding (string type, 
double-spacing, case differences, etc).

The question was how the OpenSSL code (library and command line) handles
the scenario; your answer seems to indicate that it is indeed supposed to
compare the semantic character sequence, not the encoding.


That's what X.509 and X.520 impose. An algorithm is described in 
X.520 for name comparisons.

I understand, but does OpenSSL implement that?


In the API, yes. At least in 1.0.0 branch, which passes the NIST PKITS 
suite.


I'm not finished with the reading of T.61 (1988 edition), but here's 
what I found:
 - 0xA6 is the '#' character, 0xA8 is the '¤' character (generic 
currency), but those characters can also be obtained with 0x23 and 
0x24, respectively (Figure 2, note 4). Later in the same document, 
0x23 and 0x24 are declared as not used. This is both ambiguous and 
not bidirectional.

As you quote it (I don't have a copy), this sounds like using 0x23
and 0x24 should not be done when encoding, but should be accepted
when decoding.


Yes, and that also means those characters cannot be obtained with 7-bit 
T.61, contrary to the table found in RFC1345.
In fact, there's no 7-bit T.61 set, so I don't really know how RFC1345 
should be treated.



 - 0x7F and 0xFF are not defined, and are not defined as not used.

RFC1345 seems to indicate that 0x7F maps to U+007F DEL


This mapping (0x7F → DEL) is only presented in Annex E, discussing the 
Greek primary character set. But Table 2, which exhaustively lists 
the codes, avoids 0x7F (07/15, really).


Some PKI toolkits use T.61 to encode ISO8859-1 characters, and ISO8859-1 
defines 0x7F as DEL.
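[Editorial sketch] That pragmatic (non-standard) toolkit behaviour - stuffing ISO 8859-1 bytes into a T61String - means decoding such a value as Latin-1 recovers the intended text, even though real T.61 would encode it differently:

```python
# Many toolkits simply put ISO 8859-1 bytes inside a "T61String".
# Decoding as Latin-1 recovers the intended text; genuine T.61 would
# instead encode 'ö' as the umlaut mark 0xC8 followed by the base 'o'.
raw = b'Esbj\xf6rn'            # ISO 8859-1 bytes mislabelled as T61String
print(raw.decode('latin-1'))   # Esbjörn
```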


Annexes define control sequences (longer than 2 bytes), graphical 
characters, configurable character sets, presentation functions 
(selection of page format, character sizes and attributes 
(bold/italic/underline), line settings (vertical and horizontal 
spacing)). I doubt everything can be mapped to UTF8.

Most of those would be inapplicable to the encoding of X.500 strings; 
"configurable character sets" sounds like an ISO-2022-like mechanism 
useful for encoding an even larger subset of UNICODE, as do "graphical 
characters".

However, none of those features were mentioned in the still-available 
secondary references I looked at (RFC1345 and Wikipedia), so they are 
unlikely to be accepted or emitted by any current implementation of T.61.


Sure. But those are valid T.61 sequences anyway.

As you said, RFC1345 lists historic character sets, and T.61 is one of 
them (it predates Unicode).
T.61 was ambiguous, was defined for a now obsolete system (Teletex), was 
more than a simple character set (you could redefine graphical 
characters, and specify formatting), and has now been withdrawn for 
nearly two decades. It's time to avoid it :)


--
Erwann ABALEA



Re: [openssl-users] Re: stateOrProvinceName field problem when signing CSR

2011-12-16 Thread Lou Picciano
Yes, and Thank You both for doing so! 

While we're at it, I am reminded of another one we've found - not terribly 
important, but worth a look: 

In using this option: '-enddate 140615235959Z' when signing a CSR, the cert is 
created correctly, expiring in 2014. However, the user prompt indicates it 
expires in '365 days' - in fact, I've never seen it prompt with any number 
larger than 365 days! 

Not a huge problem, but... 

Lou Picciano 

- Original Message -
From: Erwann Abalea erwann.aba...@keynectis.com 
To: openssl-users@openssl.org 
Cc: Jakob Bohm jb-open...@wisemo.com 
Sent: Friday, December 16, 2011 1:04:49 PM 
Subject: Re: [openssl-users] Re: stateOrProvinceName field problem when signing 
CSR 

On 16/12/2011 18:27, Jakob Bohm wrote: 
 On 12/16/2011 6:14 PM, Erwann Abalea wrote: 
 On 16/12/2011 17:57, Mick wrote: 
 On Friday 16 Dec 2011 16:23:52 you wrote: 
 man req 
 Then look for the -utf8 argument. 
 
 I took your example below, added -utf8 argument, and it worked. 
 You can display the content with openssl req -text -noout -in 
 blabla.pem -nameopt multiline,utf8,-esc_msb 
 Would using -utf8 resolve the original OP problem? 
 
 To create the request/certificate, yes. 
 This is what I do to embed accented characters in UTF8. 
 
 Typing 
 
 openssl req -utf8 -new -nodes -newkey rsa:512 -keyout THORSTROM.key 
 -out THORSTROM.csr -subj /O=ESBJÖRN.com/OU=Esbjörn-Thörstrom 
 Group/CN=Áki Thörstrom 
 
 on an UTF8 capable terminal, with a string_mask = utf8only in the 
 right openssl.cnf file, gives me a certificate request correctly 
 encoded in UTF8 with the wanted characters in the DN. 
 Sorry, but OP's problem seems to be that the CSR was created by some 
 software embedded in a router, 

Sorry, I replied to the problem described by Lou Picciano, and forgot 
that Mick was the OP. My fault. 

-- 
Erwann ABALEA 
