> >                                As a matter of fact I don't see any
> > strong reason to compile 64-bit applications except when I have to
> > address matrices larger than 2GB.
> 
> Occasionally one has to handle 2GB of data, and it's hardly possible
> to know in advance which applications will be used on such amounts of
> data ...
The application developer has to think about it in either case. Once he
thinks, he's perfectly capable of choosing the appropriate OpenSSL option
too. Those who don't think should get "defaulted" to whatever cc with no
options defaults to, namely the 32-bit model.
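
For what it's worth, the choice is a one-liner at build time. A sketch
(solaris64-sparcv9-cc and solaris-sparcv9-cc being the Configure target
names I have in mind):

# explicit 64-bit build, for those who have thought about it
./Configure solaris64-sparcv9-cc

# the "defaulted" case: 32-bit model, same as cc with no options
./Configure solaris-sparcv9-cc
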
> On Solaris 2.6, where 64-bit files can be handled by extra
> functions (open64 etc.), most (all?) free tools fail because they do
> not know about those functions;
It's not the tools' fault, but the developers' :-)
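
For reference, the transitional interfaces that quote refers to look
like this in a 32-bit program (a sketch; current_offset64 is just an
illustrative name, and _LARGEFILE64_SOURCE has to be defined before the
headers are pulled in):

#define _LARGEFILE64_SOURCE
#include <sys/types.h>
#include <unistd.h>
#include <fcntl.h>

/* transitional API: explicit *64 functions and the off64_t type */
off64_t current_offset64(const char *path)
{
    int fd = open64(path, O_RDONLY);         /* open64 instead of open */
    off64_t pos = lseek64(fd, 0, SEEK_CUR);  /* 64-bit offset type */
    close(fd);
    return pos;
}
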
> and surely it does not make for a nice
> API.
'man lfcompile' and see that it's in principle possible to use the
"conventional" API even in a 32-bit environment. Well, of course I keep
my fingers crossed, because it works only as long as developers stick to
off_t, size_t and similar, not to int or long declarations.
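
To illustrate, a minimal sketch of the "conventional" API in a 32-bit
program compiled large-file aware the way that manual page prescribes
(save_position is just an illustrative name):

/* compile with: cc -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 ...
 * (what `getconf LFS_CFLAGS` expands to) */
#include <sys/types.h>
#include <unistd.h>

/* same old lseek, but off_t is now 64 bits wide */
off_t save_position(int fd)
{
    return lseek(fd, 0, SEEK_CUR);  /* survives offsets past 2GB */
}
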
>  Using 64 bits consistently would save many troubles;
Many, but not all. It saves trouble as long as the developer sticks to
at least long. In either case, the following is going to break both
under whatever the lfcompile manual page tells you to do and under
-xarch=v9:

int pos=lseek(fd,0,SEEK_CUR);  /* off_t silently truncated to 32-bit int */
...
lseek(fd,pos,SEEK_SET);        /* garbage offset once the file passes 2GB */

And worst of all, the compiler doesn't even emit a warning! Damn!!!
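
The fix is the same under both models: keep the offset in off_t, which
is wide enough either way. A sketch (rewind_to_mark is a made-up name):

#include <sys/types.h>
#include <unistd.h>

/* remember and restore a file position without truncation */
void rewind_to_mark(int fd)
{
    off_t pos = lseek(fd, 0, SEEK_CUR);  /* off_t, not int */
    /* ... read or write past the mark ... */
    lseek(fd, pos, SEEK_SET);            /* restores the exact offset */
}
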
> but if
> there's a mixture anyway, of course you have to decide on a
> case-by-case basis.
As I said: it's safer to default to whatever [g]cc without options
defaults to.

Andy.