Imbriale, Donald (Exchange) wrote:
I think you're confusing the DBCS value of the NSYMBOL option with the
DBCS option.
Well, it certainly is confusing. But I tried to make it
clear that what I was saying is that choosing the NATIONAL
value for the NSYMBOL option forces on the DBCS option, and
that still doesn't make any sense. Of course, it is probably
not good practice to have a standalone option with the same
name as a value of another option.
Kind regards,
-Steve Comstock
Don Imbriale
[snip]
NSYMBOL(National) *requires* (forces on) DBCS, so actually
having/allowing the DBCS option is a "pre-requisite" for
having Unicode support.
Ah. Now that is just flat out wrong. The doc says it is
NSYMBOL({NATIONAL|DBCS}) - that is, one or the other.
Ahh, but wait. The same doc, under "Conflicting Compiler Options",
says that NSYMBOL(NATIONAL) forces on the DBCS compiler option.
Now I'm really confused. Why would you set up a choice of
NSYMBOL({NATIONAL|DBCS}) when setting NATIONAL forces on DBCS?
Very nice.
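As I read the doc (and this is just my reading), the two knobs govern
different things: NSYMBOL governs how PIC N items and N"..." literals
are interpreted (national/Unicode versus DBCS), while the DBCS option
governs whether shift codes in the source are scanned as DBCS strings.
A rough sketch for Enterprise COBOL - the program and data names here
are purely illustrative:

       CBL NSYMBOL(NATIONAL)
       IDENTIFICATION DIVISION.
       PROGRAM-ID. NSYMDEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      * Under NSYMBOL(NATIONAL), PIC N is a national (UTF-16) item
      * and N"..." is a national literal - and DBCS is forced on.
      * Under NSYMBOL(DBCS), the same PIC N / N"..." would be DBCS.
       01  WS-NAT          PIC N(5).
       PROCEDURE DIVISION.
           MOVE N"HELLO" TO WS-NAT
           GOBACK.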
There have been some long and "painful" internal discussions
(between me and the IBM ANSI COBOL rep) and within the J4 group
about exactly what is "Standard conforming" behavior when you
have "control characters" within an alphanumeric literal. I won't
go into them here, but I semi-understand the IBM position that
ALLOWING "national" character strings within an alphanumeric
literal is a "good thing", given that you MAY use X"0E"-type
notation *if* you want to have those x'0d' and x'0e' values
within literals.
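In other words (again, my reading): with DBCS in effect, a bare
shift-out byte typed inside an alphanumeric literal starts a DBCS
string, so when you want the byte values themselves you spell them
out as hexadecimal literals. A small sketch - the data names are
made up:

      * Assuming the DBCS option is in effect, a raw X'0E' inside
      * "..." would be scanned as shift-out (start of a DBCS string),
      * so hexadecimal literals keep the intent unambiguous.
       77  SO-BYTE        PIC X   VALUE X"0E".
       77  SI-BYTE        PIC X   VALUE X"0F".
       77  CR-BYTE        PIC X   VALUE X"0D".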
The change in defaults WAS highlighted in announcements, migration
guides, and installation material - but its IMPLICATIONS are
probably unclear to most programmers (application or systems).
Yup.
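Whatever a given installation's default happens to be, one way to
insulate existing code from a default change is to state the option
explicitly on a CBL (PROCESS) statement rather than relying on the
default. A minimal sketch, assuming Enterprise COBOL:

       CBL NSYMBOL(DBCS)
      * or, if national (Unicode) literals are wanted - which in
      * turn forces on the DBCS option:
      * CBL NSYMBOL(NATIONAL)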