I agree that having the XSTR() macro would be a useful way to maximize
performance without requiring much conscious effort. I'll see if we can get
this added before the 1.1.0 release. It will have to be done in the
per-compiler file, each of which will have to define it one way or the
other.

But, if you really need a Unicode string, regardless of platform, you'll
still have to do it the very painful way I do it inside the parser, like:

    const XMLCh aString[] = { chLatin_A, chLatin_B, chLatin_C, chNull };

VERY painful, but required to have portable Unicode strings.

And platforms such as HP, which don't create Unicode code points when they
see an L"" style constant, will have to define it away even though they do
have an L prefix, since the prefix won't create the desired results. They
will just have to take the transcoding hit.
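
Roughly, the per-compiler definition I have in mind would look something
like this. This is only an illustrative sketch; the guard name and the
fallback are hypothetical, not what will actually end up in the headers:

#if defined(XML_NO_NATIVE_UNICODE_LITERALS)   /* illustrative guard name */
  /* Platforms like HP: L"" doesn't give Unicode code points, so define  */
  /* the macro away and let the plain char* string take the transcoding  */
  /* hit at run time (e.g. via the transcoding DOMString constructor).   */
  #define XSTR(str)  str
#else
  /* Compilers whose wide literals really are Unicode code points can    */
  /* paste on the L prefix and avoid any run-time transcoding.           */
  #define XSTR(str)  L##str
#endif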

----------------------------------------
Dean Roddey
Software Weenie
IBM Center for Java Technology - Silicon Valley
[EMAIL PROTECTED]



"Arnold, Curt" <[EMAIL PROTECTED]> on 01/13/2000 03:52:01 PM

Please respond to [EMAIL PROTECTED]

To:   "'[EMAIL PROTECTED]'" <[EMAIL PROTECTED]>
cc:
Subject:  RE: DOMString (was: xalan-c Problem with Xerces initialization)



The DOMString(int) constructor is used in at least AttrImpl::getValue().

I agree it should be changed; I'd recommend replacing its functionality with
a reserve(int) method a la std::basic_string::reserve().
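
Something along these lines, just as a sketch; the method is hypothetical,
not part of the current DOMString interface:

// Hypothetical sketch of a reserve() on DOMString, modeled on
// std::basic_string::reserve().
class DOMString
{
public:
    // Pre-allocate capacity for at least 'size' XMLCh units;
    // the logical length of the string stays unchanged.
    void reserve(unsigned int size);
    // ... existing members ...
};

// A caller such as AttrImpl::getValue() would then do something like:
//     DOMString value;
//     value.reserve(estimatedLength);
//     value.appendData(...);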


-----------

I also ran into some curious code in the following routine.

DOMString AttrImpl::toString()
{
    DOMString retString;

    retString.appendData(name);
    retString.appendData(DOMString("=\""));
    retString.appendData(getValue());
    retString.appendData(DOMString("\""));
    return retString;
}

It would seem to require two unnecessary DOMString constructions and
transcodings.  There was a discussion about adding a _XSTR macro, on most
platforms defined as

#define _XSTR(str) L ## str

to wherever XMLCh is defined.  Wouldn't this be much better as

DOMString AttrImpl::toString()
{
    DOMString retString;

    retString.appendData(name);
    retString.appendData(_XSTR("=\""));
    retString.appendData(getValue());
    retString.appendData(_XSTR("\""));
    return retString;
}

It is just a curious piece of code, and if there is a specific reason it was
done this way, I'd like to be enlightened.

---------------

I've just started trying to identify issues with morphing DOMString into
std::basic_string semantics.  I'm running into a little wacky behavior in
the VC6 IDE build that maybe someone will recognize off the top of their
head.  I'm using the 1_0_1 drop, debug build, on VC 6 SP2.  If I add a file
(test.cpp) to the XercesLib subproject that is simply

#include <string>
std::basic_string<char> test;

and build in the IDE, I get a slew of error messages starting with

d:\msdev6\vc98\include\utility(81) : error C2146: syntax error : missing ';' before identifier 'iterator_category'
        d:\msdev6\vc98\include\utility(84) : see reference to class template instantiation 'std::iterator_traits<_It>' being compiled

If I try to mimic the same configuration with cl on the command line, I get
no compile errors.  And I can build the file in a new project.  My guess is
that some compiler switch needed for the STL is switched off, but I'm
baffled.



