On Sunday, 14 September 2008, 23:32 +0900, Dmitry Timoshkov wrote:
> >     ok(ret, "LCMapStringW must succeed\n");
> >     ret2 = LCMapStringW(LOCALE_USER_DEFAULT, LCMAP_SORTKEY,
> > -                       symbols_stripped, -1, buf2, sizeof(buf2));
> > +                       symbols_stripped, -1, buf2, sizeof(buf2)/sizeof(WCHAR));
> >     ok(ret2, "LCMapStringW must succeed\n");
> >     ok(ret == ret2, "lengths of sort keys must be equal\n");
> >     ok(!lstrcmpA(p_buf, p_buf2), "sort keys must be equal\n");
> LCMAP_SORTKEY takes the target buffer size in bytes in both A and W versions.
Do you have any references for this claim? Wine doesn't implement it, MSDN mentions only characters, and there is no Wine API test to prove your statement.

> > -    ret = LCMapStringW(LOCALE_USER_DEFAULT, 0, upper_case, 0, buf, sizeof(buf));
> > +    ret = LCMapStringW(LOCALE_USER_DEFAULT, 0, upper_case, 0, buf, sizeof(buf)/sizeof(WCHAR));
> >      ok(!ret, "LCMapStringW should fail with srclen = 0\n");
> The size of the target buffer doesn't matter at all in this case, since
> the API is supposed to fail due to source length being 0.

Even if the size doesn't matter, this line should still be fixed, as the Wine tests serve as a kind of Win32 API reference by example. IMHO you shouldn't include misleading parameters, such as a size in the wrong unit, in API usage examples.

Regards,
Michael Karcher