Sorry for the late response on this thread.
First, I think the xassert in lisp_data_to_selection_data
(xselect.c) is wrong. Here, obj is generated by Lisp code
that may erroneously produce a multibyte string (as in the
current case). But, in general, an error in Lisp code
should not lead to an abort. So, I propose the change
below. I chose string_make_unibyte instead of
string_as_unibyte to avoid exporting Emacs' internal
leading bytes.
*** xselect.c 12 Feb 2005 09:54:46 +0900 1.148
--- xselect.c 12 Feb 2005 10:39:47 +0900
***************
*** 1908,1914 ****
}
else if (STRINGP (obj))
{
! xassert (! STRING_MULTIBYTE (obj));
if (NILP (type))
type = QSTRING;
*format_ret = 8;
--- 1908,1915 ----
}
else if (STRINGP (obj))
{
! if (STRING_MULTIBYTE (obj))
! obj = string_make_unibyte (obj);
if (NILP (type))
type = QSTRING;
*format_ret = 8;
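The distinction between the two conversions can also be seen
from Lisp. The following is only an illustration of that
difference (it is not part of the patch, and the exact bytes
returned by string-as-unibyte depend on Emacs' internal
encoding):

```elisp
;; Illustration only (not from the patch): for a non-ASCII
;; multibyte string, string-as-unibyte exposes Emacs' internal
;; byte representation (including leading bytes), whereas
;; string-make-unibyte re-encodes the characters themselves.
(let ((s (string-to-multibyte "abc")))
  (multibyte-string-p s)                         ; => t
  (multibyte-string-p (string-make-unibyte s)))  ; => nil
```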
In article <[EMAIL PROTECTED]>, "Jan D." <[EMAIL PROTECTED]> writes:
> ELISP> (setq str (string-to-multibyte <1025 ASCII character string>))
> ...
> ELISP> (multibyte-string-p str)
> t
> ELISP> (multibyte-string-p (encode-coding-string str
>
> 'compound-text-with-extensions))
> t <---- BUG, should be nil
> ELISP> (multibyte-string-p (encode-coding-string str 'utf-8))
> nil
Thank you for finding this bug. As encode-coding-string
should be regarded as an interface between multibyte and
unibyte text, it should always return a unibyte string.
But the NOCOPY argument of encode-coding-string and
encode_coding_string exists to avoid an unnecessary string
allocation when STR is ASCII-only. So, in that case, I'm
going to change the function to mark STR as unibyte
directly (i.e. by calling STRING_SET_UNIBYTE) instead of
calling Fstring_as_unibyte.
What do you think?
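With that change, the transcript above should behave as Jan
expected; a sketch of the intended behavior (an illustration,
not a test run from this message):

```elisp
;; Expected after the proposed fix: encode-coding-string
;; always returns a unibyte string, even for a multibyte
;; but ASCII-only input string.
(multibyte-string-p
 (encode-coding-string (string-to-multibyte "abc")
                       'compound-text-with-extensions))
;; => nil
```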
---
Ken'ichi HANDA
[EMAIL PROTECTED]
_______________________________________________
Emacs-devel mailing list
[email protected]
http://lists.gnu.org/mailman/listinfo/emacs-devel