> I think this significantly impacts usability. For example, if I have  
> a function
>
>      def foo(char* x):
>          ...
>   
For this specific example, one could hypothetically do something like

def foo(utf8charbuf x):
   ...

(Or more generally encodedcharbuf("utf-8"), which one could then typedef.) 
The caller shouldn't notice the difference.

I.e., such a new type would automatically coerce unicode <-> char* using 
the encoding indicated by the type. It behaves exactly like char* in 
every situation, except that it can be assigned to/from a Python unicode 
object and knows what to do (at compile time). One could even define 
external functions using such a type, and it would be understood that 
the external function takes a char buffer.
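As a rough illustration of the intended semantics only: real Cython would do 
this coercion at compile time, but the behavior can be modeled at runtime in 
plain Python. The Utf8CharBuf class and its methods below are hypothetical, 
invented for this sketch; nothing here is an existing Cython API.

```python
# Hypothetical runtime model of the proposed utf8charbuf coercion.
# In the actual proposal this would be done by the compiler, with no
# wrapper object at runtime.

class Utf8CharBuf:
    """Wraps a byte buffer, coercing to/from unicode via UTF-8."""

    def __init__(self, value):
        # Accept either a unicode string (encode it) or raw bytes,
        # mirroring assignment to/from a Python unicode object.
        if isinstance(value, str):
            self._buf = value.encode("utf-8")
        else:
            self._buf = bytes(value)

    @property
    def as_bytes(self):
        # What a C function declared with char* would see.
        return self._buf

    def as_unicode(self):
        # Coercion back to a Python unicode object.
        return self._buf.decode("utf-8")


def foo(x):
    # Stands in for `def foo(utf8charbuf x)`: the caller passes
    # unicode, the body works with a char buffer.
    buf = Utf8CharBuf(x)
    return buf.as_bytes
```

The point of the sketch is only that both directions of the coercion are 
fully determined by the encoding carried in the type, so the caller never 
has to encode or decode by hand.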

I believe most use cases could be made practical and easy in this manner, 
while keeping what goes on very explicit. But I see that a) being able 
to use the name "char*" will be much more friendly to C users, and b) it 
doesn't help with backwards compatibility.

(The class above could potentially be implemented in a pxd using some of 
the same (hypothetical/planned) features as in my NumPy project 
proposal. Though native Cython support wouldn't hurt either for such a 
feature.)

-- 
Dag Sverre

_______________________________________________
Cython-dev mailing list
[email protected]
http://codespeak.net/mailman/listinfo/cython-dev