On 25 Jul, 01:07, Ben Finney <[EMAIL PROTECTED]>
wrote:
> Steve Holden <[EMAIL PROTECTED]> writes:
> > [EMAIL PROTECTED] wrote:
> > > On 24 Jul, 05:20, "Gabriel Genellina" <[EMAIL PROTECTED]> wrote:
> > >> On Mon, 23 Jul 2007 16:53:01 -0300, ...:::JA:::...
> > >> <[EMAIL PROTECTED]> wrote:
> > >>> So......how can I do this?????????????
> > >>> I will appreciate any help!!!!!
> > >> Try with a simple example. Let's say you want to convert this:
> > >> [...]
> > > [...] Can you give me some example script of this? Please!!!
>
> > > PS:   THANKS FOR YOUR TIME!!!!!!!!!!
>
> > It's unfortunate that you are having difficulty with two languages
> > simultaneously: your command of English, though impressive, appears
> > to be insufficient for you to explain the problem [...]
>
> And while we're on the topic of communication: The original poster
> would do well to learn that increasing the number of consecutive
> punctuation marks (!!!, ???) is a sure way to turn away many people
> who would otherwise be helpful. Sentences need at most one '!' or '?';
> adding more does not improve the chances of being taken seriously.
>
> --
>  \        "We have to go forth and crush every world view that doesn't |
>   `\             believe in tolerance and free speech."  -- David Brin |
> _o__)                                                                  |
> Ben Finney

Hi,

I was only asking for help because I don't understand the tokenize module
very well.
If you can help me, then please help me; but if you can't, then please
don't leave me messages like that.

 
Regards,
 
Vedran

