On Thu, Nov 13, 2008 at 2:22 PM, dpapathanasiou <[EMAIL PROTECTED]> wrote:
> I have some old Common Lisp functions I'd like to rewrite in Python
> (I'm still new to Python), and one thing I miss is not having to
> declare local variables.
>
> For example, I have this Lisp function:
>
> (defun random-char ()
>   "Generate a random char from one of [0-9][a-z][A-Z]"
>   (if (< 50 (random 100))
>       (code-char (+ (random 10) 48))  ; ascii 48 = 0
>       (code-char (+ (random 26) (if (< 50 (random 100)) 65 97)))))  ; ascii 65 = A, 97 = a
>
> My Python version looks like this:
>
> def random_char():
>     '''Generate a random char from one of [0-9][a-z][A-Z]'''
>     if random.randrange(0, 100) > 50:
>         return chr(random.randrange(0, 10) + 48)  # ascii 48 = 0
>     else:
>         offset = 65  # ascii 65 = A
>         if random.randrange(0, 100) > 50:
>             offset = 97  # ascii 97 = a
>         return chr(random.randrange(0, 26) + offset)
>
> Logically, it's equivalent to the Lisp version.
>
> But is there any way to avoid using the local variable (offset) in the
> Python version?
Any time you port between languages, it's rarely a good idea to just
convert code verbatim. For example:

import random, string

def random_char():
    return random.choice(string.ascii_letters + string.digits)

-- 
http://mail.python.org/mailman/listinfo/python-list
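One caveat worth noting about the `random.choice` one-liner above: it picks uniformly over all 62 characters, whereas the original Lisp (and the OP's port) returns a digit about half the time and an upper- or lower-case letter a quarter of the time each. If that distribution matters and the only goal is to drop the `offset` local, a conditional expression does it; a minimal sketch, keeping the OP's `> 50` tests as written:

```python
import random

def random_char():
    """Random char from [0-9][a-z][A-Z], preserving the original
    ~50% digit / ~25% upper / ~25% lower split."""
    if random.randrange(0, 100) > 50:
        return chr(random.randrange(0, 10) + 48)   # ascii 48 = '0'
    # the conditional expression replaces the 'offset' local variable
    return chr(random.randrange(0, 26)
               + (65 if random.randrange(0, 100) > 50 else 97))  # 65 = 'A', 97 = 'a'
```

The `x if cond else y` form (added in Python 2.5) is the direct analogue of the inner Lisp `(if ... 65 97)`, since in Python it is an expression rather than a statement.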