Clarence <[EMAIL PROTECTED]> writes:

> When you move your application to a 64-bit system to get a bigger
> address space for your millions/billions of integers in RAM, but the
> integers themselves get twice as big, you don't gain very much.

I don't think changing the underlying C type will help at all.  In
CPython 2.x the layout of an int object (from Include/intobject.h,
with PyObject_HEAD expanded) is as follows:

typedef struct {
    /* PyObject_HEAD expands to: */
    Py_ssize_t ob_refcnt;         /* reference count */
    struct _typeobject *ob_type;  /* pointer to the type object */
    /* the actual value */
    long ob_ival;
} PyIntObject;

On a 64-bit machine, that's 16 bytes for PyObject_HEAD (8 for the
reference count, 8 for the type pointer) plus 8 more bytes for the
value, 24 bytes total.  Changing long to int won't shrink the struct
to 20 bytes, because the compiler pads it back to 24, the nearest
multiple of its 8-byte alignment.  (Forcing the compiler to pack the
struct won't help either, because malloc will still pad the
allocation for you.)
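
As a quick sanity check, you can reproduce the padding with ctypes.
This is just a mock of the layout above, not the real object header;
c_long stands in for Py_ssize_t, which has the same size on the usual
LP64 platforms:

import ctypes

class IntObjLong(ctypes.Structure):        # value field is a long
    _fields_ = [("ob_refcnt", ctypes.c_long),
                ("ob_type", ctypes.c_void_p),
                ("ob_ival", ctypes.c_long)]

class IntObjInt(ctypes.Structure):         # value field is only an int
    _fields_ = [("ob_refcnt", ctypes.c_long),
                ("ob_type", ctypes.c_void_p),
                ("ob_ival", ctypes.c_int)]

print(ctypes.sizeof(IntObjLong))   # 24 on a 64-bit build
print(ctypes.sizeof(IntObjInt))    # still 24: padded to 8-byte alignment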

If you need to store millions of integers compactly, array.array('i')
may help: it packs them into a single buffer of raw C ints, typically
4 bytes apiece, with no per-object overhead.
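
To put rough numbers on it (a sketch; exact figures vary by platform
and Python version):

import array

n = 1000000
compact = array.array('i', range(n))   # raw C ints in one buffer

print(compact.itemsize)                             # 4 bytes per element
print(compact.buffer_info()[1] * compact.itemsize)  # ~4 MB of payload

A list of a million distinct int objects would need roughly 24 bytes
per object plus an 8-byte pointer in the list itself, around 32 MB.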