I have a Python script that automatically downloads zip files containing large datasets from another server and then unzips them for further processing. It runs as a geoprocessing service under ArcGIS Server. The script works fine when each dataset is only a few kilobytes in size, but it stops halfway when the datasets are about 11,000 KBytes. I suspect the execution time is too long and ArcGIS Server simply kills the process. What can I try to reduce the execution time? ArcGIS Server runs only as a 32-bit process, and I was told the maximum memory it can utilise is 4 GBytes. I should be grateful if someone could make suggestions/recommendations. Sincerely, David
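[One thing worth checking is whether the script holds the whole archive in memory at once. A minimal sketch of a lower-memory approach — streaming the download to disk in chunks and then extracting with the standard-library `zipfile` module; the function names and the URL are illustrative, not from the original script:]

```python
import shutil
import urllib.request
import zipfile


def download_file(url, dest_path, chunk_size=1024 * 1024):
    """Stream a remote file to disk in fixed-size chunks.

    copyfileobj() copies chunk_size bytes at a time, so the whole
    archive is never resident in memory -- important in a 32-bit
    process with a hard 4 GB address-space ceiling.
    """
    with urllib.request.urlopen(url) as response, open(dest_path, "wb") as out:
        shutil.copyfileobj(response, out, chunk_size)


def extract_zip(zip_path, extract_dir):
    """Extract an archive member by member.

    ZipFile reads and decompresses members incrementally rather than
    loading the entire archive into memory at once.
    """
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(extract_dir)
```

[Splitting download and extraction into separate steps also makes it easier to see which phase is hitting the server's execution-time limit.]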
_______________________________________________ Web-SIG mailing list Web-SIG@python.org Web SIG: http://www.python.org/sigs/web-sig Unsubscribe: http://mail.python.org/mailman/options/web-sig/archive%40mail-archive.com