Returning to this original problem, I have restructured my program from a
single long procedure into three functions, which do the following:

serialize_pipeline_model(f): takes a file as input, reads it, and parses the
coordinate values (the numerical entries in the file) into a list

write_to_binary(): writes the generated list to a binary file (pickles it)

read_binary(): unpickles the aggregate of the merged lists, which should come
back as one large list (see the pickle round-trip sketch after this list)
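
For clarity, this is the basic pickle round trip I have in mind for a single
list. It is only a minimal sketch (the coords list is a placeholder); the
append-mode detail of my actual script is shown further down:

******
import pickle

coords = [1.25, 3.50, 7.75]      # placeholder for one file's z-coordinates

f = open("z_coords1.dat", "wb")  # same filename my script uses
pickle.dump(coords, f)           # serialize the list to the binary file
f.close()

f = open("z_coords1.dat", "rb")
restored = pickle.load(f)        # should give back the same list
f.close()

print restored                   # expect [1.25, 3.5, 7.75]
******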

The code goes like so:

******
import os        # needed for os.listdir in the directory loop below

z_coords1 = []

def serialize_pipeline_model(f):
    ....
    .....
    # z_coords1 = [] has been declared global
    global z_coords1
    charged_groups = (lys_charged_group + arg_charged_group + his_charged_group
                      + asp_charged_group + glu_charged_group)
    for i in range(len(charged_groups)):
        # pull the z-coordinate substring out of each record and store it as a float
        z_coords1.append(float(charged_groups[i][48:54]))

    # print z_coords1
    return z_coords1

import pickle, shelve
print '\nPickling z-coordinates list'

def write_to_binary():
    """ iteratively write successively generated z_coords1 to a binary file """
    f = open("z_coords1.dat", "ab")
    pickle.dump(z_coords1, f)
    f.close()
    return

def read_binary():
    """ read the binary list """
    print '\nUnpickling z-coordinates list'
    f = open("z_coords1.dat", "rb")
    z_coords1 = pickle.load(f)
    print(z_coords1)
    f.close()
    return

### LOOP OVER DIRECTORY
for f in os.listdir('/Users/spyros/Desktop/3NY8MODELSHUMAN/HomologyModels/'):
    serialize_pipeline_model(f)
    write_to_binary()

read_binary()
print '\n Z-VALUES FOR ALL CHARGED RESIDUES'
print z_coords1
******

The problem is that the list (z_coords1) comes back as an empty list. I know
the code works in a procedural format (z_coords1 is generated correctly; the
full script is too large to post here), so as a diagnostic I included a print
statement in the serialize function to confirm that the list is generated for
each of the ~500 files.
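
For reference, the diagnostic is roughly of this form (the exact print
statement in my script may differ slightly):

******
    # inside serialize_pipeline_model(), just before the return:
    print f, len(z_coords1)      # which file was processed, and how many z-values so far
******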

Short of some intricacy with the program's scoping that I may be missing, I am
not sure why this is happening. Does anybody have any ideas? Many thanks for
your time.
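
For completeness, my understanding of how global is supposed to behave is the
following minimal example (placeholder names, not my actual code):

******
items = []

def collect(x):
    global items         # bind the name to the module-level list
    items.append(x)      # so appends here should be visible outside the function

collect(1.0)
collect(2.0)
print items              # expect [1.0, 2.0]
******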

Best regards,
Spyros


On Fri, May 18, 2012 at 7:23 PM, Spyros Charonis <s.charo...@gmail.com> wrote:

> Dear Python community,
>
> I have a set of ~500 files which I would like to run a script on. My
> script extracts certain information and generates several lists with the
> items I need. For one of these lists, I need to combine the information
> from all 500 files into one super-list. Is there a way in which I can
> iteratively execute my script over all 500 files and have it write the
> list I need into a new file? Many thanks in advance for your time.
>
> Spyros
>
