> I have a simple question. I wish to generate an array of file pointers.
> For example, I have files:
>
>     data1.txt
>     data2.txt
>     data3.txt
>     ...
>
> I wish to generate a file pointer array so that I can read the files at
> the same time.
>
>     for index in range(N):
>         fid[index] = open('data%d.txt' % index, 'r')
You can let the files be closed by the garbage collector, but there is no
guarantee that this will happen in a timely manner. Best practice is to
close them manually once you are done. I personally prefer to use context
managers when possible.

Not sure it will work for you, but I would probably just read all the data
into memory first. I imagine file access would be faster reading each file
all at once rather than switching between files, unless you have very
large files.

    data = []
    for index in range(1, N + 1):  # files are numbered from 1 (see Chris Rebert's comment)
        with open('data%d.txt' % index, 'r') as f:
            data.append(f.readlines())

Ramit

Ramit Prasad | JPMorgan Chase Investment Bank | Currencies Technology
712 Main Street | Houston, TX 77002
work phone: 713 - 216 - 5423
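If you really do need all N files open at once (to interleave reads across
them, as the original question suggests) rather than reading them one after
another, the standard library's contextlib.ExitStack (Python 3.3+) can hold
any number of context managers and will close every file for you, even if an
exception occurs partway through. A minimal sketch, reusing the data%d.txt
naming from the post; the sample-file setup at the top is only there to make
the example self-contained:

```python
from contextlib import ExitStack

N = 3

# Create small sample files so this sketch runs on its own;
# in the original scenario the data files already exist.
for i in range(1, N + 1):
    with open('data%d.txt' % i, 'w') as f:
        f.write('line from file %d\n' % i)

with ExitStack() as stack:
    # Open all N files; ExitStack remembers each one and will
    # close them all when the with-block exits.
    fids = [stack.enter_context(open('data%d.txt' % i, 'r'))
            for i in range(1, N + 1)]
    # All N files are open here, so you can read from them in turn.
    first_lines = [f.readline() for f in fids]
# Every file has been closed automatically at this point.
```

This keeps the "array of file pointers" the questioner asked for while still
guaranteeing the files get closed, which a plain list of open() results does
not.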