Re: Working with large files and Memory
One more thing: if you can require 10.5, you can build a 64-bit app. You should *still* avoid unnecessary allocations, and likely read the file in chunks, to lower the amount of RAM your users will need and to avoid excessive VM swapping.

-- 
Scott Ribe
[EMAIL PROTECTED]
http://www.killerbytes.com/
(303) 722-0567 voice

___
Cocoa-dev mailing list (Cocoa-dev@lists.apple.com)
Please do not post admin requests or moderator comments to the list.
Contact the moderators at cocoa-dev-admins(at)lists.apple.com
Help/Unsubscribe/Update your Subscription:
http://lists.apple.com/mailman/options/cocoa-dev/archive%40mail-archive.com
This email sent to [EMAIL PROTECTED]
Re: Working with large files and Memory
Thank you to both for your good advice. I will look into this.

Carl.

On Tuesday, March 11, 2008, at 04:49PM, "Jens Alfke" <[EMAIL PROTECTED]> wrote:

> On 11 Mar '08, at 10:18 AM, Jean-Daniel Dupas wrote:
>
>> The first advice I can give you is "do not load the whole file into
>> memory".
>
> Absolutely.
>
>> Use a read stream to read chunks of data and process them (see
>> NSInputStream or NSFileHandle).
>
> Or if the file is simple ASCII text with newlines, you can use basic C
> stdio calls (fopen, fgets, fclose) to read a line at a time. You can
> either convert the line into an NSString, or just use something like
> sscanf to parse it.
>
> In rare situations where you absolutely do have to load a huge file
> into memory, i.e. for an algorithm that requires random access, your
> best bet is to memory-map it. -[NSData dataWithContentsOfFile:options:]
> has an option flag to map the file. This will avoid a lot of copying,
> but it's still subject to the same address-space limit if your process
> is 32-bit, so don't expect to be able to load anything much over a
> gigabyte.
>
> —Jens
Re: Working with large files and Memory
On 11 Mar '08, at 10:18 AM, Jean-Daniel Dupas wrote:

> The first advice I can give you is "do not load the whole file into
> memory".

Absolutely.

> Use a read stream to read chunks of data and process them (see
> NSInputStream or NSFileHandle).

Or if the file is simple ASCII text with newlines, you can use basic C
stdio calls (fopen, fgets, fclose) to read a line at a time. You can
either convert the line into an NSString, or just use something like
sscanf to parse it.

In rare situations where you absolutely do have to load a huge file into
memory, i.e. for an algorithm that requires random access, your best bet
is to memory-map it. -[NSData dataWithContentsOfFile:options:] has an
option flag to map the file. This will avoid a lot of copying, but it's
still subject to the same address-space limit if your process is 32-bit,
so don't expect to be able to load anything much over a gigabyte.

—Jens
Re: Working with large files and Memory
On 11 March 08 at 17:54, Carl E. McIntosh wrote:

> Can you please give advice about handling large data files with memory
> management techniques? I am attempting to read three large files (1 GB,
> 208 MB, 725 MB) sequentially and place the data into arrays for
> processing. Here is my pseudocode:
>
> 1) Import a file into an NSString.
>    NSString *aFileString = [NSString stringWithContentsOfFile: fileLocation]; // Convert file at path to text holder
>
> 2) Use NSScanner to pull out integers and floats.
>    NSScanner *aFileScanner = [[NSScanner alloc] initWithString: aFileString];
>
> 3) Store values into arrays.
>    float myFloats[10][2000]; or
>    int myInts[10][2000];
>
> 4) Repeat three times with 3 different files.
>
> This algorithm works for smaller files but chokes on the larger files
> and I get malloc errors. I've attempted to use NSZones, to the same
> failure.
>
> I have 4 GB of RAM and can hog off 2-3 GB for the process. I don't know
> how to explicitly allocate real memory. I'd rather not use virtual
> memory. Any references or examples would be appreciated.

The first advice I can give you is "do not load the whole file into
memory". Use a read stream to read chunks of data and process them (see
NSInputStream or NSFileHandle).

Maybe other people on this list have other advice too.
Re: Working with large files and Memory
On Tue, Mar 11, 2008 at 9:54 AM, Carl E. McIntosh <[EMAIL PROTECTED]> wrote:

> Can you please give advice about handling large data files with memory
> management techniques? I am attempting to read three large files (1 GB,
> 208 MB, 725 MB) sequentially and place the data into arrays for
> processing. Here is my pseudocode:
>
> 1) Import a file into an NSString.
>    NSString *aFileString = [NSString stringWithContentsOfFile: fileLocation];

Depending on the encoding of the file itself, this could end up
allocating a block of memory twice the size of the file.

> 2) Use NSScanner to pull out integers and floats.
>    NSScanner *aFileScanner = [[NSScanner alloc] initWithString: aFileString];

This has the potential of copying the passed-in string (not likely to be
a problem, as your string is immutable, but it is still something to keep
in mind).

> 3) Store values into arrays.
>    float myFloats[10][2000]; or
>    int myInts[10][2000];

Again, *another* huge block of memory.

> 4) Repeat three times with 3 different files.

Depending on how this is done (how tight your autorelease pools are,
etc.), the old string may not have been deallocated yet.

> This algorithm works for smaller files but chokes on the larger files
> and I get malloc errors. I've attempted to use NSZones, to the same
> failure.
>
> Can you please give advice about handling large data files with memory
> management techniques?

You'll have to use less memory. Remember that on a 32-bit machine, you
will never be able to allocate more than 4 GB of memory in your address
space. On top of that, much of that memory is already used by the
system's frameworks (usually between 1 and 2 GB).

> I have 4 GB of RAM and can hog off 2-3 GB for the process. I don't know
> how to explicitly allocate real memory. I'd rather not use virtual
> memory. Any references or examples would be appreciated.

Memory doesn't work like that (i.e. you cannot allocate "real" memory).
The closest that you can get is to allocate memory and "wire" it down;
however, this is a horrible idea for allocations this large (you'll
still run out of address space as before). Your best bet is to change
your algorithm so that it does not have to load the entire file at once
(perhaps load it a line at a time, or in blocks of 4K or so).

-- 
Clark S. Cox III
[EMAIL PROTECTED]