> I'm trying to process 100-150 jpgs that are 2-4 mb into smaller files.
> What advice/best practices can you guys share in regards to memory
> management? I can make my job go through about 5 before it blows up,
> and yes, I've upped the memory limit. I need some method to clear out
> memory after each loop iteration... Any ideas?
If you want a quick fix, write the output file immediately after processing each image, and then set the variable that stores the image to null. This should happen automatically if you reuse the same variable for each image. I am curious to see how you coded this so that memory grows with each additional image.

If it is a one-time job, you should run the script from the command line (no execution time limit, and you can set the memory limit as high as you need). Processing 100-150 images is not a realistic task for a single web request; if you want to implement background processing, a scheduler/queue is the way to go.
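Here is a rough sketch of the kind of loop I have in mind, assuming you're using the GD extension; the directories and target width are made up, so adjust them to your setup:

<?php
// resize.php -- run from the shell: php resize.php
// Sketch only: assumes GD; paths and target width are placeholders.
$srcDir = '/path/to/originals';
$dstDir = '/path/to/resized';
$targetWidth = 800;

foreach (glob($srcDir . '/*.jpg') as $srcPath) {
    $src = imagecreatefromjpeg($srcPath);
    if ($src === false) {
        continue; // skip unreadable files
    }

    $w = imagesx($src);
    $h = imagesy($src);
    $newW = $targetWidth;
    $newH = (int) round($h * $targetWidth / $w);

    $dst = imagecreatetruecolor($newW, $newH);
    imagecopyresampled($dst, $src, 0, 0, 0, 0, $newW, $newH, $w, $h);

    // Write the output immediately...
    imagejpeg($dst, $dstDir . '/' . basename($srcPath), 85);

    // ...then free both images before the next iteration.
    imagedestroy($src);
    imagedestroy($dst);
    unset($src, $dst);
}

If memory still climbs after that, print memory_get_usage() inside the loop; it will tell you quickly whether the images are actually being released each time around.

-John Campbell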
