Have you tried splitting the file into 80 MB files, zipping them, transferring
them to the mobile device, unzipping all the split files and finally merging
them?
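
Roughly something like this (an untested AIR sketch; the file names, the
splitAndCompress/uncompressAndMerge helpers and the 80 MB size are only
placeholders, and it uses plain zlib through ByteArray.compress() rather than
a real .zip container, so no extra library is needed). A small desktop AIR
tool would cut the database into pieces and compress each piece separately,
and the app on the device would then inflate each downloaded piece and append
it to the final file, so only one piece is held in memory at a time:

import flash.filesystem.File;
import flash.filesystem.FileMode;
import flash.filesystem.FileStream;
import flash.utils.ByteArray;
import flash.utils.CompressionAlgorithm;

// Desktop side: cut the database into ~80 MB pieces and compress each piece on its own.
function splitAndCompress(source:File):void {
    const CHUNK_SIZE:uint = 80 * 1024 * 1024;
    var input:FileStream = new FileStream();
    input.open(source, FileMode.READ);
    var index:int = 0;
    while (input.bytesAvailable > 0) {
        var piece:ByteArray = new ByteArray();
        var toRead:uint = input.bytesAvailable < CHUNK_SIZE ? input.bytesAvailable : CHUNK_SIZE;
        input.readBytes(piece, 0, toRead);
        piece.compress(CompressionAlgorithm.ZLIB);
        var partFile:File = source.parent.resolvePath("db.part" + index + ".zlib");
        var output:FileStream = new FileStream();
        output.open(partFile, FileMode.WRITE);
        output.writeBytes(piece);
        output.close();
        piece.clear();   // release the buffer before reading the next piece
        index++;
    }
    input.close();
}

// Device side: after the pieces are downloaded, inflate them one by one and
// append each to the final database file, so only one piece sits in memory at a time.
function uncompressAndMerge(partCount:int):void {
    var target:File = File.applicationStorageDirectory.resolvePath("database.db");
    var out:FileStream = new FileStream();
    out.open(target, FileMode.WRITE);
    for (var i:int = 0; i < partCount; i++) {
        var partFile:File = File.applicationStorageDirectory.resolvePath("db.part" + i + ".zlib");
        var input:FileStream = new FileStream();
        input.open(partFile, FileMode.READ);
        var piece:ByteArray = new ByteArray();
        input.readBytes(piece, 0, input.bytesAvailable);
        input.close();
        piece.uncompress(CompressionAlgorithm.ZLIB);
        out.writeBytes(piece);
        piece.clear();
    }
    out.close();
}

The important bit is that every piece is compressed on its own; if the whole
file is compressed as one stream, the device is back to inflating it in a
single go. And since one ~80 MB ByteArray still has to fit in memory, a
smaller piece size may be safer on the 512 MB devices.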
Just a thought.
Best regards.
On 2013-05-09 08:07, Deepak MS wrote:
Hi there,
Recently I have been working on an iPad app using Flex/AIR which downloads a
database file to the device and works as an offline app. But since the file
is very large (150 - 500 MB), I was asked whether we could zip it to cut down
the download time on the device. I did try Airxzip, FZip, etc., but all of
them ultimately use the ByteArray.uncompress() method to unzip the file. That
works just fine for desktop/web apps, but when it comes to mobile/iPad apps,
memory is a huge constraint. When I tried to unzip a 20 MB zipped file
(uncompressed size around 80 MB) using ByteArray.uncompress(), it worked fine
on a device with 512 MB RAM. But when I try to unzip a zipped file beyond
100 MB (compressed size 100 MB, uncompressed size around 320 MB), the app
crashes.
I suspect that since uncompress() tries to unzip the file in a single go, it
runs out of memory, and that is why the app crashes.
I just wanted to know whether there is any way we can unzip those huge files
in smaller chunks. That is, instead of unzipping the entire file in one shot,
can we keep some predefined buffer size and unzip within it? Something like
this, which I reckon is possible in Java:
http://stackoverflow.com/questions/14218178/how-to-unzip-large-zip-file-write-to-sdcard-in-less-time
ZipArchive for iOS also handles unzipping of huge files very well.
I still haven't found an alternative solution to this. I am downloading the
huge raw files directly and using them without unzipping, and it's really a
pain for the customers to wait 2 - 3 hours for the file to download.
Any thoughts on how we can go about it?
-Deepak