Hi,

I found this great tutorial from Ikai. It explains how to use the
Mapper API to import large CSV files from the blobstore to the
datastore, so it is essentially what I was looking for.
Here is the link for those who might have the same need:
http://ikaisays.com/2010/08/11/using-the-app-engine-mapper-for-bulk-data-import
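
The gist of it is a map() callback that receives one CSV record at a
time and writes a datastore entity for it. Roughly something like this
(the class, kind and property names below are just placeholders, not
the actual code from the tutorial; the Mapper wiring itself is
explained in Ikai's post):

import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;

// Hypothetical per-record handler; the Mapper framework would call
// this once per CSV line.
public class CsvRowImporter {
  private final DatastoreService datastore =
      DatastoreServiceFactory.getDatastoreService();

  public void handleLine(String line) {
    // Naive split: a real import should use a proper CSV parser to
    // cope with quoted fields and embedded commas.
    String[] cols = line.split(",");
    Entity row = new Entity("CsvRow");   // kind name is a placeholder
    row.setProperty("firstColumn", cols[0]);
    row.setProperty("secondColumn", cols[1]);
    datastore.put(row);
  }
}

In practice you would batch the put() calls, but this shows the idea.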

Cheers,
Toby

On Dec 2, 6:53 pm, Didier Durand <durand.did...@gmail.com> wrote:
> Hi,
>
> You have to distinguish between a Blob property stored in the datastore,
> whose limit is 1 MB, and a blob stored in the blobstore, whose limit is 2 GB.
>
> For a blob in the blobstore, you access it in chunks via the API: see the
> byte-range section in
> http://code.google.com/appengine/docs/java/blobstore/overview.html#Se...
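>
> A rough sketch of that chunked approach, combined with the task queue you
> mentioned (fetchData() is limited to roughly 1 MB per call; the handler URL
> and parameter names are made up, and I am assuming the non-labs taskqueue
> package of newer SDKs):
>
> import com.google.appengine.api.blobstore.BlobKey;
> import com.google.appengine.api.blobstore.BlobstoreService;
> import com.google.appengine.api.blobstore.BlobstoreServiceFactory;
> import com.google.appengine.api.taskqueue.QueueFactory;
> import com.google.appengine.api.taskqueue.TaskOptions;
>
> // Rough sketch: process one byte range per task, then enqueue the next one.
> public class ChunkedBlobProcessor {
>   private static final long CHUNK_SIZE = 1000000L; // stay under the ~1 MB fetch limit
>
>   public void processChunk(BlobKey blobKey, long blobSize, long start) {
>     BlobstoreService blobstore = BlobstoreServiceFactory.getBlobstoreService();
>     long end = Math.min(start + CHUNK_SIZE, blobSize) - 1; // end index is inclusive
>     byte[] chunk = blobstore.fetchData(blobKey, start, end);
>
>     // ... parse the CSV records in this chunk; records crossing the chunk
>     // boundary have to be carried over to the next task ...
>
>     if (end + 1 < blobSize) {
>       // "/processCsvChunk" is a made-up handler URL; blobSize would come
>       // from BlobInfoFactory in a real handler.
>       QueueFactory.getDefaultQueue().add(
>           TaskOptions.Builder.withUrl("/processCsvChunk")
>               .param("blobKey", blobKey.getKeyString())
>               .param("start", Long.toString(end + 1)));
>     }
>   }
> }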
>
> To read those chunks, you have to use the BlobstoreInputStream and its
> read() method. See
> http://code.google.com/appengine/docs/java/javadoc/com/google/appengin...
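>
> With BlobstoreInputStream you can also wrap the blob in a BufferedReader
> and read the CSV line by line without holding the whole file in memory.
> A minimal sketch (the class and method names are just for illustration):
>
> import java.io.BufferedReader;
> import java.io.IOException;
> import java.io.InputStreamReader;
> import com.google.appengine.api.blobstore.BlobKey;
> import com.google.appengine.api.blobstore.BlobstoreInputStream;
>
> // Minimal sketch: stream a CSV blob line by line.
> public class BlobCsvReader {
>   public void readCsv(BlobKey blobKey) throws IOException {
>     BufferedReader reader = new BufferedReader(
>         new InputStreamReader(new BlobstoreInputStream(blobKey), "UTF-8"));
>     try {
>       String line;
>       while ((line = reader.readLine()) != null) {
>         // handle one CSV record here
>       }
>     } finally {
>       reader.close();
>     }
>   }
> }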
>
> regards
> didier
>
> On Dec 2, 6:23 pm, Toby <toby.ro...@gmail.com> wrote:
>
> > Hi,
>
> > I am trying to figure out how to programmatically process a file from
> > the blobstore. It is a CSV file of about 100 MB. I read that the limit
> > is 1 MB, but I have seen that in Python it can be read partially (they
> > call it BlobReader).
> > Is this a feature currently not available in Java? Or is there a way I
> > can process this data step by step using the task queue?
>
> > Thank you,
> > Toby
