Hi all,

I'd like to set up a static web server to host almost 1 TB of images.

It is an open-source dataset that I'd like to use to train a deep learning model.

I have free access to GPUs and an Internet connection on another platform, but 
they don't provide me with 1 TB of storage.

I also have $600 of credits in Google Cloud, and I was wondering if there is an 
easy way to set up something that can feed images to the machine on the other 
platform.

The data source is available as an AWS S3 bucket.  I tried to connect the GPU 
machine directly to the AWS bucket via awscli, but it is far too slow.  It is 
as if the bucket were intended for a complete sync rather than continuous 
random access to individual files.
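
For reference, this is roughly the access pattern I was using during training 
(a minimal sketch in boto3 rather than awscli; the bucket and key names are 
placeholders, not the real dataset):

    import boto3

    s3 = boto3.client("s3")

    def fetch_image(key, dest):
        # One GET request per image; every call pays the full request
        # latency, which is what makes this kind of random access so much
        # slower than a single bulk sync of the whole bucket.
        s3.download_file("example-dataset-bucket", key, dest)

    fetch_image("images/000001.jpg", "/tmp/000001.jpg")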

I've thought of two possible approaches:

        - Run a Python script in GAE to download the dataset and serve it from 
a GAE web server (a rough sketch of what I have in mind is below): 
https://cloud.google.com/appengine/docs/standard/python/getting-started/hosting-a-static-website

        - Run a Python script in GAE to download the dataset and serve it 
through Google Cloud CDN.
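
To make the first idea more concrete, here is a minimal sketch of the kind of 
app I have in mind. I'm using Flask here just as an illustration on my side 
(the linked doc actually configures static handlers in app.yaml), and it 
assumes the images have already been downloaded into a local images/ folder:

    from flask import Flask, send_from_directory

    app = Flask(__name__)

    # Serve any file under the local images/ directory, e.g.
    # GET /images/000001.jpg returns images/000001.jpg.
    @app.route("/images/<path:filename>")
    def serve_image(filename):
        return send_from_directory("images", filename)

The GPU machine would then fetch images over plain HTTP from this server (or 
from the CDN in front of it) during training.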

Do you think either of these approaches is a valid way to feed the model during 
training?

I'm a newbie with GAE, and any help, starting point, or idea would be very welcome.

Thanks in advance
