You'll need to isolate your program's persistence API and then write an implementation of it specifically for GAE. Depending on your budget and load, you may be able to persist your data into BigTable using normal GAE persistence (the datastore). Then you could make that data available with loading logic that looks up your data sets in BigTable and turns them into CSV streams.

Another option is to marshal your data into CSV (or some other format) and dump it into the Blobstore. Again, you'll need to construct a way to serve the data back. The nice thing about the Blobstore is that if you marshal the data into CSV, you can give a client a URL that serves the data out of the Blobstore directly. The nice thing about BigTable is that, depending on your filtering or other runtime presentation requirements, you can slice and dice the data, provided you index it appropriately.
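As a rough sketch of the marshalling step (assuming your stock records come back from a datastore query as dict-like records with fields such as `symbol` and `close` — adjust to your actual model), turning them into a CSV stream might look like:

```python
import csv
import io


def rows_to_csv(rows, fieldnames):
    """Marshal a list of dict records (e.g. entities fetched from the
    datastore) into a single CSV string, header row first."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    for row in rows:
        writer.writerow(row)
    return buf.getvalue()


# Hypothetical records standing in for datastore query results.
stocks = [
    {"symbol": "AAPL", "close": 71.84},
    {"symbol": "GOOG", "close": 339.05},
]
print(rows_to_csv(stocks, ["symbol", "close"]))
```

In a request handler you'd write that string to the response with a `text/csv` content type, or write it into the Blobstore so clients can fetch it by URL later.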
On Friday, December 7, 2012 8:30:14 PM UTC-5, Jason Hsu wrote:

> The app I'm working on processes information on over 5000 stocks and saves the output in *.csv files.
>
> What I'm trying to do is get my app to automatically run nightly on a remote server and make the *.csv files publicly accessible.
>
> How do I go about doing this with Google App Engine? The Python 2.7 tutorial revolves around creating a web site with HTML pages. I just want to run my script at regular intervals and make the output files publicly viewable.
>
> Is Google App Engine for me, or would another solution be more appropriate?

--
You received this message because you are subscribed to the Google Groups "Google App Engine" group.
To view this discussion on the web visit https://groups.google.com/d/msg/google-appengine/-/armkoir_rskJ.
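For the "run nightly" half of the quoted question: App Engine's cron service can hit one of your handler URLs on a schedule, so the script itself just lives behind a regular handler. A minimal `cron.yaml` might look like this (the `/tasks/update_stocks` URL is hypothetical — point it at whatever handler runs your processing script):

```yaml
cron:
- description: nightly stock data refresh
  url: /tasks/update_stocks
  schedule: every day 02:00
```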