Re: Re: application as a service

2014-08-18 Thread Zhanfeng Huo
That helps a lot. Thanks. Zhanfeng Huo From: Davies Liu Date: 2014-08-18 14:31 To: ryaminal CC: u...@spark.incubator.apache.org Subject: Re: application as a service Another option is to use Tachyon to cache the RDD; the cache can then be shared by different applications. See how to use

Re: application as a service

2014-08-17 Thread Davies Liu
Another option is to use Tachyon to cache the RDD; the cache can then be shared by different applications. See how to use Spark with Tachyon: http://tachyon-project.org/Running-Spark-on-Tachyon.html Davies On Sun, Aug 17, 2014 at 4:48 PM, ryaminal wrote: > You can also look into using ooyala's j
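A minimal sketch of the Tachyon approach, assuming a Tachyon master is reachable at tachyon://tachyon-master:19998 and using placeholder input and output paths (the linked page covers the real cluster configuration): one application materializes the RDD into Tachyon, and a second, independent application reads it back without recomputing it.

import org.apache.spark.{SparkConf, SparkContext}

// Writer application: materialize the data set into Tachyon so that other
// applications can pick it up without recomputing it.
object TachyonWriter {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("tachyon-writer"))
    val rdd = sc.textFile("hdfs:///path/to/input")          // placeholder source
    rdd.saveAsTextFile("tachyon://tachyon-master:19998/shared/input-cache")
    sc.stop()
  }
}

// Reader application: a separate Spark application that loads the shared copy
// directly from Tachyon instead of the original source.
object TachyonReader {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("tachyon-reader"))
    val cached = sc.textFile("tachyon://tachyon-master:19998/shared/input-cache")
    println(cached.count())
    sc.stop()
  }
}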

Re: Re: application as a service

2014-08-17 Thread Zhanfeng Huo
Thank you, Eugen Cepoi, I will try it now. Zhanfeng Huo From: Eugen Cepoi Date: 2014-08-17 23:34 To: Zhanfeng Huo CC: user Subject: Re: application as a service Hi, You can achieve this by running, for example, a spray service that has access to the RDD in question. When starting the app you

Re: application as a service

2014-08-17 Thread ryaminal
You can also look into using Ooyala's job server at https://github.com/ooyala/spark-jobserver This already has a spray server built in that allows you to do what has been explained above. Sounds like it should solve your problem. Enjoy!
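For reference, a job submitted to the job server implements its SparkJob trait rather than creating its own SparkContext; the server keeps a long-lived context, so RDDs cached by one HTTP-submitted job stay available to later jobs. A rough sketch, assuming the package and trait names used by the ooyala project at the time (spark.jobserver.SparkJob) and a hypothetical input.string parameter:

import com.typesafe.config.Config
import org.apache.spark.SparkContext
import spark.jobserver.{SparkJob, SparkJobInvalid, SparkJobValid, SparkJobValidation}

// Packaged as a jar, uploaded to the job server over HTTP, and run via a POST
// to /jobs; the value returned from runJob is sent back to the caller.
object WordCountJob extends SparkJob {

  // Called before runJob; reject requests that are missing the expected parameter.
  override def validate(sc: SparkContext, config: Config): SparkJobValidation =
    if (config.hasPath("input.string")) SparkJobValid
    else SparkJobInvalid("missing parameter input.string")

  // Count word occurrences in the submitted string using the shared SparkContext.
  override def runJob(sc: SparkContext, config: Config): Any =
    sc.parallelize(config.getString("input.string").split(" ").toSeq)
      .countByValue()
}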

Re: application as a service

2014-08-17 Thread Eugen Cepoi
Hi, You can achieve this by running, for example, a spray service that has access to the RDD in question. When starting the app you first build your RDD and cache it. In your spray "endpoints" you will translate the HTTP requests to operations on that RDD. 2014-08-17 17:27 GMT+02:00 Zhanfeng Huo :
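A minimal sketch of that setup, assuming spray-routing 1.x and a placeholder HDFS input path: the RDD is built and cached once when the application starts, and each HTTP endpoint is translated into an operation on that cached RDD.

import akka.actor.ActorSystem
import org.apache.spark.{SparkConf, SparkContext}
import spray.routing.SimpleRoutingApp

object RddHttpService extends App with SimpleRoutingApp {
  implicit val system = ActorSystem("rdd-http-service")

  // Build and cache the RDD once at startup.
  val sc = new SparkContext(new SparkConf().setAppName("rdd-as-a-service"))
  val lines = sc.textFile("hdfs:///path/to/data").cache()   // placeholder input
  lines.count()                                             // warm the cache before serving

  // Each HTTP request is translated into an operation on the cached RDD.
  startServer(interface = "0.0.0.0", port = 8080) {
    path("count") {
      get {
        complete(lines.count().toString)
      }
    } ~
    path("search" / Segment) { term =>
      get {
        complete(lines.filter(_.contains(term)).take(20).mkString("\n"))
      }
    }
  }
}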