Which version of Spark are you using?
Mayur Rustagi
Ph: +1 (760) 203 3257
http://www.sigmoidanalytics.com
@mayur_rustagi https://twitter.com/mayur_rustagi
On Mon, Mar 10, 2014 at 6:49 PM, abhinav chowdary
abhinav.chowd...@gmail.com wrote:
for anyone who is interested to know about the job server:
0.8.1. We used branch 0.8 and pulled the job server pull request into our
local repo. I remember we had to deal with a few issues, but once we were
through, it has been working great.
On Mar 10, 2014 6:51 PM, Mayur Rustagi mayur.rust...@gmail.com wrote:
Which version of Spark are you using?
Are you using it with HDFS? What version of Hadoop? 1.0.4?
Ognen
On 3/10/14, 8:49 PM, abhinav chowdary wrote:
for anyone who is interested in the job server from Ooyala: we started
using it recently and it has been working great so far.
On Feb 25, 2014 9:23 PM, Ognen Duzlevski
HDFS 1.0.4, but we primarily use Cassandra + Spark (Calliope). I tested it
with both.
Hi,
I am looking for ways to share the SparkContext, meaning I need to
be able to perform multiple operations on the same SparkContext.
Below is the code of a simple app I am testing:
import org.apache.spark.SparkContext

def main(args: Array[String]) {
  println("Welcome to example application!")
  val sc = new SparkContext("local", "ExampleApp")  // constructor args were cut off in the original; these are placeholders
}
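The sharing asked about here is possible because a single long-lived SparkContext is safe to share across threads for job submission. As a rough illustration of the pattern (plain Scala with a placeholder `Context` class standing in for a real SparkContext, since the snippet above is truncated and this sketch should run without a cluster):

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

object SharedContextDemo {
  // Placeholder standing in for a long-lived SparkContext: in Spark, one
  // SparkContext can likewise be created once and shared, with each
  // thread submitting its own jobs to it independently.
  class Context {
    def runJob(data: Seq[Int]): Int = data.sum
  }

  val sc = new Context  // created once, shared by all callers

  def main(args: Array[String]): Unit = {
    // Several callers submit independent jobs against the same context.
    val jobs = (1 to 4).map(i => Future(sc.runJob(1 to i)))
    val results = Await.result(Future.sequence(jobs), 10.seconds)
    println(results.mkString(","))  // prints 1,3,6,10
  }
}
```

With a real SparkContext the `Future` bodies would call actions like `rdd.count()`; the point is only that the context is constructed once and the concurrent callers share it.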
The fair scheduler merely reorders tasks. I think he is looking to run
multiple pieces of code on a single context, on demand from customers... if
the code order is decided, then the fair scheduler will ensure that all
tasks get equal cluster time :)
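For that on-demand, multi-customer case, Spark's documented mechanism is FAIR scheduling with pools: set spark.scheduler.mode=FAIR and point spark.scheduler.allocation.file at an allocation file. A sketch of such a file (the pool names, weights, and minShare values below are illustrative, not from this thread):

```xml
<?xml version="1.0"?>
<!-- Illustrative fairscheduler.xml; pool names and numbers are made up. -->
<allocations>
  <pool name="customerA">
    <schedulingMode>FAIR</schedulingMode>
    <weight>1</weight>
    <minShare>2</minShare>
  </pool>
  <pool name="customerB">
    <schedulingMode>FAIR</schedulingMode>
    <weight>1</weight>
    <minShare>2</minShare>
  </pool>
</allocations>
```

Each submitting thread then selects its pool before running jobs, e.g. sc.setLocalProperty("spark.scheduler.pool", "customerA"), so concurrent customers share cluster time rather than queueing FIFO.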
Mayur Rustagi
Ph: +919632149971
h