Re: How to keep RDDs in memory between two different batch jobs?

2015-07-22 Thread ericacm
Tachyon is one way. Also check out the Spark Job Server: https://github.com/spark-jobserver/spark-jobserver
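For illustration, here is a minimal sketch of the named-RDD approach the Spark Job Server takes, assuming the SparkJob/NamedRddSupport API from the linked repository (the RDD name and input path are made up):

    import com.typesafe.config.Config
    import org.apache.spark.SparkContext
    import spark.jobserver.{NamedRddSupport, SparkJob, SparkJobValid, SparkJobValidation}

    // Job 1: compute an RDD once and publish it under a name, so later jobs
    // submitted to the same long-running context can reuse it without recomputing.
    object BuildSharedRdd extends SparkJob with NamedRddSupport {
      override def validate(sc: SparkContext, config: Config): SparkJobValidation = SparkJobValid
      override def runJob(sc: SparkContext, config: Config): Any = {
        val data = sc.textFile("/some/input/path").cache() // hypothetical input
        namedRdds.update("shared-rdd", data)               // register under a name
        data.count()
      }
    }

    // Job 2: look the cached RDD up by name instead of rebuilding it.
    object ReuseSharedRdd extends SparkJob with NamedRddSupport {
      override def validate(sc: SparkContext, config: Config): SparkJobValidation = SparkJobValid
      override def runJob(sc: SparkContext, config: Config): Any = {
        val shared = namedRdds.get[String]("shared-rdd")
          .getOrElse(sys.error("shared-rdd not found"))
        shared.filter(_.contains("ERROR")).count()
      }
    }

The key point is that both jobs must run in the same persistent context; the cached RDD lives only as long as that context does.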

Re: How to keep RDDs in memory between two different batch jobs?

2015-07-22 Thread ericacm
Actually, I should clarify - Tachyon is a way to keep your data in RAM, but it's not exactly the same as keeping it cached in Spark. Spark Job Server is a way to keep it cached in Spark.
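As a rough sketch of the Tachyon route (the Tachyon master address and paths are assumptions, not from this thread): one job writes its result into Tachyon's in-memory file system, and a completely separate job - a different SparkContext - reads it back. The data stays in RAM, but the second job still pays the cost of re-reading and deserializing it, rather than getting a live Spark-cached RDD:

    // Job 1: materialize the computed RDD into Tachyon.
    val results = sc.textFile("hdfs:///input/events").filter(_.nonEmpty)
    results.saveAsTextFile("tachyon://tachyon-master:19998/shared/events")

    // Job 2, possibly hours later in a new SparkContext: read it back from RAM.
    val cached = sc.textFile("tachyon://tachyon-master:19998/shared/events")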

Re: Cannot run SimpleApp as regular Java app

2014-09-19 Thread ericacm
On Sep 18, 2014 at 11:58 AM, ericacm [via Apache Spark User List] wrote: Upgrading from spark-1.0.2-hadoop2 to spark-1.1.0-hadoop1 fixed my problem.

Re: Cannot run SimpleApp as regular Java app

2014-09-18 Thread ericacm
Upgrading from spark-1.0.2-hadoop2 to spark-1.1.0-hadoop1 fixed my problem.
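For anyone hitting the same thing: the underlying fix is keeping the driver's Spark dependency aligned with the build running on the cluster. A minimal build.sbt sketch (the Spark version comes from this thread; everything else is assumed):

    // build.sbt
    scalaVersion := "2.10.4"
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0" % "provided"

Marking spark-core as "provided" keeps the cluster's own Spark jars authoritative when the app is launched via spark-submit.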

Re: Cannot run SimpleApp as regular Java app

2014-09-09 Thread ericacm
Hi Yana - I added the following to spark-class:

    echo RUNNER: $RUNNER
    echo CLASSPATH: $CLASSPATH
    echo JAVA_OPTS: $JAVA_OPTS
    echo '$@': $@

Here's the output:

    $ ./spark-submit --class experiments.SimpleApp --master spark://myhost.local:7077 …

Cannot run SimpleApp as regular Java app

2014-09-08 Thread ericacm
Dear all: I am a brand new Spark user trying out the SimpleApp from the Quick Start page. Here is the code:

    object SimpleApp {
      def main(args: Array[String]) {
        val logFile = "/dev/spark-1.0.2-bin-hadoop2/README.md" // Should be some file on your system
        val conf = new SparkConf() …
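For context, the full example on the Quick Start page for Spark 1.x looks roughly like this (reconstructed from the docs, not quoted from the original message):

    import org.apache.spark.{SparkConf, SparkContext}

    object SimpleApp {
      def main(args: Array[String]) {
        val logFile = "/dev/spark-1.0.2-bin-hadoop2/README.md" // Should be some file on your system
        val conf = new SparkConf().setAppName("Simple Application")
        val sc = new SparkContext(conf)
        val logData = sc.textFile(logFile, 2).cache()
        val numAs = logData.filter(line => line.contains("a")).count()
        val numBs = logData.filter(line => line.contains("b")).count()
        println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
      }
    }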

Re: Programmatically running of the Spark Jobs.

2014-09-04 Thread ericacm
Ahh - that probably explains an issue I am seeing. I am a brand new user and I tried running the SimpleApp class that is on the Quick Start page (http://spark.apache.org/docs/latest/quick-start.html). When I use conf.setMaster("local") then I can run the class directly from my IDE. But when I try …
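A sketch of the difference (the cluster URL and jar path are placeholders): with a local master the application classes are already on the driver's classpath, so the IDE run just works; pointing the same code at a standalone cluster additionally requires getting the application jar to the executors, e.g.:

    val conf = new SparkConf()
      .setAppName("Simple Application")
      .setMaster("local[*]") // in-process: runs directly from the IDE
      // To target a standalone cluster from the IDE, the app jar must reach the executors:
      // .setMaster("spark://myhost.local:7077")
      // .setJars(Seq("target/scala-2.10/simpleapp_2.10-0.1.jar")) // hypothetical path
    val sc = new SparkContext(conf)

(As the "Cannot run SimpleApp" messages above show, the actual fix in this case was aligning the driver's Spark version with the cluster's.)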