Re: Apache Spark on Raspberry Pi Cluster with Docker

2015-11-03 Thread Akhil Das
Can you try it with just:

spark-submit --master spark://master:6066 --class SimpleApp
target/simple-project-1.0.jar

And see if it works?

An even better idea would be to spawn a spark-shell (MASTER=spark://master:6066
bin/spark-shell) and try out a simple sc.parallelize(1 to 1000).collect
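
A minimal smoke test to paste into that shell (assuming it comes up and connects
to your standalone master) could be:

// builds a small RDD and pulls it back to the driver;
// collect() should return 1000 elements if master, workers and driver can all reach each other
val rdd = sc.parallelize(1 to 1000)
rdd.collect().length   // expect: 1000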



Thanks
Best Regards

On Wed, Oct 28, 2015 at 7:53 PM, Mark Bonnekessel <mar...@mailbox.org>
wrote:

> Hi,
>
> we are trying to set up Apache Spark on a Raspberry Pi cluster for
> educational use.
> Spark is installed in a Docker container and all necessary ports are
> exposed.
>
> After we start master and workers, all workers are listed as alive in the
> master web UI (http://master:8080).
>
> I want to run the SimpleApp example from the Spark quick-start guide (
> http://spark.apache.org/docs/latest/quick-start.html#self-contained-applications)
> on the cluster to verify that everything is working, but I cannot get it
> to run.
>
> I built a jar file and submitted the application with spark-submit and
> got the following output:
> spark-submit --master spark://master:6066 --deploy-mode cluster --class
> SimpleApp target/simple-project-1.0.jar
> Running Spark using the REST application submission protocol.
> Using Spark's default log4j profile:
> org/apache/spark/log4j-defaults.properties
> 15/10/28 12:54:43 INFO RestSubmissionClient: Submitting a request to
> launch an application in spark://localhost:6066.
> 15/10/28 12:54:43 INFO RestSubmissionClient: Submission successfully
> created as driver-20151028115443-0002. Polling submission state...
> 15/10/28 12:54:43 INFO RestSubmissionClient: Submitting a request for the
> status of submission driver-20151028115443-0002 in spark://localhost:6066.
> 15/10/28 12:54:43 INFO RestSubmissionClient: State of driver
> driver-20151028115443-0002 is now SUBMITTED.
> 15/10/28 12:54:43 INFO RestSubmissionClient: Server responded with
> CreateSubmissionResponse:
> {
>   "action" : "CreateSubmissionResponse",
>   "message" : "Driver successfully submitted as
> driver-20151028115443-0002",
>   "serverSparkVersion" : "1.5.1",
>   "submissionId" : "driver-20151028115443-0002",
>   "success" : true
> }
>
> The driver is created correctly, but it never starts the application.
> What am I missing?
>
> Regards,
> Mark
>
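
A way to dig further on a driver that stays in SUBMITTED (just a sketch, assuming
the Spark 1.5 standalone REST gateway that produced the log above) is to poll the
driver's status with spark-submit and then check the driver's stderr, which the
assigned worker normally writes under its work/ directory:

spark-submit --master spark://master:6066 --status driver-20151028115443-0002

If the state never leaves SUBMITTED, it often means no worker has enough free
memory or cores to launch the driver; if it goes to ERROR or FAILED, the
worker-side driver log should say why.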


Apache Spark on Raspberry Pi Cluster with Docker

2015-10-28 Thread Mark Bonnekessel
Hi,

we are trying to set up Apache Spark on a Raspberry Pi cluster for educational
use.
Spark is installed in a Docker container and all necessary ports are exposed.

After we start master and workers, all workers are listed as alive in the
master web UI (http://master:8080).

I want to run the SimpleApp example from the Spark quick-start guide
(http://spark.apache.org/docs/latest/quick-start.html#self-contained-applications)
on the cluster to verify that everything is working, but I cannot get it to run.
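
For reference, the example from that page is roughly the following (the README
path is just the placeholder from the guide; it has to point to a file that
actually exists wherever the driver runs):

/* SimpleApp.scala */
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

object SimpleApp {
  def main(args: Array[String]) {
    // placeholder path from the quick-start guide
    val logFile = "YOUR_SPARK_HOME/README.md"
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
  }
}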

I built a jar file and submitted the application with spark-submit and got the
following output:
spark-submit --master spark://master:6066 --deploy-mode cluster --class
SimpleApp target/simple-project-1.0.jar
Running Spark using the REST application submission protocol.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/10/28 12:54:43 INFO RestSubmissionClient: Submitting a request to launch an
application in spark://localhost:6066.
15/10/28 12:54:43 INFO RestSubmissionClient: Submission successfully created as 
driver-20151028115443-0002. Polling submission state...
15/10/28 12:54:43 INFO RestSubmissionClient: Submitting a request for the
status of submission driver-20151028115443-0002 in spark://localhost:6066.
15/10/28 12:54:43 INFO RestSubmissionClient: State of driver 
driver-20151028115443-0002 is now SUBMITTED.
15/10/28 12:54:43 INFO RestSubmissionClient: Server responded with 
CreateSubmissionResponse:
{
  "action" : "CreateSubmissionResponse",
  "message" : "Driver successfully submitted as driver-20151028115443-0002",
  "serverSparkVersion" : "1.5.1",
  "submissionId" : "driver-20151028115443-0002",
  "success" : true
}

The driver is created correctly, but it never starts the application.
What am I missing?

Regards,
Mark


Spark on Raspberry Pi?

2014-09-11 Thread Sandeep Singh
Has anyone tried using Raspberry Pi for Spark? How efficient is it to use
around 10 Pis for a local testing environment?



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-on-Raspberry-Pi-tp13965.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Fwd: Spark on Raspberry Pi?

2014-09-11 Thread Chen He
The Pi's bus speed, memory size, memory access speed, and processing power are
limited. The only benefit could be the low power consumption.

On Thu, Sep 11, 2014 at 8:04 AM, Sandeep Singh sand...@techaddict.me
wrote:

 Has anyone tried using Raspberry Pi for Spark? How efficient is it to use
 around 10 Pi's for local testing env ?







Re: Spark on Raspberry Pi?

2014-09-11 Thread Daniil Osipov
Limited memory could also cause you some problems and limit usability. If
you're looking for a local testing environment, Vagrant boxes may serve you
much better.

On Thu, Sep 11, 2014 at 6:18 AM, Chen He airb...@gmail.com wrote:




 Pi's bus speed, memory size and access speed, and processing ability are
 limited. The only benefit could be the power consumption.

 On Thu, Sep 11, 2014 at 8:04 AM, Sandeep Singh sand...@techaddict.me
 wrote:

 Has anyone tried using Raspberry Pi for Spark? How efficient is it to use
 around 10 Pi's for local testing env ?









Re: Spark on Raspberry Pi?

2014-09-11 Thread Aniket Bhatnagar
Just curious... What's the use case you are looking to implement?
On Sep 11, 2014 10:50 PM, Daniil Osipov daniil.osi...@shazam.com wrote:

 Limited memory could also cause you some problems and limit usability. If
 you're looking for a local testing environment, vagrant boxes may serve you
 much better.

 On Thu, Sep 11, 2014 at 6:18 AM, Chen He airb...@gmail.com wrote:




 Pi's bus speed, memory size and access speed, and processing ability are
 limited. The only benefit could be the power consumption.

 On Thu, Sep 11, 2014 at 8:04 AM, Sandeep Singh sand...@techaddict.me
 wrote:

 Has anyone tried using Raspberry Pi for Spark? How efficient is it to use
 around 10 Pi's for local testing env ?










Re: Spark on Raspberry Pi?

2014-09-11 Thread Chanwit Kaewkasi
We've found that the Raspberry Pi is not enough for Hadoop/Spark, mainly
because of its memory consumption. What we've built instead is a cluster formed
from 22 Cubieboards, each with 1 GB of RAM.

Best regards,

-chanwit

--
Chanwit Kaewkasi
linkedin.com/in/chanwit


On Thu, Sep 11, 2014 at 8:04 PM, Sandeep Singh sand...@techaddict.me wrote:
 Has anyone tried using Raspberry Pi for Spark? How efficient is it to use
 around 10 Pi's for local testing env ?




