Hi all,
I have two network interface cards on one node: one is an Ethernet card,
the other an InfiniBand HCA.
The master has two IP addresses, let's say 1.2.3.4 (for the Ethernet card)
and 2.3.4.5 (for the HCA).
I can start the master with
export SPARK_MASTER_IP='1.2.3.4'; sbin/start-master.sh
to let the master listen on 1.2.3.4, and I followed
http://spark.apache.org/docs/latest/spark-standalone.html#starting-a-cluster-manually
while starting the worker, like:
spark-1.0.1/bin/spark-class org.apache.spark.deploy.worker.Worker --ip 1.2.3.4 spark://1.2.3.4:7077
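In case it helps, here is a minimal sketch of binding both daemons to the InfiniBand address instead, using the standard conf/spark-env.sh mechanism (2.3.4.5 is the example HCA address from above; SPARK_MASTER_IP is the Spark 1.x variable name):

```shell
# conf/spark-env.sh on every node -- a sketch, assuming you want all
# Spark traffic to go over the InfiniBand (IPoIB) address 2.3.4.5
export SPARK_MASTER_IP=2.3.4.5   # address the standalone master binds to
export SPARK_LOCAL_IP=2.3.4.5    # address each local daemon binds to / advertises
```

With that in place, sbin/start-master.sh and the worker should both use the HCA interface; note every node needs its own local IPoIB address in SPARK_LOCAL_IP.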
Thanks
Best Regards
On Fri, Oct 24, 2014 at 12:34 PM, Theodore Si wrote:
Can anyone help me, please?
On 10/14/2014 9:58 PM, Theodore Si wrote:
Hi all,
I have two nodes, one as master (host1) and the other as
worker (host2). I am using the standalone mode.
After starting the master on host1, I run
$ export MASTER=spark://host1:7077
$ bin/run-example SparkPi 10
on host2, but I get this:
14/10/14 21:54:23 WARN TaskSchedulerImpl:
Hi all,
I want to use two nodes for test, one as master, the other worker.
Can I submit the example application included in Spark source code
tarball on master to let it run on the worker?
What should I do?
BR,
Theo
You submit the application to the master (the cluster manager) and the workers
will execute it.
Thanks
Best Regards
On Fri, Oct 10, 2014 at 2:47 PM, Theodore Si sjyz...@gmail.com wrote:
Hi all,
I want to use two nodes for a test, one as master, the other as worker.
Can I submit the example application included in the Spark source code tarball on the master to let it run on the worker?
Should I pack the example into a jar file and submit it on master?
On Fri, Oct 10, 2014 at 9:32 PM, Theodore Si sjyz...@gmail.com wrote:
But I cannot do this using
./bin/run-example SparkPi 10
right?
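For what it's worth, the examples are already packaged in the distribution, so there is no need to build a jar yourself. A sketch of submitting SparkPi directly with spark-submit (the jar filename below follows the Spark 1.0.x binary layout and is an assumption; adjust it to whatever ships in your lib/ directory):

```shell
# Sketch: submit the bundled SparkPi example to a standalone master.
# Assumes the master from this thread is running at spark://host1:7077
# and the examples jar name matches your distribution.
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://host1:7077 \
  lib/spark-examples-1.0.1-hadoop2.2.0.jar \
  10
```

run-example is just a convenience wrapper around the same mechanism; spark-submit gives you explicit control over the master URL and deploy options.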
On Fri, Oct 10, 2014 at 6:04 PM, Akhil Das ak...@sigmoidanalytics.com wrote:
Hi,
Let's say that I managed to port Spark from TCP/IP to RDMA.
What tool or benchmark can I use to test the performance improvement?
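Not a definitive answer, but one sketch: shuffle-heavy workloads move the most data over the network, so they are where a TCP-vs-RDMA difference should show up. The bundled GroupByTest example is one such workload (the argument order below — numMappers, numKVPairs, valSize, numReducers — is taken from the 1.x example source and may differ in other versions):

```shell
# Sketch: drive a shuffle-heavy example job against the cluster and
# compare wall-clock times with the TCP and RDMA transports.
# Assumes a standalone master at spark://host1:7077.
export MASTER=spark://host1:7077
# 100 mappers x 10000 key/value pairs of 1000 bytes, 36 reducers
bin/run-example GroupByTest 100 10000 1000 36
```

Timing the same run under both transports (and varying the value size) should make any network-layer improvement measurable.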
BR,
Theo
Hi all,
What tools should I use to benchmark Spark applications?
BR,
Theo
-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
How can I get figures like those in the Evaluation section of the following
paper?
http://www.cs.berkeley.edu/~matei/papers/2011/tr_spark.pdf
On 10/10/2014 10:35 AM, Theodore Si wrote:
Hi all,
What tools should I use to benchmark SPARK applications?
BR,
Theo
What can I get from it?
Can you show me some results please?
On 10/10/2014 10:46 AM, 牛兆捷 wrote:
You can try https://github.com/databricks/spark-perf
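A rough sketch of getting started with it (the template filename and run script below follow the project's README at the time and may have changed since):

```shell
# Sketch: fetch and run the Databricks spark-perf benchmark harness.
git clone https://github.com/databricks/spark-perf.git
cd spark-perf
# Copy the config template, then edit it to point at your cluster
# (master URL, Spark home, which test suites to enable).
cp config/config.py.template config/config.py
./bin/run
```

It runs a suite of core and MLlib workloads and reports per-test timings, which is close to the kind of numbers shown in the paper's evaluation.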
Hi,
Please help me with that.
BR,
Theodore Si