Hi all,
What is the most commonly used tool/product to benchmark a Spark job?
I have the following Hadoop/Spark cluster node configuration:
Nodes 1 and 2 are the ResourceManager and NameNode, respectively
Nodes 3, 4, and 5 each include a NodeManager and a DataNode
Node 7 is the Spark master, configured to run in yarn-client or yarn-cluster mode
I have tested it and it works fine.
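(For reference, one way to smoke-test both YARN modes on a setup like this is to submit the bundled SparkPi example; a hedged sketch for Spark 1.x, with the examples-jar path a placeholder for your install:)

```shell
# Run SparkPi with the driver on the submitting machine (yarn-client mode).
./bin/spark-submit --master yarn-client \
  --class org.apache.spark.examples.SparkPi \
  lib/spark-examples-*.jar 100

# Run it again with the driver inside the YARN cluster (yarn-cluster mode).
./bin/spark-submit --master yarn-cluster \
  --class org.apache.spark.examples.SparkPi \
  lib/spark-examples-*.jar 100
```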
Is there any
Why not try https://github.com/linkedin/camus - Camus is a Kafka to HDFS
pipeline.
On Tue, May 5, 2015 at 11:13 PM, Rendy Bambang Junior
rendy.b.jun...@gmail.com wrote:
Hi all,
I am planning to load data from Kafka to HDFS. Is it normal to use Spark
Streaming to load data from Kafka to HDFS?
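(Yes, this is a common pattern. A minimal sketch, assuming Spark 1.3.x with the spark-streaming-kafka artifact on the classpath; the broker address, topic name, and HDFS path below are placeholders:)

```scala
import kafka.serializer.StringDecoder
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object KafkaToHdfs {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("KafkaToHdfs")
    val ssc = new StreamingContext(conf, Seconds(60)) // one batch per minute

    // Placeholder broker list and topic set.
    val kafkaParams = Map("metadata.broker.list" -> "broker1:9092")
    val topics = Set("events")

    // Direct (receiver-less) stream; each record is a (key, value) pair.
    val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
      ssc, kafkaParams, topics)

    // Each batch is written as a timestamped directory of text files on HDFS.
    stream.map(_._2).saveAsTextFiles("hdfs:///data/events/batch")

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Note that `saveAsTextFiles` produces one directory per batch interval; a compaction job (or a tool like Camus, mentioned above) is often used downstream to avoid many small files.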
Hi all,
Is there any good documentation on how to integrate Spark with Hue 3.7.x?
Is the only way to install the Spark Job Server?
Thanks in advance for your help.
- Hi all,
- The application running and completed counts do not get updated; they are
always zero. I have run the
- SparkPi application at least 10 times. Please help.
-
- *Workers:* 3
- *Cores:* 24 Total, 0 Used
- *Memory:* 43.7 GB Total, 0.0 B Used
- *Applications:* 0 Running, 0
Your applications are probably not connecting to your existing cluster and
are instead running in local mode. Are you passing the master URL to the
SparkPi application?
Andrew
On Tue, Jun 3, 2014 at 12:30 AM, MrAsanjar . afsan...@gmail.com wrote:
- Hi all,
- Application running and completed count does not get
You need to pass
MASTER=spark://sanjar-local-machine-1:7077
before running your SparkPi example.
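(A sketch of what that looks like for a Spark 1.0-era standalone install, using the master URL from this thread; the `run-example` path and the spark-submit examples-jar path are placeholders for your layout:)

```shell
# Env-var form, as suggested above: the example registers with the
# standalone master instead of silently running in local mode.
MASTER=spark://sanjar-local-machine-1:7077 ./bin/run-example SparkPi 10

# Equivalent with spark-submit:
./bin/spark-submit --master spark://sanjar-local-machine-1:7077 \
  --class org.apache.spark.examples.SparkPi \
  lib/spark-examples-*.jar 10
```

Once the job connects to the cluster, the running/completed application counts on the master web UI should start incrementing.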
Thanks
Best Regards
On Tue, Jun 3, 2014 at 1:12 PM, MrAsanjar . afsan...@gmail.com wrote:
Thanks for your reply, Andrew. I am running the applications directly on the
master node. My cluster also