Re: Can not see any spark metrics on ganglia-web

2015-12-08 Thread SRK
Hi,

Should gmond be installed on all the Spark nodes? What should the host
and port be? Should they be the host and port of gmetad?

# Enable GangliaSink for all instances
*.sink.ganglia.class=org.apache.spark.metrics.sink.GangliaSink 
*.sink.ganglia.name=hadoop_cluster1 
*.sink.ganglia.host=localhost 
*.sink.ganglia.port=8653 
*.sink.ganglia.period=10 
*.sink.ganglia.unit=seconds 
*.sink.ganglia.ttl=1 
*.sink.ganglia.mode=multicast 
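
As far as I can tell from the rest of this thread, the host/port should be
those of a gmond receive channel rather than gmetad's. On a node, the channel
the local gmond listens on can be checked with something like the following
(the gmond.conf path is the usual default and may differ on your install):

# Show the UDP send/receive channels the local gmond is configured with
grep -A 4 -E 'udp_(send|recv)_channel' /etc/ganglia/gmond.conf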



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Can-not-see-any-spark-metrics-on-ganglia-web-tp14981p25636.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.




Re: Can not see any spark metrics on ganglia-web

2015-12-08 Thread SRK
Hi,

Where does the *.sink.csv.directory directory get created? I cannot see any
metrics in the logs. How did you verify ConsoleSink and CsvSink?
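
For reference, this is the kind of CsvSink block I mean (the directory path
here is only a placeholder; as far as I can tell the default is /tmp, and the
directory has to exist and be writable):

# Enable CsvSink for all instances
*.sink.csv.class=org.apache.spark.metrics.sink.CsvSink
*.sink.csv.period=10
*.sink.csv.unit=seconds
*.sink.csv.directory=/tmp/spark-metrics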

Thanks!



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Can-not-see-any-spark-metrics-on-ganglia-web-tp14981p25643.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.




Re: Can not see any spark metrics on ganglia-web

2015-12-08 Thread SRK
Hi,

I cannot see any metrics either. How did you verify that ConsoleSink and
CsvSink work OK? Where does *.sink.csv.directory get created?
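
In case it helps to compare notes, this is roughly what I understand a minimal
ConsoleSink entry to look like (the period/unit values are only examples):

# Enable ConsoleSink for all instances (prints metrics to each process's console output)
*.sink.console.class=org.apache.spark.metrics.sink.ConsoleSink
*.sink.console.period=10
*.sink.console.unit=seconds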





--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Can-not-see-any-spark-metrics-on-ganglia-web-tp14981p25644.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.




Re: Can not see any spark metrics on ganglia-web

2014-12-04 Thread danilopds
I used the command below because I'm using Spark 1.0.2 built with sbt, and it
worked:

SPARK_HADOOP_VERSION=2.2.0 SPARK_YARN=true SPARK_GANGLIA_LGPL=true sbt/sbt assembly
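
If it helps, one way to double-check that the Ganglia sink actually made it
into the build is to look for its class in the assembly jar (the path below is
a guess for a 1.0.2 sbt assembly; adjust it to wherever your jar lands):

# List the assembly contents and look for the Ganglia sink classes
jar tf assembly/target/scala-2.10/spark-assembly-*.jar | grep -i ganglia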



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Can-not-see-any-spark-metrics-on-ganglia-web-tp14981p20384.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.




Re: Can not see any spark metrics on ganglia-web

2014-10-02 Thread danilopds
Hi tsingfu,

I want to see metrics in ganglia too.
But I don't understand this step:
./make-distribution.sh --tgz --skip-java-test -Phadoop-2.3 -Pyarn -Phive
-Pspark-ganglia-lgpl 

Are you installing Hadoop, YARN, Hive AND Ganglia??

What if I only want to enable Ganglia?
Can you suggest something?

Thanks!



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Can-not-see-any-spark-metrics-on-ganglia-web-tp14981p15631.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.




Re: Can not see any spark metrics on ganglia-web

2014-10-02 Thread Krishna Sankar
Hi,
   I am sure you can use the -Pspark-ganglia-lgpl switch to enable Ganglia.
The other profiles (-Phadoop-2.3, -Pyarn, -Phive) only add support for Hadoop,
YARN, Hive et al. to the Spark executable; there is no need to include them if
you are not using them.
Cheers
k/
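
For instance, if Ganglia support is all you need on top of a plain build, I
would expect something like the following to work (untested; keep whatever
Hadoop/YARN/Hive profiles your cluster actually requires):

./make-distribution.sh --tgz --skip-java-test -Pspark-ganglia-lgpl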

On Thu, Oct 2, 2014 at 12:29 PM, danilopds danilob...@gmail.com wrote:

 Hi tsingfu,

 I want to see metrics in ganglia too.
 But I don't understand this step:
 ./make-distribution.sh --tgz --skip-java-test -Phadoop-2.3 -Pyarn -Phive
 -Pspark-ganglia-lgpl

 Are you installing Hadoop, YARN, Hive AND Ganglia??

 What if I only want to enable Ganglia?
 Can you suggest something?

 Thanks!



 --
 View this message in context:
 http://apache-spark-user-list.1001560.n3.nabble.com/Can-not-see-any-spark-metrics-on-ganglia-web-tp14981p15631.html
 Sent from the Apache Spark User List mailing list archive at Nabble.com.





Re: Can not see any spark metrics on ganglia-web

2014-10-02 Thread danilopds
Ok Krishna Sankar,

Regarding this information on the Spark monitoring webpage:
"For sbt users, set the SPARK_GANGLIA_LGPL environment variable before
building. For Maven users, enable the -Pspark-ganglia-lgpl profile."

Do you know what I need to do to build with sbt?
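
Would it be something along these lines, i.e. just setting the variable before
running the sbt assembly? (This is only my guess from the docs.)

SPARK_GANGLIA_LGPL=true sbt/sbt assembly
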
Thanks.



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Can-not-see-any-spark-metrics-on-ganglia-web-tp14981p15636.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.




Re: Can not see any spark metrics on ganglia-web

2014-09-25 Thread tsingfu
Hi, I found the problem.
By default, gmond listens on the multicast IP 239.2.11.71, while I had set
*.sink.ganglia.host=localhost.

The correct configuration in metrics.properties:
# Enable GangliaSink for all instances
*.sink.ganglia.class=org.apache.spark.metrics.sink.GangliaSink
#*.sink.ganglia.host=localhost
*.sink.ganglia.host=239.2.11.71
*.sink.ganglia.port=8653
*.sink.ganglia.period=10
*.sink.ganglia.unit=seconds
*.sink.ganglia.ttl=1
*.sink.ganglia.mode=multicast
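
Side note: a quick way to confirm that Spark metrics are actually reaching
gmond is to dump its XML report and look for the Spark metric entries (this
assumes gmond's default tcp_accept_channel port 8649 and that nc is installed):

# Dump the local gmond's XML and check whether Spark metric entries appear
nc localhost 8649 | grep '<METRIC' | head -n 20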




--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Can-not-see-any-spark-metrics-on-ganglia-web-tp14981p15128.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.




Can not see any spark metrics on ganglia-web

2014-09-23 Thread tsingfu
I installed Ganglia, and I think it works well for Hadoop and HBase, since I
can see Hadoop/HBase metrics on ganglia-web. I want to use Ganglia to monitor
Spark, and I followed these steps:

1) First I did a custom compile with -Pspark-ganglia-lgpl, and it succeeded
without warnings:
./make-distribution.sh --tgz --skip-java-test -Phadoop-2.3 -Pyarn -Phive
-Pspark-ganglia-lgpl

2) I configured conf/metrics.properties (8653 is the port I set for gmond) and
restarted the Spark Master and Workers:
vi conf/metrics.properties
# Enable GangliaSink for all instances
*.sink.ganglia.class=org.apache.spark.metrics.sink.GangliaSink
*.sink.ganglia.name=hadoop_cluster1
*.sink.ganglia.host=localhost
*.sink.ganglia.port=8653
*.sink.ganglia.period=10
*.sink.ganglia.unit=seconds
*.sink.ganglia.ttl=1
*.sink.ganglia.mode=multicast
sbin/stop-all.sh
sbin/start-all.sh

3) I refreshed my ganglia-web, but I cannot see any Spark metrics.

4) I made a test to verify whether the ConsoleSink and CSVSink sinks work OK,
and the result is OK: I found metrics in the logs and in *.sink.csv.directory.

I searched for topics about Ganglia and metrics on
http://apache-spark-user-list.1001560.n3.nabble.com , the Spark JIRA, and
Google, but found nothing useful. Could anyone give me some help or a
proposal?



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Can-not-see-any-spark-metrics-on-ganglia-web-tp14981.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.