How can Spark read HDFS data with Kerberos?

2014-08-06 Thread Zhanfeng Huo
... doesn't take effect. Can you help me? Zhanfeng Huo
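A minimal sketch of one common way to read Kerberized HDFS from Spark (the thread's actual fix is truncated above): log in from a keytab with Hadoop's UserGroupInformation before creating the SparkContext. The principal, keytab, and HDFS paths below are placeholders, not values from this thread.

    import org.apache.hadoop.security.UserGroupInformation
    import org.apache.spark.{SparkConf, SparkContext}

    object KerberosHdfsRead {
      def main(args: Array[String]): Unit = {
        // Log in to Kerberos from a keytab before touching HDFS (placeholder principal/keytab).
        UserGroupInformation.loginUserFromKeytab("test/spark@TEST.REALM", "/home/test/test_spark.keytab")

        val sc = new SparkContext(new SparkConf().setAppName("kerberos-hdfs-read"))

        // Read a file from the Kerberized HDFS; the path is a placeholder.
        val lines = sc.textFile("hdfs://namenode:8020/user/test/input.txt")
        println(lines.count())

        sc.stop()
      }
    }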

Re: Re: Compile Spark code with IDEA successfully, but running SparkPi fails with "java.lang.SecurityException"

2014-08-13 Thread Zhanfeng Huo
Thank you, Ron. That helps a lot. I want to debug Spark code to trace state transitions, so I use sbt as my build tool and compile the Spark code in IntelliJ IDEA. Zhanfeng Huo From: Ron's Yahoo! Date: 2014-08-12 03:46 To: Zhanfeng Huo CC: user Subject: Re: Compile spark code with ...

application as a service

2014-08-17 Thread Zhanfeng Huo
... can access its RDD. How can I achieve this requirement? Thanks. Zhanfeng Huo

Re: Re: application as a service

2014-08-17 Thread Zhanfeng Huo
Thank you, Eugen Cepoi. I will try it now. Zhanfeng Huo From: Eugen Cepoi Date: 2014-08-17 23:34 To: Zhanfeng Huo CC: user Subject: Re: application as a service Hi, You can achieve it by running, for example, a spray service that has access to the RDD in question. When starting the app you ...
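The thread suggests a spray service. As a rough, self-contained sketch of the same idea (a long-running application that keeps one SparkContext and cached RDD alive and serves results over HTTP), the example below swaps spray for the JDK's built-in HttpServer; the path, port, and data location are placeholders.

    import java.net.InetSocketAddress
    import com.sun.net.httpserver.{HttpExchange, HttpHandler, HttpServer}
    import org.apache.spark.{SparkConf, SparkContext}

    object RddService {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("rdd-service"))

        // Build and cache the RDD once; the long-running app keeps it alive.
        val data = sc.textFile("hdfs://namenode:8020/user/test/data.txt").cache()

        val server = HttpServer.create(new InetSocketAddress(8080), 0)
        server.createContext("/count", new HttpHandler {
          override def handle(exchange: HttpExchange): Unit = {
            // Every request reuses the same cached RDD held by this driver.
            val body = data.count().toString.getBytes("UTF-8")
            exchange.sendResponseHeaders(200, body.length)
            exchange.getResponseBody.write(body)
            exchange.getResponseBody.close()
          }
        })
        server.start()
      }
    }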

Re: Re: application as a service

2014-08-18 Thread Zhanfeng Huo
That helps a lot. Thanks. Zhanfeng Huo From: Davies Liu Date: 2014-08-18 14:31 To: ryaminal CC: u...@spark.incubator.apache.org Subject: Re: application as a service Another option is using Tachyon to cache the RDD; then the cache can be shared by different applications. See how to use ...
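For the Tachyon option mentioned above, a hedged sketch assuming Spark 1.1-era APIs and a hypothetical Tachyon master address: persist the RDD off-heap so applications backed by the same Tachyon store can reuse the data, or simply write it out to a tachyon:// path.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.storage.StorageLevel

    object TachyonShare {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("tachyon-share")
          // Hypothetical Tachyon master address; adjust for your cluster.
          .set("spark.tachyonStore.url", "tachyon://tachyon-master:19998")
        val sc = new SparkContext(conf)

        val rdd = sc.textFile("hdfs://namenode:8020/user/test/data.txt")

        // Off-heap storage goes through Tachyon in Spark 1.1-era releases.
        rdd.persist(StorageLevel.OFF_HEAP)
        println(rdd.count())  // materializes the off-heap blocks

        // Alternative: write the data to Tachyon as plain files for other apps to read.
        // rdd.saveAsTextFile("tachyon://tachyon-master:19998/shared/data")

        sc.stop()
      }
    }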

Re: Re: How to pass env variables from master to executors within spark-shell

2014-08-20 Thread Zhanfeng Huo
... for ((key, value) <- sysProps) { System.setProperty(key, value) } Best Regards Zhanfeng Huo From: Akhil Das Date: 2014-08-21 14:36 To: Darin McBeath CC: Spark User Group Subject: Re: How to pass env variables from master to executors within spark-shell One approach would be to set these environment ...
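A sketch along the lines of the fragment above: capture the system properties you care about on the driver, then apply them on each executor JVM inside a mapPartitions. The property keys here are placeholders, not names from the thread.

    import org.apache.spark.{SparkConf, SparkContext}

    object PropagateSysProps {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("propagate-sys-props"))

        // Capture selected driver-side system properties (placeholder keys).
        val sysProps: Map[String, String] =
          Seq("my.app.env", "my.app.conf.path")
            .flatMap(k => Option(System.getProperty(k)).map(k -> _))
            .toMap

        val data = sc.parallelize(1 to 100, 4)

        // The captured map is serialized into the closure; each executor task
        // sets the properties on its own JVM before doing real work.
        val result = data.mapPartitions { iter =>
          for ((key, value) <- sysProps) { System.setProperty(key, value) }
          iter.map(_ * 2)
        }

        println(result.count())
        sc.stop()
      }
    }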

Can a value in spark-defaults.conf support system variables?

2014-09-01 Thread Zhanfeng Huo
Hi, all: Can a value in spark-defaults.conf support system variables, such as "mess = ${user.home}/${user.name}"? Best Regards, Zhanfeng Huo

Re: Re: Can a value in spark-defaults.conf support system variables?

2014-09-01 Thread Zhanfeng Huo
Thank you. Zhanfeng Huo From: Andrew Or Date: 2014-09-02 08:21 To: Zhanfeng Huo CC: user Subject: Re: Can a value in spark-defaults.conf support system variables? No, not currently. 2014-09-01 2:53 GMT-07:00 Zhanfeng Huo: Hi, all: Can a value in spark-defaults.conf support system variables ...
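Since (as of this thread) spark-defaults.conf does not expand placeholders like ${user.home}, one workaround is to resolve such values in the application itself before building the SparkConf. A minimal sketch, with the key "mess" taken from the question and everything else assumed:

    import org.apache.spark.{SparkConf, SparkContext}

    object ResolveAtRuntime {
      def main(args: Array[String]): Unit = {
        // Expand the system properties yourself, then hand the result to SparkConf.
        val mess = s"${sys.props("user.home")}/${sys.props("user.name")}"

        val conf = new SparkConf()
          .setAppName("resolve-at-runtime")
          .set("mess", mess)  // "mess" is the example key from the original question

        val sc = new SparkContext(conf)
        println(sc.getConf.get("mess"))
        sc.stop()
      }
    }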

How can I start the history server with Kerberos HDFS?

2014-09-03 Thread Zhanfeng Huo
... No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details: ... #history-server spark.history.kerberos.enabled true spark.history.kerberos.principal test/spark@test spark.history.kerberos.keytab /home/test/test_spark.keytab spark.eventLog.enabled true Zhanfeng Huo

Re: Re: How can I start the history server with Kerberos HDFS?

2014-09-03 Thread Zhanfeng Huo
Thanks for your help. It works after setting SPARK_HISTORY_OPTS. Zhanfeng Huo From: Andrew Or Date: 2014-09-04 07:52 To: Marcelo Vanzin CC: Zhanfeng Huo; user Subject: Re: How can I start the history server with Kerberos HDFS? Hi Zhanfeng, You will need to set these through SPARK_HISTORY_OPTS ...
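A rough sketch of the kind of spark-env.sh setting the thread points to, reusing the property names quoted in the question; the principal, keytab, and log directory values are placeholders.

    # conf/spark-env.sh (sketch): pass the Kerberos settings to the history server JVM.
    export SPARK_HISTORY_OPTS="-Dspark.history.kerberos.enabled=true \
      -Dspark.history.kerberos.principal=test/spark@TEST.REALM \
      -Dspark.history.kerberos.keytab=/home/test/test_spark.keytab \
      -Dspark.history.fs.logDirectory=hdfs://namenode:8020/spark-events"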

spark-1.1.0 with make-distribution.sh problem

2014-09-11 Thread Zhanfeng Huo
... ++ mvn help:evaluate -Dexpression=hadoop.version -Pyarn -Phive --skip-java-test --with-tachyon --tgz -Pyarn.version=2.3.0 -Phadoop.version=2.3.0 ++ grep -v INFO ++ tail -n 1 + SPARK_HADOOP_VERSION=' -X,--debug Produce execution debug output' Best Regards Zhanfeng Huo

Re: spark-1.1.0 with make-distribution.sh problem

2014-09-11 Thread Zhanfeng Huo
... resolved: ./make-distribution.sh --name spark-hadoop-2.3.0 --tgz --with-tachyon -Pyarn -Phadoop-2.3 -Dhadoop.version=2.3.0 -Phive -DskipTests. This code is a bit misleading. Zhanfeng Huo From: Zhanfeng Huo Date: 2014-09-12 14:13 To: user Subject: spark-1.1.0 with make-distribution.sh ...

Re: Re: spark-1.1.0 with make-distribution.sh problem

2014-09-14 Thread Zhanfeng Huo
Thank you very much. It is helpful for end users. Zhanfeng Huo From: Patrick Wendell Date: 2014-09-15 10:19 To: Zhanfeng Huo CC: user Subject: Re: spark-1.1.0 with make-distribution.sh problem Yeah, that issue has been fixed by adding better docs; it just didn't make it in time for ...
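For reference, the working invocation from the thread, with make-distribution.sh's own flags kept separate from the Maven profiles and properties that follow them:

    ./make-distribution.sh --name spark-hadoop-2.3.0 --tgz --with-tachyon \
      -Pyarn -Phadoop-2.3 -Dhadoop.version=2.3.0 -Phive -DskipTests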

SparkSql OutOfMemoryError

2014-10-28 Thread Zhanfeng Huo
... scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) 14/10/28 14:42:55 ERROR ActorSystemImpl: Uncaught fatal error from thread [sparkDriver-akka.actor.default-dispatcher-36] shutting down ActorSystem [sparkDriver] java.lang.OutOfMemoryError: Java heap space Zhanfeng Huo

Re: Re: SparkSql OutOfMemoryError

2014-10-28 Thread Zhanfeng Huo
It works, thanks very much. Zhanfeng Huo From: Yanbo Liang Date: 2014-10-28 18:50 To: Zhanfeng Huo CC: user Subject: Re: SparkSql OutOfMemoryError Try to increase the driver memory. 2014-10-28 17:33 GMT+08:00 Zhanfeng Huo: Hi, friends: I use Spark SQL (Spark 1.1) to operate on data in hive-0.12 ...
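A hedged sketch of the suggested fix: since the OutOfMemoryError above occurs in the driver's JVM, raise the driver heap when submitting. The class name, jar, and memory size below are placeholders, not values from the thread.

    # Raise the driver heap at submit time.
    ./bin/spark-submit --driver-memory 4g --class com.example.SqlJob my-app.jar

    # Or set the equivalent in conf/spark-defaults.conf:
    # spark.driver.memory   4g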