Unable to start Pi (hello world) application on Spark 1.4

2015-06-26 Thread ๏̯͡๏
This used to work with 1.3.1; with 1.4.0, however, I get the following exception.


export SPARK_HOME=/home/dvasthimal/spark1.4/spark-1.4.0-bin-hadoop2.4
export SPARK_JAR=/home/dvasthimal/spark1.4/spark-1.4.0-bin-hadoop2.4/lib/spark-assembly-1.4.0-hadoop2.4.0.jar
export HADOOP_CONF_DIR=/apache/hadoop/conf
cd $SPARK_HOME
./bin/spark-submit -v --master yarn-cluster \
  --driver-class-path /apache/hadoop/share/hadoop/common/hadoop-common-2.4.1-EBAY-2.jar:/apache/hadoop-2.4.1-2.1.3.0-2-EBAY/share/hadoop/yarn/lib/guava-11.0.2.jar \
  --jars /apache/hadoop/lib/hadoop-lzo-0.6.0.jar,/home/dvasthimal/spark1.4/spark-1.4.0-bin-hadoop2.4/lib/datanucleus-api-jdo-3.2.6.jar,/home/dvasthimal/spark1.4/spark-1.4.0-bin-hadoop2.4/lib/datanucleus-core-3.2.10.jar,/home/dvasthimal/spark1.4/spark-1.4.0-bin-hadoop2.4/lib/datanucleus-rdbms-3.2.9.jar \
  --num-executors 1 --driver-memory 4g \
  --driver-java-options "-XX:MaxPermSize=2G" \
  --executor-memory 2g --executor-cores 1 \
  --queue hdmi-express \
  --class org.apache.spark.examples.SparkPi \
  ./lib/spark-examples*.jar 10

*Exception*

15/06/26 14:24:42 INFO client.ConfiguredRMFailoverProxyProvider: Failing over to rm2

15/06/26 14:24:42 WARN ipc.Client: Exception encountered while connecting to the server : java.lang.IllegalArgumentException: Server has invalid Kerberos principal: hadoop/x-y-rm-2.vip.cm@corp.cm.com


I remember hitting this error with Spark 1.2.x as well; back then it also came down to how I got

*/apache/hadoop/share/hadoop/common/hadoop-common-2.4.1-EBAY-2.jar*

onto the classpath. With 1.3.1, passing it via --driver-class-path gets the job running, but with 1.4 that no longer works.

Please suggest.

-- 
Deepak


Re: Unable to start Pi (hello world) application on Spark 1.4

2015-06-28 Thread ๏̯͡๏
Any thoughts on this?

On Fri, Jun 26, 2015 at 2:27 PM, ÐΞ€ρ@Ҝ (๏̯͡๏)  wrote:



-- 
Deepak


Re: Unable to start Pi (hello world) application on Spark 1.4

2015-06-28 Thread ๏̯͡๏
Figured it out.

All the jars that were specified with --driver-class-path are now exported
through SPARK_CLASSPATH instead, and it works.

I thought SPARK_CLASSPATH was deprecated. It seems to keep flipping on and off between releases.
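Concretely, the workaround amounts to something like this (a sketch, reusing the jar paths from the original command):

```shell
# Export the driver-side jars through SPARK_CLASSPATH instead of passing
# them with --driver-class-path (same two jars as in the original command).
export SPARK_CLASSPATH=/apache/hadoop/share/hadoop/common/hadoop-common-2.4.1-EBAY-2.jar:/apache/hadoop-2.4.1-2.1.3.0-2-EBAY/share/hadoop/yarn/lib/guava-11.0.2.jar
echo "$SPARK_CLASSPATH"
# then run spark-submit as before, without the --driver-class-path option:
# ./bin/spark-submit -v --master yarn-cluster ... ./lib/spark-examples*.jar 10
```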

On Sun, Jun 28, 2015 at 12:55 PM, ÐΞ€ρ@Ҝ (๏̯͡๏)  wrote:

>


-- 
Deepak