Re: Server IPC version 9 cannot communicate with client version 4

2015-03-25 Thread Saisai Shao
It looks like you have to build Spark against the matching Hadoop version; otherwise you will hit the exception you mentioned. You can follow this doc:
http://spark.apache.org/docs/latest/building-spark.html
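
For example, the doc's suggested invocation for an Apache Hadoop 2.2.x cluster looks roughly like the sketch below (run from the top of the Spark source tree; adjust -Dhadoop.version to whatever "hadoop version" reports on your cluster):

  # Give Maven enough memory first, as the building-spark doc recommends
  export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
  # Build against the Hadoop 2.2 line so the HDFS client protocol matches the cluster
  mvn -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests clean package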

2015-03-25 15:22 GMT+08:00 sandeep vura sandeepv...@gmail.com:

 Hi Sparkers,

 I am trying to load data in Spark with the following command:

 sqlContext.sql("LOAD DATA LOCAL INPATH '/home/spark12/sandeep/sandeep.txt' INTO TABLE src")

 and I am getting the exception below:

 Server IPC version 9 cannot communicate with client version 4

 Note: I am using Hadoop 2.2, Spark 1.2, and Hive 0.13.
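
For what it's worth, LOAD DATA is a Hive statement, so it normally goes through a HiveContext in the Spark 1.2 shell. A minimal sketch, assuming a Hive-enabled Spark build and that the table src already exists:

  $ bin/spark-shell
  scala> import org.apache.spark.sql.hive.HiveContext
  scala> val hiveContext = new HiveContext(sc)  // sc is the SparkContext the shell creates
  scala> hiveContext.sql("LOAD DATA LOCAL INPATH '/home/spark12/sandeep/sandeep.txt' INTO TABLE src")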







Re: Server IPC version 9 cannot communicate with client version 4

2015-03-25 Thread Akhil Das
Is your Spark compiled against Hadoop 2.2? If not, download the Spark 1.2 binary built for Hadoop 2.2 from https://spark.apache.org/downloads.html
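
One quick way to check is the name of the assembly jar shipped with the Spark distribution; a sketch, assuming the usual prebuilt layout and that SPARK_HOME points at your install:

  ls $SPARK_HOME/lib/spark-assembly-*.jar
  # e.g. spark-assembly-1.2.0-hadoop2.4.0.jar means it was built against Hadoop 2.4.0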

Thanks
Best Regards

On Wed, Mar 25, 2015 at 12:52 PM, sandeep vura sandeepv...@gmail.com
wrote:

 Hi Sparkers,

 I am trying to load data in Spark with the following command:

 sqlContext.sql("LOAD DATA LOCAL INPATH '/home/spark12/sandeep/sandeep.txt' INTO TABLE src")

 and I am getting the exception below:

 Server IPC version 9 cannot communicate with client version 4

 Note: I am using Hadoop 2.2, Spark 1.2, and Hive 0.13.







Re: Server IPC version 9 cannot communicate with client version 4

2015-03-25 Thread sandeep vura
Where do I export MAVEN_OPTS: in spark-env.sh or in hadoop-env.sh?

I am running the command below in the spark/yarn directory, where a pom.xml file is available:

mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=VERSION -DskipTests clean package

Please correct me if I am wrong.




On Wed, Mar 25, 2015 at 12:55 PM, Saisai Shao sai.sai.s...@gmail.com
wrote:

 It looks like you have to build Spark against the matching Hadoop version; otherwise you will hit the exception you mentioned. You can follow this doc:
 http://spark.apache.org/docs/latest/building-spark.html

 2015-03-25 15:22 GMT+08:00 sandeep vura sandeepv...@gmail.com:

 Hi Sparkers,

 I am trying to load data in Spark with the following command:

 sqlContext.sql("LOAD DATA LOCAL INPATH '/home/spark12/sandeep/sandeep.txt' INTO TABLE src")

 and I am getting the exception below:

 Server IPC version 9 cannot communicate with client version 4

 Note: I am using Hadoop 2.2, Spark 1.2, and Hive 0.13.








Re: Server IPC version 9 cannot communicate with client version 4

2015-03-25 Thread Akhil Das
Just run:

mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.2 -DskipTests clean package

Thanks
Best Regards

On Wed, Mar 25, 2015 at 3:08 PM, sandeep vura sandeepv...@gmail.com wrote:

 Where do I export MAVEN_OPTS: in spark-env.sh or in hadoop-env.sh?

 I am running the command below in the spark/yarn directory, where a pom.xml file is available:

 mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=VERSION -DskipTests clean package

 Please correct me if I am wrong.




 On Wed, Mar 25, 2015 at 12:55 PM, Saisai Shao sai.sai.s...@gmail.com
 wrote:

 It looks like you have to build Spark against the matching Hadoop version; otherwise you will hit the exception you mentioned. You can follow this doc:
 http://spark.apache.org/docs/latest/building-spark.html

 2015-03-25 15:22 GMT+08:00 sandeep vura sandeepv...@gmail.com:

 Hi Sparkers,

 I am trying to load data in Spark with the following command:

 sqlContext.sql("LOAD DATA LOCAL INPATH '/home/spark12/sandeep/sandeep.txt' INTO TABLE src")

 and I am getting the exception below:

 Server IPC version 9 cannot communicate with client version 4

 Note: I am using Hadoop 2.2, Spark 1.2, and Hive 0.13.









Re: Server IPC version 9 cannot communicate with client version 4

2015-03-25 Thread Akhil Das
Oh, in that case you should specify 2.4. If you don't want to compile Spark, you can download the precompiled version from the Downloads page
(https://spark.apache.org/downloads.html), for example:
http://d3kbcqa49mib13.cloudfront.net/spark-1.2.0-bin-hadoop2.4.tgz
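
Roughly like this (a sketch; the exact file name and mirror may differ from what the downloads page currently offers):

  wget http://d3kbcqa49mib13.cloudfront.net/spark-1.2.0-bin-hadoop2.4.tgz
  tar -xzf spark-1.2.0-bin-hadoop2.4.tgz
  cd spark-1.2.0-bin-hadoop2.4
  ./bin/spark-shell   # this build already bundles the Hadoop 2.4 client libraries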

Thanks
Best Regards

On Wed, Mar 25, 2015 at 5:40 PM, sandeep vura sandeepv...@gmail.com wrote:

 I am using Hadoop 2.4; should I mention -Dhadoop.version=2.2?

 $ hadoop version
 Hadoop 2.4.1
 Subversion http://svn.apache.org/repos/asf/hadoop/common -r 1604318
 Compiled by jenkins on 2014-06-21T05:43Z
 Compiled with protoc 2.5.0
 From source with checksum bb7ac0a3c73dc131f4844b873c74b630
 This command was run using /home/hadoop24/hadoop-2.4.1/share/hadoop/common/hadoop-common-2.4.1.jar




 On Wed, Mar 25, 2015 at 5:38 PM, Akhil Das ak...@sigmoidanalytics.com
 wrote:

 -Dhadoop.version=2.2


 Thanks
 Best Regards

 On Wed, Mar 25, 2015 at 5:34 PM, sandeep vura sandeepv...@gmail.com
 wrote:

 The build failed with the following errors.

 I have executed the following command:

 mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=VERSION -DskipTests clean package

 [INFO] ------------------------------------------------------------------------
 [INFO] BUILD FAILURE
 [INFO] ------------------------------------------------------------------------
 [INFO] Total time: 2:11:59.461s
 [INFO] Finished at: Wed Mar 25 17:22:29 IST 2015
 [INFO] Final Memory: 30M/440M
 [INFO] ------------------------------------------------------------------------
 [ERROR] Failed to execute goal on project spark-core_2.10: Could not resolve dependencies for project org.apache.spark:spark-core_2.10:jar:1.2.1: Could not find artifact org.apache.hadoop:hadoop-client:jar:VERSION in central (https://repo1.maven.org/maven2) -> [Help 1]
 [ERROR]
 [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
 [ERROR] Re-run Maven using the -X switch to enable full debug logging.
 [ERROR]
 [ERROR] For more information about the errors and possible solutions, please read the following articles:
 [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
 [ERROR]
 [ERROR] After correcting the problems, you can resume the build with the command
 [ERROR]   mvn <goals> -rf :spark-core_2.10


 On Wed, Mar 25, 2015 at 3:38 PM, Akhil Das ak...@sigmoidanalytics.com
 wrote:

 Just run:

 mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.2 -DskipTests clean package

 Thanks
 Best Regards

 On Wed, Mar 25, 2015 at 3:08 PM, sandeep vura sandeepv...@gmail.com
 wrote:

 Where do I export MAVEN_OPTS: in spark-env.sh or in hadoop-env.sh?

 I am running the command below in the spark/yarn directory, where a pom.xml file is available:

 mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=VERSION -DskipTests clean package

 Please correct me if I am wrong.




 On Wed, Mar 25, 2015 at 12:55 PM, Saisai Shao sai.sai.s...@gmail.com
 wrote:

 It looks like you have to build Spark against the matching Hadoop version; otherwise you will hit the exception you mentioned. You can follow this doc:
 http://spark.apache.org/docs/latest/building-spark.html

 2015-03-25 15:22 GMT+08:00 sandeep vura sandeepv...@gmail.com:

 Hi Sparkers,

 I am trying to load data in Spark with the following command:

 sqlContext.sql("LOAD DATA LOCAL INPATH '/home/spark12/sandeep/sandeep.txt' INTO TABLE src")

 and I am getting the exception below:

 Server IPC version 9 cannot communicate with client version 4

 Note: I am using Hadoop 2.2, Spark 1.2, and Hive 0.13.













Re: Server IPC version 9 cannot communicate with client version 4

2015-03-25 Thread Sean Owen
Of course, VERSION is supposed to be replaced by a real Hadoop version!
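
For the Hadoop 2.4.1 cluster shown earlier in the thread, that would be roughly (a sketch; substitute whatever "hadoop version" reports on your cluster):

  mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.1 -DskipTests clean package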

On Wed, Mar 25, 2015 at 12:04 PM, sandeep vura sandeepv...@gmail.com wrote:
 The build failed with the following errors.

 I have executed the following command:

 mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=VERSION -DskipTests clean package

 [INFO] ------------------------------------------------------------------------
 [INFO] BUILD FAILURE
 [INFO] ------------------------------------------------------------------------
 [INFO] Total time: 2:11:59.461s
 [INFO] Finished at: Wed Mar 25 17:22:29 IST 2015
 [INFO] Final Memory: 30M/440M
 [INFO] ------------------------------------------------------------------------
 [ERROR] Failed to execute goal on project spark-core_2.10: Could not resolve dependencies for project org.apache.spark:spark-core_2.10:jar:1.2.1: Could not find artifact org.apache.hadoop:hadoop-client:jar:VERSION in central (https://repo1.maven.org/maven2) -> [Help 1]
 [ERROR]
 [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
 [ERROR] Re-run Maven using the -X switch to enable full debug logging.
 [ERROR]
 [ERROR] For more information about the errors and possible solutions, please read the following articles:
 [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
 [ERROR]
 [ERROR] After correcting the problems, you can resume the build with the command
 [ERROR]   mvn <goals> -rf :spark-core_2.10





Re: Server IPC version 9 cannot communicate with client version 4

2015-03-25 Thread sandeep vura
The build failed with the following errors.

I have executed the following command:

mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=VERSION -DskipTests clean package

[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2:11:59.461s
[INFO] Finished at: Wed Mar 25 17:22:29 IST 2015
[INFO] Final Memory: 30M/440M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project spark-core_2.10: Could not resolve dependencies for project org.apache.spark:spark-core_2.10:jar:1.2.1: Could not find artifact org.apache.hadoop:hadoop-client:jar:VERSION in central (https://repo1.maven.org/maven2) -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :spark-core_2.10


On Wed, Mar 25, 2015 at 3:38 PM, Akhil Das ak...@sigmoidanalytics.com
wrote:

 Just run:

 mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.2 -DskipTests clean package

 Thanks
 Best Regards

 On Wed, Mar 25, 2015 at 3:08 PM, sandeep vura sandeepv...@gmail.com
 wrote:

 Where do I export MAVEN_OPTS: in spark-env.sh or in hadoop-env.sh?

 I am running the command below in the spark/yarn directory, where a pom.xml file is available:

 mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=VERSION -DskipTests clean package

 Please correct me if I am wrong.




 On Wed, Mar 25, 2015 at 12:55 PM, Saisai Shao sai.sai.s...@gmail.com
 wrote:

 It looks like you have to build Spark against the matching Hadoop version; otherwise you will hit the exception you mentioned. You can follow this doc:
 http://spark.apache.org/docs/latest/building-spark.html

 2015-03-25 15:22 GMT+08:00 sandeep vura sandeepv...@gmail.com:

 Hi Sparkers,

 I am trying to load data in Spark with the following command:

 sqlContext.sql("LOAD DATA LOCAL INPATH '/home/spark12/sandeep/sandeep.txt' INTO TABLE src")

 and I am getting the exception below:

 Server IPC version 9 cannot communicate with client version 4

 Note: I am using Hadoop 2.2, Spark 1.2, and Hive 0.13.










Re: Server IPC version 9 cannot communicate with client version 4

2015-03-25 Thread Akhil Das
-Dhadoop.version=2.2


Thanks
Best Regards

On Wed, Mar 25, 2015 at 5:34 PM, sandeep vura sandeepv...@gmail.com wrote:

 The build failed with the following errors.

 I have executed the following command:

 mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=VERSION -DskipTests clean package

 [INFO] ------------------------------------------------------------------------
 [INFO] BUILD FAILURE
 [INFO] ------------------------------------------------------------------------
 [INFO] Total time: 2:11:59.461s
 [INFO] Finished at: Wed Mar 25 17:22:29 IST 2015
 [INFO] Final Memory: 30M/440M
 [INFO] ------------------------------------------------------------------------
 [ERROR] Failed to execute goal on project spark-core_2.10: Could not resolve dependencies for project org.apache.spark:spark-core_2.10:jar:1.2.1: Could not find artifact org.apache.hadoop:hadoop-client:jar:VERSION in central (https://repo1.maven.org/maven2) -> [Help 1]
 [ERROR]
 [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
 [ERROR] Re-run Maven using the -X switch to enable full debug logging.
 [ERROR]
 [ERROR] For more information about the errors and possible solutions, please read the following articles:
 [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
 [ERROR]
 [ERROR] After correcting the problems, you can resume the build with the command
 [ERROR]   mvn <goals> -rf :spark-core_2.10


 On Wed, Mar 25, 2015 at 3:38 PM, Akhil Das ak...@sigmoidanalytics.com
 wrote:

 Just run:

 mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.2 -DskipTests clean package

 Thanks
 Best Regards

 On Wed, Mar 25, 2015 at 3:08 PM, sandeep vura sandeepv...@gmail.com
 wrote:

 Where do I export MAVEN_OPTS: in spark-env.sh or in hadoop-env.sh?

 I am running the command below in the spark/yarn directory, where a pom.xml file is available:

 mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=VERSION -DskipTests clean package

 Please correct me if I am wrong.




 On Wed, Mar 25, 2015 at 12:55 PM, Saisai Shao sai.sai.s...@gmail.com
 wrote:

 It looks like you have to build Spark against the matching Hadoop version; otherwise you will hit the exception you mentioned. You can follow this doc:
 http://spark.apache.org/docs/latest/building-spark.html

 2015-03-25 15:22 GMT+08:00 sandeep vura sandeepv...@gmail.com:

 Hi Sparkers,

 I am trying to load data in Spark with the following command:

 sqlContext.sql("LOAD DATA LOCAL INPATH '/home/spark12/sandeep/sandeep.txt' INTO TABLE src")

 and I am getting the exception below:

 Server IPC version 9 cannot communicate with client version 4

 Note: I am using Hadoop 2.2, Spark 1.2, and Hive 0.13.











Re: Server IPC version 9 cannot communicate with client version 4

2015-03-25 Thread sandeep vura
I am using Hadoop 2.4; should I mention -Dhadoop.version=2.2?

$ hadoop version
Hadoop 2.4.1
Subversion http://svn.apache.org/repos/asf/hadoop/common -r 1604318
Compiled by jenkins on 2014-06-21T05:43Z
Compiled with protoc 2.5.0
From source with checksum bb7ac0a3c73dc131f4844b873c74b630
This command was run using /home/hadoop24/hadoop-2.4.1/share/hadoop/common/hadoop-common-2.4.1.jar




On Wed, Mar 25, 2015 at 5:38 PM, Akhil Das ak...@sigmoidanalytics.com
wrote:

 -Dhadoop.version=2.2


 Thanks
 Best Regards

 On Wed, Mar 25, 2015 at 5:34 PM, sandeep vura sandeepv...@gmail.com
 wrote:

 The build failed with the following errors.

 I have executed the following command:

 mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=VERSION -DskipTests clean package

 [INFO] ------------------------------------------------------------------------
 [INFO] BUILD FAILURE
 [INFO] ------------------------------------------------------------------------
 [INFO] Total time: 2:11:59.461s
 [INFO] Finished at: Wed Mar 25 17:22:29 IST 2015
 [INFO] Final Memory: 30M/440M
 [INFO] ------------------------------------------------------------------------
 [ERROR] Failed to execute goal on project spark-core_2.10: Could not resolve dependencies for project org.apache.spark:spark-core_2.10:jar:1.2.1: Could not find artifact org.apache.hadoop:hadoop-client:jar:VERSION in central (https://repo1.maven.org/maven2) -> [Help 1]
 [ERROR]
 [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
 [ERROR] Re-run Maven using the -X switch to enable full debug logging.
 [ERROR]
 [ERROR] For more information about the errors and possible solutions, please read the following articles:
 [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
 [ERROR]
 [ERROR] After correcting the problems, you can resume the build with the command
 [ERROR]   mvn <goals> -rf :spark-core_2.10


 On Wed, Mar 25, 2015 at 3:38 PM, Akhil Das ak...@sigmoidanalytics.com
 wrote:

 Just run:

 mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.2 -DskipTests clean package

 Thanks
 Best Regards

 On Wed, Mar 25, 2015 at 3:08 PM, sandeep vura sandeepv...@gmail.com
 wrote:

 Where do I export MAVEN_OPTS: in spark-env.sh or in hadoop-env.sh?

 I am running the command below in the spark/yarn directory, where a pom.xml file is available:

 mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=VERSION -DskipTests clean package

 Please correct me if I am wrong.




 On Wed, Mar 25, 2015 at 12:55 PM, Saisai Shao sai.sai.s...@gmail.com
 wrote:

 It looks like you have to build Spark against the matching Hadoop version; otherwise you will hit the exception you mentioned. You can follow this doc:
 http://spark.apache.org/docs/latest/building-spark.html

 2015-03-25 15:22 GMT+08:00 sandeep vura sandeepv...@gmail.com:

 Hi Sparkers,

 I am trying to load data in Spark with the following command:

 sqlContext.sql("LOAD DATA LOCAL INPATH '/home/spark12/sandeep/sandeep.txt' INTO TABLE src")

 and I am getting the exception below:

 Server IPC version 9 cannot communicate with client version 4

 Note: I am using Hadoop 2.2, Spark 1.2, and Hive 0.13.