-Dhadoop.version=2.2
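To spell that out: VERSION in your command is a literal placeholder, not a real release, which is why Maven cannot find hadoop-client:jar:VERSION. A minimal sketch of the substitution (the 2.2.0 artifact version is an assumption here; use whichever hadoop-client release actually exists in Maven Central and matches your cluster):

```shell
# Sketch only: replace the VERSION placeholder with a concrete Hadoop
# release before building. "2.2.0" is an assumed example; the value must
# match an org.apache.hadoop:hadoop-client artifact published in Maven
# Central, or the build fails with the resolution error quoted below.
HADOOP_VERSION=2.2.0
BUILD_CMD="mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=${HADOOP_VERSION} -DskipTests clean package"
echo "$BUILD_CMD"   # inspect the final command, then run it from the Spark source root
```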

Thanks
Best Regards

On Wed, Mar 25, 2015 at 5:34 PM, sandeep vura <sandeepv...@gmail.com> wrote:

> Build failed with the following errors.
>
> I have executed the following command:
>
> mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=VERSION -DskipTests clean
> package
>
>
> [INFO]
> ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO]
> ------------------------------------------------------------------------
> [INFO] Total time: 2:11:59.461s
> [INFO] Finished at: Wed Mar 25 17:22:29 IST 2015
> [INFO] Final Memory: 30M/440M
> [INFO]
> ------------------------------------------------------------------------
> [ERROR] Failed to execute goal on project spark-core_2.10: Could not
> resolve dependencies for project
> org.apache.spark:spark-core_2.10:jar:1.2.1: Could not find
> artifact org.apache.hadoop:hadoop-client:jar:VERSION in central (
> https://repo1.maven.org/maven2) -> [Help 1]
> [ERROR]
> [ERROR] To see the full stack trace of the errors, re-run Maven with the
> -e switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR]
> [ERROR] For more information about the errors and possible solutions,
> please read the following articles:
> [ERROR] [Help 1]
> http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
> [ERROR]
> [ERROR] After correcting the problems, you can resume the build with the
> command
> [ERROR]   mvn <goals> -rf :spark-core_2.10
>
>
> On Wed, Mar 25, 2015 at 3:38 PM, Akhil Das <ak...@sigmoidanalytics.com>
> wrote:
>
>> Just run:
>>
>> mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.2 -DskipTests clean package
>>
>>
>>
>>
>> Thanks
>> Best Regards
>>
>> On Wed, Mar 25, 2015 at 3:08 PM, sandeep vura <sandeepv...@gmail.com>
>> wrote:
>>
>>> Where do I export MAVEN_OPTS: in spark-env.sh or hadoop-env.sh?
>>>
>>> I am running the command below in the spark/yarn directory, where the
>>> pom.xml file is available:
>>>
>>> mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=VERSION -DskipTests clean package
>>>
>>> Please correct me if I am wrong.
>>>
>>>
>>>
>>>
>>> On Wed, Mar 25, 2015 at 12:55 PM, Saisai Shao <sai.sai.s...@gmail.com>
>>> wrote:
>>>
>>>> Looks like you have to build Spark against the matching Hadoop version,
>>>> otherwise you will hit the exception you mentioned. You could follow this
>>>> doc: http://spark.apache.org/docs/latest/building-spark.html
>>>>
>>>> 2015-03-25 15:22 GMT+08:00 sandeep vura <sandeepv...@gmail.com>:
>>>>
>>>>> Hi Sparkers,
>>>>>
>>>>> I am trying to load data in spark with the following command
>>>>>
>>>>> sqlContext.sql("LOAD DATA LOCAL INPATH
>>>>> '/home/spark12/sandeep/sandeep.txt   ' INTO TABLE src");
>>>>>
>>>>> Getting the exception below:
>>>>>
>>>>>
>>>>> Server IPC version 9 cannot communicate with client version 4
>>>>>
>>>>> Note: I am using Hadoop 2.2, Spark 1.2, and Hive 0.13.
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>
>>>
>>
>
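As an aside on the root cause quoted above: "Server IPC version 9 cannot communicate with client version 4" is a Hadoop RPC version mismatch. IPC version 9 is spoken by Hadoop 2.x daemons, while version 4 is the Hadoop 0.20.x/1.x client libraries that a default (non-matching) Spark build bundles. A small hypothetical helper to decode the two numbers (the function name is made up; the mapping reflects Hadoop release history):

```shell
# Hypothetical helper: map a Hadoop RPC (IPC) protocol version number,
# as seen in "Server IPC version X cannot communicate with client
# version Y" errors, to the Hadoop release line that speaks it.
ipc_release() {
  case "$1" in
    4) echo "Hadoop 0.20.x / 1.x" ;;
    9) echo "Hadoop 2.x" ;;
    *) echo "unknown" ;;
  esac
}

# In this thread: the server spoke 9 (Hadoop 2.x) and the client spoke 4
# (Hadoop 1.x), so Spark must be rebuilt against the server's Hadoop version.
ipc_release 9
ipc_release 4
```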
