Any pointers on this issue?


Thanks & Regards
Brahma Reddy Battula

________________________________________
From: Brahma Reddy Battula [brahmareddy.batt...@huawei.com]
Sent: Monday, April 20, 2015 9:30 AM
To: Sean Owen; Ted Yu
Cc: user@spark.apache.org
Subject: RE: compilation error

Thanks a lot for your replies.

@Ted: V100R001C00 is our internal Hadoop version, which is based on Hadoop
2.4.1.

@Sean Owen: Yes, you are correct. I just wanted to know what leads to this
problem.
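Since V100R001C00 is a vendor-internal version string, Maven has no matching Apache artifacts for it unless they are published to an internal repository. A sketch of the equivalent build against the Apache 2.4.1 release the internal version is based on (assuming the same profiles from the command quoted below apply):

```shell
# Build Spark 1.1.0 against the Apache Hadoop 2.4.1 release rather than a
# vendor-internal version string; -DskipTests skips the test suite.
mvn -Pbigtop-dist -Phive -Pyarn -Phadoop-2.4 \
    -Dhadoop.version=2.4.1 -DskipTests package
```

If the internal artifacts live in a company repository, that repository would need to be declared in the build's `settings.xml` or `pom.xml` for the original version string to resolve.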


Thanks & Regards
Brahma Reddy Battula
________________________________________
From: Sean Owen [so...@cloudera.com]
Sent: Monday, April 20, 2015 9:14 AM
To: Ted Yu
Cc: Brahma Reddy Battula; user@spark.apache.org
Subject: Re: compilation error

Brahma, since you can see the continuous integration builds are
passing, it's got to be something specific to your environment, right?
This is not even an error from Spark, but from a Maven plugin.
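For context on the plugin error quoted below: the log shows maven-shade-plugin 2.2 failing during its ASM bytecode pass on a single ICU4J data class. One workaround sometimes used in such cases is to filter the offending classes out of the shaded jar; this is only a sketch, and excluding `com.ibm.icu` data classes assumes nothing on the runtime classpath needs them (others report that upgrading the shade plugin or switching JDKs resolves similar ASM failures):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <filters>
      <filter>
        <!-- Hypothetical workaround: drop the ICU data classes that fail
             ASM processing. Safe only if they are unused at runtime. -->
        <artifact>com.ibm.icu:icu4j</artifact>
        <excludes>
          <exclude>com/ibm/icu/impl/data/**</exclude>
        </excludes>
      </filter>
    </filters>
  </configuration>
</plugin>
```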

On Mon, Apr 20, 2015 at 4:42 AM, Ted Yu <yuzhih...@gmail.com> wrote:
> bq. -Dhadoop.version=V100R001C00
>
> This is the first time I've seen the above Hadoop version; it doesn't look
> like an Apache release.
>
> I checked my local Maven repo but didn't find the impl classes under
> ~/.m2/repository/com/ibm/icu
>
> FYI
>
> On Sun, Apr 19, 2015 at 8:04 PM, Brahma Reddy Battula
> <brahmareddy.batt...@huawei.com> wrote:
>>
>> Hey Ted
>>
>> Thanks a lot for your reply. Kindly check the following details:
>>
>> spark version: 1.1.0
>> jdk: jdk1.7.0_60
>> command: mvn -Pbigtop-dist -Phive -Pyarn -Phadoop-2.4
>> -Dhadoop.version=V100R001C00 -DskipTests package
>>
>> Thanks & Regards
>>
>> Brahma Reddy Battula
>>
>> ________________________________
>> From: Ted Yu [yuzhih...@gmail.com]
>> Sent: Monday, April 20, 2015 8:07 AM
>> To: Brahma Reddy Battula
>> Cc: user@spark.apache.org
>> Subject: Re: compilation error
>>
>> What JDK release are you using ?
>>
>> Can you give the complete command you used ?
>>
>> Which Spark branch are you working with ?
>>
>> Cheers
>>
>> On Sun, Apr 19, 2015 at 7:25 PM, Brahma Reddy Battula
>> <brahmareddy.batt...@huawei.com> wrote:
>>>
>>> Hi All
>>>
>>> I am getting the following error when compiling Spark. What did I miss?
>>> I googled but did not find an exact solution for this.
>>>
>>>
>>> [ERROR] Failed to execute goal
>>> org.apache.maven.plugins:maven-shade-plugin:2.2:shade (default) on project
>>> spark-assembly_2.10: Error creating shaded jar: Error in ASM processing
>>> class com/ibm/icu/impl/data/LocaleElements_zh__PINYIN.class: 48188 -> [Help
>>> 1]
>>> org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute
>>> goal org.apache.maven.plugins:maven-shade-plugin:2.2:shade (default) on
>>> project spark-assembly_2.10: Error creating shaded jar: Error in ASM
>>> processing class com/ibm/icu/impl/data/LocaleElements_zh__PINYIN.class
>>>         at
>>> org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:217)
>>>         at
>>> org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
>>>         at
>>> org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
>>>         at
>>> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
>>>         at
>>> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
>>>         at
>>> org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
>>>         at
>>> org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
>>>         at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:320)
>>>
>>>
>>>
>>> Thanks & Regards
>>>
>>> Brahma Reddy Battula
>>>
>>
>

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org

