Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

2014-04-16 Thread Arpit Tak
I am stuck on the same issue too, but with Shark (0.9, with spark-0.9) on
hadoop-2.2.0.

On all other Hadoop versions, it works perfectly.

Regards,
Arpit Tak


On Wed, Apr 16, 2014 at 11:18 PM, Aureliano Buendia buendia...@gmail.com wrote:

 Is this resolved in spark 0.9.1?







Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

2014-04-15 Thread anant
I've received the same error with Spark built using Maven. It turns out that
mesos-0.13.0 depends on protobuf-2.4.1, which causes the clash at
runtime. The protobuf included by Akka is shaded and doesn't cause any problems.

The solution is to update the mesos dependency to 0.18.0 in spark's pom.xml.
Rebuilding the JAR with this configuration solves the issue.
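
For anyone applying this, the change would look roughly like the following in
Spark's pom.xml (a sketch only; the exact property name and dependency layout
in the 0.9 tree may differ slightly, so check your checkout):

```xml
<!-- Bump the Mesos version property (was 0.13.0): -->
<properties>
  <mesos.version>0.18.0</mesos.version>
</properties>

<!-- The mesos dependency then picks up the new version: -->
<dependency>
  <groupId>org.apache.mesos</groupId>
  <artifactId>mesos</artifactId>
  <version>${mesos.version}</version>
</dependency>
```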

-Anant



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Error-reading-HDFS-file-using-spark-0-9-0-hadoop-2-2-0-incompatible-protobuf-2-5-and-2-4-1-tp2158p4286.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.


Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

2014-04-15 Thread giive chen
Hi Prasad

Sorry for missing your reply.
Here it is: https://gist.github.com/thegiive/10791823

Wisely Chen





Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

2014-04-04 Thread Prasad
Hi Wisely,
Could you please post your pom.xml here.

Thanks



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Error-reading-HDFS-file-using-spark-0-9-0-hadoop-2-2-0-incompatible-protobuf-2-5-and-2-4-1-tp2158p3770.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.


Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

2014-03-25 Thread Patrick Wendell
Starting with Spark 0.9 the protobuf dependency we use is shaded and
cannot interfere with other protobuf libraries, including those in
Hadoop. Not sure what's going on in this case. Would someone who is
having this problem post exactly how they are building Spark?

- Patrick
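
For anyone debugging this, one quick way to see which protobuf classes
actually end up in your build is to list them from the built jar (the jar
path and name here are examples and vary by version and build profile, so
adjust for your tree):

```shell
# List protobuf classes bundled in the Spark assembly. A shaded copy lives
# under a rewritten package name rather than com.google.protobuf, so it
# cannot clash with Hadoop's own protobuf.
jar tf assembly/target/spark-assembly-*.jar | grep -i protobuf | sort | head
```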





Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

2014-03-21 Thread Aureliano Buendia
On Tue, Mar 18, 2014 at 12:56 PM, Ognen Duzlevski 
og...@plainvanillagames.com wrote:


 On 3/18/14, 4:49 AM, dmpou...@gmail.com wrote:

 On Sunday, 2 March 2014 19:19:49 UTC+2, Aureliano Buendia  wrote:

 Is there a reason for spark using the older akka?




 On Sun, Mar 2, 2014 at 1:53 PM, 1esha alexey.r...@gmail.com wrote:

 The problem is in akka remote. It contains files compiled with 2.4.*.
 When

 you run it with 2.5.* in classpath it fails like above.



 Looks like moving to akka 2.3 will solve this issue. Check this issue -

 https://www.assembla.com/spaces/akka/tickets/3154-use-protobuf-version-2-5-0#/activity/ticket:


 Is the solution to exclude the 2.4.* dependency on protobuf, or will
 this produce more complications?

 I am not sure I remember what the context was around this but I run 0.9.0
 with hadoop 2.2.0 just fine.


The problem is that spark depends on an older version of akka, which
depends on an older version of protobuf (2.4).

This means people cannot use protobuf 2.5 with spark.


 Ognen



Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

2014-03-18 Thread dmpour23
On Sunday, 2 March 2014 19:19:49 UTC+2, Aureliano Buendia  wrote:
 Is there a reason for spark using the older akka?

 On Sun, Mar 2, 2014 at 1:53 PM, 1esha alexey.r...@gmail.com wrote:

 The problem is in akka remote. It contains files compiled with 2.4.*. When
 you run it with 2.5.* in classpath it fails like above.

 Looks like moving to akka 2.3 will solve this issue. Check this issue -
 https://www.assembla.com/spaces/akka/tickets/3154-use-protobuf-version-2-5-0#/activity/ticket:

 --
 View this message in context:
 http://apache-spark-user-list.1001560.n3.nabble.com/Error-reading-HDFS-file-using-spark-0-9-0-hadoop-2-2-0-incompatible-protobuf-2-5-and-2-4-1-tp2158p2217.html

 Sent from the Apache Spark User List mailing list archive at Nabble.com.

Is the solution to exclude the 2.4.* dependency on protobuf, or will this
produce more complications?

Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

2014-03-18 Thread Ognen Duzlevski


On 3/18/14, 4:49 AM, dmpou...@gmail.com wrote:

On Sunday, 2 March 2014 19:19:49 UTC+2, Aureliano Buendia  wrote:

Is there a reason for spark using the older akka?




On Sun, Mar 2, 2014 at 1:53 PM, 1esha alexey.r...@gmail.com wrote:

The problem is in akka remote. It contains files compiled with 2.4.*. When

you run it with 2.5.* in classpath it fails like above.



Looks like moving to akka 2.3 will solve this issue. Check this issue -

https://www.assembla.com/spaces/akka/tickets/3154-use-protobuf-version-2-5-0#/activity/ticket:


Is the solution to exclude the 2.4.* dependency on protobuf, or will this
produce more complications?
I am not sure I remember what the context was around this but I run 
0.9.0 with hadoop 2.2.0 just fine.

Ognen


Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

2014-02-28 Thread Egor Pahomov
Spark 0.9 uses protobuf 2.5.0.
Hadoop 2.2 uses protobuf 2.5.0.
protobuf 2.5.0 can read messages serialized with protobuf 2.4.1.
So there is no reason why you can't read messages from hadoop 2.2
with protobuf 2.5.0; probably you somehow have 2.4.1 in your classpath. Of
course it's very bad that you have both 2.4.1 and 2.5.0 in your classpath.
Use excludes or whatever to get rid of 2.4.1.
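
An exclusion along these lines is one way to do that; mesos is used here
purely as an illustration of a dependency that drags in protobuf 2.4.1
transitively (a sketch; verify the actual culprit in your build with
`mvn dependency:tree -Dincludes=com.google.protobuf`):

```xml
<dependency>
  <groupId>org.apache.mesos</groupId>
  <artifactId>mesos</artifactId>
  <version>0.13.0</version>
  <exclusions>
    <!-- Keep the transitive protobuf 2.4.1 off the classpath -->
    <exclusion>
      <groupId>com.google.protobuf</groupId>
      <artifactId>protobuf-java</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```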

Personally, I spent 3 days moving my project from protobuf 2.4.1 to 2.5.0.
But it has to be done for your whole project.
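
Tracking down where the stray 2.4.1 comes from is mostly a matter of listing
the jars on the classpath and spotting artifacts that appear under two
versions. A small script along these lines can flag them (illustrative only;
the filename-parsing heuristic is an assumption, not a Maven feature):

```python
import re
from collections import defaultdict

def find_version_conflicts(jar_paths):
    """Group jar filenames by artifact name and report artifacts that
    appear with more than one version on the classpath."""
    versions = defaultdict(set)
    for path in jar_paths:
        name = path.rsplit("/", 1)[-1]
        # Heuristic: "<artifact>-<version>.jar", where version starts with a digit.
        m = re.match(r"(?P<artifact>[A-Za-z][\w.-]*?)-(?P<version>\d[\w.-]*)\.jar$", name)
        if m:
            versions[m.group("artifact")].add(m.group("version"))
    return {a: sorted(v) for a, v in versions.items() if len(v) > 1}

# Two of the paths from a `find ~/ -name 'protobuf-java*.jar'` run:
jars = [
    "/home/hduser/.m2/repository/com/google/protobuf/protobuf-java/2.4.1/protobuf-java-2.4.1.jar",
    "/home/hduser/spark-0.9.0-incubating/lib_managed/bundles/protobuf-java-2.5.0.jar",
]
print(find_version_conflicts(jars))  # {'protobuf-java': ['2.4.1', '2.5.0']}
```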

2014-02-28 21:49 GMT+04:00 Aureliano Buendia buendia...@gmail.com:

 Doesn't hadoop 2.2 also depend on protobuf 2.4?


 On Fri, Feb 28, 2014 at 5:45 PM, Ognen Duzlevski 
 og...@plainvanillagames.com wrote:

 A stupid question, by the way: did you compile Spark with Hadoop 2.2.0
 support?

 Ognen

 On 2/28/14, 10:51 AM, Prasad wrote:

 Hi
 I am getting the protobuf error while reading an HDFS file using spark
 0.9.0 -- I am running on hadoop 2.2.0.

 When I look through, I find that I have both 2.4.1 and 2.5, and some blogs
 suggest that there are incompatibility issues between 2.4.1 and 2.5.

 hduser@prasadHdp1:~/spark-0.9.0-incubating$ find ~/ -name protobuf-java*.jar
 /home/hduser/.m2/repository/com/google/protobuf/protobuf-java/2.4.1/protobuf-java-2.4.1.jar
 /home/hduser/.m2/repository/org/spark-project/protobuf/protobuf-java/2.4.1-shaded/protobuf-java-2.4.1-shaded.jar
 /home/hduser/spark-0.9.0-incubating/lib_managed/bundles/protobuf-java-2.5.0.jar
 /home/hduser/spark-0.9.0-incubating/lib_managed/jars/protobuf-java-2.4.1-shaded.jar
 /home/hduser/.ivy2/cache/com.google.protobuf/protobuf-java/bundles/protobuf-java-2.5.0.jar
 /home/hduser/.ivy2/cache/org.spark-project.protobuf/protobuf-java/jars/protobuf-java-2.4.1-shaded.jar


 Can someone please let me know if you faced these issues and how you
 fixed them.

 Thanks
 Prasad.
 Caused by: java.lang.VerifyError: class
 org.apache.hadoop.security.proto.SecurityProtos$GetDelegationTokenRequestProto
 overrides final method getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;
         at java.lang.ClassLoader.defineClass1(Native Method)
         at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
         at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
         at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
         at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
         at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
         at java.security.AccessController.doPrivileged(Native Method)
         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
         at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
         at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
         at java.lang.Class.getDeclaredMethods0(Native Method)
         at java.lang.Class.privateGetDeclaredMethods(Class.java:2531)
         at java.lang.Class.privateGetPublicMethods(Class.java:2651)
         at java.lang.Class.privateGetPublicMethods(Class.java:2661)
         at java.lang.Class.getMethods(Class.java:1467)
         at sun.misc.ProxyGenerator.generateClassFile(ProxyGenerator.java:426)
         at sun.misc.ProxyGenerator.generateProxyClass(ProxyGenerator.java:323)
         at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:636)
         at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:722)
         at org.apache.hadoop.ipc.ProtobufRpcEngine.getProxy(ProtobufRpcEngine.java:92)
         at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:537)

 Caused by: java.lang.reflect.InvocationTargetException
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
         at java.lang.reflect.Method.invoke(Method.java:606)










 --
 View this message in context:
 http://apache-spark-user-list.1001560.n3.nabble.com/Error-reading-HDFS-file-using-spark-0-9-0-hadoop-2-2-0-incompatible-protobuf-2-5-and-2-4-1-tp2158.html
 Sent from the Apache Spark User List mailing list archive at Nabble.com.


 --
 Some people, when confronted with a problem, think "I know, I'll use
 regular expressions." Now they have two problems.
 -- Jamie Zawinski





--

Sincerely yours,
Egor Pakhomov
Scala Developer, Yandex


Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

2014-02-28 Thread Egor Pahomov
In that same pom:

<profile>
  <id>yarn</id>
  <properties>
    <hadoop.major.version>2</hadoop.major.version>
    <hadoop.version>2.2.0</hadoop.version>
    <protobuf.version>2.5.0</protobuf.version>
  </properties>
  <modules>
    <module>yarn</module>
  </modules>
</profile>
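
A profile like this is activated from the command line at build time; for
Spark 0.9 the usual invocation was along these lines (flags recalled from the
0.9 build instructions, so double-check against the docs for your exact
version):

```shell
mvn -Pyarn -Dhadoop.version=2.2.0 -Dyarn.version=2.2.0 -DskipTests clean package
```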



2014-02-28 23:46 GMT+04:00 Aureliano Buendia buendia...@gmail.com:




 On Fri, Feb 28, 2014 at 7:17 PM, Egor Pahomov pahomov.e...@gmail.com wrote:

 Spark 0.9 uses protobuf 2.5.0


 Spark 0.9 uses 2.4.1:


 https://github.com/apache/incubator-spark/blob/4d880304867b55a4f2138617b30600b7fa013b14/pom.xml#L118

 Is there another pom for when hadoop 2.2 is used? I don't see another
 branch for hadoop 2.2.


 Hadoop 2.2 uses protobuf 2.5.0
 protobuf 2.5.0 can read messages serialized with protobuf 2.4.1


 Protobuf java code generated by protoc 2.4 does not compile with protobuf
 library 2.5. This is what the OP's error message is about.


 So there is no reason why you can't read messages from hadoop
 2.2 with protobuf 2.5.0; probably you somehow have 2.4.1 in your
 classpath. Of course it's very bad that you have both 2.4.1 and 2.5.0 in
 your classpath. Use excludes or whatever to get rid of 2.4.1.

 Personally, I spent 3 days moving my project from protobuf 2.4.1 to
 2.5.0. But it has to be done for your whole project.
