Re: Using ProtoBuf 2.5 for messages with Spark Streaming

2014-04-10 Thread Patrick Wendell
Okay, so I think the issue here is just a conflict between your application
code and the Hadoop code.

Hadoop 2.0.0 depends on protobuf 2.4.0a:
https://svn.apache.org/repos/asf/hadoop/common/tags/release-2.0.0-alpha/hadoop-project/pom.xml

Your code depends on protobuf 2.5.x.

The protobuf library is unfortunately not binary compatible between these two
versions. This means that your application will have to shade protobuf 2.5.x,
or you will have to upgrade to a version of Hadoop that is compatible.
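For the shading route, a minimal sketch of a maven-shade-plugin configuration is below; the plugin version and the relocated package name are illustrative assumptions, not something prescribed in this thread:

```xml
<!-- Sketch only: relocate the application's protobuf 2.5.x classes so they
     cannot clash with Hadoop's protobuf 2.4.x at runtime. The plugin version
     and the shadedPattern package name are illustrative assumptions. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.2</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.google.protobuf</pattern>
            <shadedPattern>myapp.shaded.com.google.protobuf</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With something like this in the application's POM, mvn package produces a jar whose own classes reference the relocated protobuf, leaving Hadoop's 2.4.x copy undisturbed.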


On Wed, Apr 9, 2014 at 1:03 PM, Kanwaldeep  wrote:

> Any update on this? We are still facing this issue.
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Using-ProtoBuf-2-5-for-messages-with-Spark-Streaming-tp3396p4015.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>


Re: Using ProtoBuf 2.5 for messages with Spark Streaming

2014-04-09 Thread Kanwaldeep
Any update on this? We are still facing this issue.



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Using-ProtoBuf-2-5-for-messages-with-Spark-Streaming-tp3396p4015.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.


Re: Using ProtoBuf 2.5 for messages with Spark Streaming

2014-04-03 Thread Vipul Pandey
Any word on this one?
On Apr 2, 2014, at 12:26 AM, Vipul Pandey  wrote:

> I downloaded 0.9.0 fresh and ran the mvn command - the assembly jar thus
> generated also has both the shaded and the unshaded versions of the protobuf classes.
> 
> Vipuls-MacBook-Pro-3:spark-0.9.0-incubating vipul$ jar -ftv ./assembly/target/scala-2.10/spark-assembly_2.10-0.9.0-incubating-hadoop2.0.0-cdh4.2.1.jar | grep proto | grep /Message
>   1190 Wed Apr 02 00:19:56 PDT 2014 com/google/protobuf_spark/MessageOrBuilder.class
>   2913 Wed Apr 02 00:19:56 PDT 2014 com/google/protobuf_spark/Message$Builder.class
>    704 Wed Apr 02 00:19:56 PDT 2014 com/google/protobuf_spark/MessageLite.class
>   1904 Wed Apr 02 00:19:56 PDT 2014 com/google/protobuf_spark/MessageLite$Builder.class
>    257 Wed Apr 02 00:19:56 PDT 2014 com/google/protobuf_spark/MessageLiteOrBuilder.class
>    508 Wed Apr 02 00:19:56 PDT 2014 com/google/protobuf_spark/Message.class
>   2661 Wed Apr 02 00:20:00 PDT 2014 com/google/protobuf/Message$Builder.class
>    478 Wed Apr 02 00:20:00 PDT 2014 com/google/protobuf/Message.class
>   1748 Wed Apr 02 00:20:00 PDT 2014 com/google/protobuf/MessageLite$Builder.class
>    668 Wed Apr 02 00:20:00 PDT 2014 com/google/protobuf/MessageLite.class
>    245 Wed Apr 02 00:20:00 PDT 2014 com/google/protobuf/MessageLiteOrBuilder.class
>   1112 Wed Apr 02 00:20:00 PDT 2014 com/google/protobuf/MessageOrBuilder.class
> 
> 
> 
> 
> 
> On Apr 1, 2014, at 11:44 PM, Patrick Wendell  wrote:
> 
>> It's this: mvn -Dhadoop.version=2.0.0-cdh4.2.1 -DskipTests clean package
>> 
>> 
>> On Tue, Apr 1, 2014 at 5:15 PM, Vipul Pandey  wrote:
>> how do you recommend building that - it says
>> [ERROR] Failed to execute goal
>> org.apache.maven.plugins:maven-assembly-plugin:2.2-beta-5:assembly
>> (default-cli) on project spark-0.9.0-incubating: Error reading assemblies:
>> No assembly descriptors found. -> [Help 1]
>> upon running
>> mvn -Dhadoop.version=2.0.0-cdh4.2.1 -DskipTests clean assembly:assembly
>> 
>> 
>> On Apr 1, 2014, at 4:13 PM, Patrick Wendell  wrote:
>> 
>>> Do you get the same problem if you build with maven?
>>> 
>>> 
>>> On Tue, Apr 1, 2014 at 12:23 PM, Vipul Pandey  wrote:
>>> SPARK_HADOOP_VERSION=2.0.0-cdh4.2.1 sbt/sbt assembly 
>>> 
>>> That's all I do. 
>>> 
>>> On Apr 1, 2014, at 11:41 AM, Patrick Wendell  wrote:
>>> 
 Vipul - could you show exactly what flags/commands you are using when you
 build Spark to produce this assembly?
 
 
 On Tue, Apr 1, 2014 at 12:53 AM, Vipul Pandey  wrote:
> Spark now shades its own protobuf dependency, so protobuf 2.4.1 shouldn't
> be getting pulled in unless you are directly using akka yourself. Are you?
 
 No, I'm not. Although I see that protobuf libraries are directly pulled
 into the 0.9.0 assembly jar - I do see the shaded version as well.
 e.g. below for Message.class
 
 -bash-4.1$ jar -ftv 
 ./assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-hadoop2.0.0-cdh4.2.1.jar
  | grep protobuf | grep /Message.class
478 Thu Jun 30 15:26:12 PDT 2011 com/google/protobuf/Message.class
508 Sat Dec 14 14:20:38 PST 2013 com/google/protobuf_spark/Message.class
 
 
> Does your project have other dependencies that might be indirectly 
> pulling in protobuf 2.4.1? It would be helpful if you could list all of 
> your dependencies including the exact Spark version and other libraries.
 
 I did have another one, which I moved to the end of the classpath - I even ran
 partial code without that dependency, but it still failed whenever I used
 the jar with the ScalaBuff dependency.
 The Spark version is 0.9.0.
 
 
 ~Vipul
 
 On Mar 31, 2014, at 4:51 PM, Patrick Wendell  wrote:
 
> Spark now shades its own protobuf dependency, so protobuf 2.4.1 shouldn't
> be getting pulled in unless you are directly using akka yourself. Are you?
> 
> Does your project have other dependencies that might be indirectly 
> pulling in protobuf 2.4.1? It would be helpful if you could list all of 
> your dependencies including the exact Spark version and other libraries.
> 
> - Patrick
> 
> 
> On Sun, Mar 30, 2014 at 10:03 PM, Vipul Pandey  wrote:
> I'm using ScalaBuff (which depends on protobuf 2.5) and facing the same
> issue. Any word on this one?
> On Mar 27, 2014, at 6:41 PM, Kanwaldeep  wrote:
> 
> > We are using Protocol Buffer 2.5 to send messages to Spark Streaming 0.9
> > with a Kafka stream setup. I have Protocol Buffer 2.5 as part of the uber
> > jar deployed on each of the Spark worker nodes.
> > The message is compiled using 2.5, but at runtime it is being deserialized
> > by 2.4.1, as I'm getting the following exception:
> >
> > java.lang.VerifyError (java.lang.VerifyError: class
> > com.snc.sinet.messages.XServerMessage$XServer overrides final method
> > getUnknownF

Re: Using ProtoBuf 2.5 for messages with Spark Streaming

2014-04-02 Thread Vipul Pandey
I downloaded 0.9.0 fresh and ran the mvn command - the assembly jar thus
generated also has both the shaded and the unshaded versions of the protobuf classes.

Vipuls-MacBook-Pro-3:spark-0.9.0-incubating vipul$ jar -ftv ./assembly/target/scala-2.10/spark-assembly_2.10-0.9.0-incubating-hadoop2.0.0-cdh4.2.1.jar | grep proto | grep /Message
  1190 Wed Apr 02 00:19:56 PDT 2014 com/google/protobuf_spark/MessageOrBuilder.class
  2913 Wed Apr 02 00:19:56 PDT 2014 com/google/protobuf_spark/Message$Builder.class
   704 Wed Apr 02 00:19:56 PDT 2014 com/google/protobuf_spark/MessageLite.class
  1904 Wed Apr 02 00:19:56 PDT 2014 com/google/protobuf_spark/MessageLite$Builder.class
   257 Wed Apr 02 00:19:56 PDT 2014 com/google/protobuf_spark/MessageLiteOrBuilder.class
   508 Wed Apr 02 00:19:56 PDT 2014 com/google/protobuf_spark/Message.class
  2661 Wed Apr 02 00:20:00 PDT 2014 com/google/protobuf/Message$Builder.class
   478 Wed Apr 02 00:20:00 PDT 2014 com/google/protobuf/Message.class
  1748 Wed Apr 02 00:20:00 PDT 2014 com/google/protobuf/MessageLite$Builder.class
   668 Wed Apr 02 00:20:00 PDT 2014 com/google/protobuf/MessageLite.class
   245 Wed Apr 02 00:20:00 PDT 2014 com/google/protobuf/MessageLiteOrBuilder.class
  1112 Wed Apr 02 00:20:00 PDT 2014 com/google/protobuf/MessageOrBuilder.class





On Apr 1, 2014, at 11:44 PM, Patrick Wendell  wrote:

> It's this: mvn -Dhadoop.version=2.0.0-cdh4.2.1 -DskipTests clean package
> 
> 
> On Tue, Apr 1, 2014 at 5:15 PM, Vipul Pandey  wrote:
> how do you recommend building that - it says
> [ERROR] Failed to execute goal
> org.apache.maven.plugins:maven-assembly-plugin:2.2-beta-5:assembly
> (default-cli) on project spark-0.9.0-incubating: Error reading assemblies: No
> assembly descriptors found. -> [Help 1]
> upon running
> mvn -Dhadoop.version=2.0.0-cdh4.2.1 -DskipTests clean assembly:assembly
> 
> 
> On Apr 1, 2014, at 4:13 PM, Patrick Wendell  wrote:
> 
>> Do you get the same problem if you build with maven?
>> 
>> 
>> On Tue, Apr 1, 2014 at 12:23 PM, Vipul Pandey  wrote:
>> SPARK_HADOOP_VERSION=2.0.0-cdh4.2.1 sbt/sbt assembly 
>> 
>> That's all I do. 
>> 
>> On Apr 1, 2014, at 11:41 AM, Patrick Wendell  wrote:
>> 
>>> Vipul - could you show exactly what flags/commands you are using when you
>>> build Spark to produce this assembly?
>>> 
>>> 
>>> On Tue, Apr 1, 2014 at 12:53 AM, Vipul Pandey  wrote:
 Spark now shades its own protobuf dependency, so protobuf 2.4.1 shouldn't be
 getting pulled in unless you are directly using akka yourself. Are you?
>>> 
>>> No, I'm not. Although I see that protobuf libraries are directly pulled into
>>> the 0.9.0 assembly jar - I do see the shaded version as well.
>>> e.g. below for Message.class
>>> 
>>> -bash-4.1$ jar -ftv 
>>> ./assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-hadoop2.0.0-cdh4.2.1.jar
>>>  | grep protobuf | grep /Message.class
>>>478 Thu Jun 30 15:26:12 PDT 2011 com/google/protobuf/Message.class
>>>508 Sat Dec 14 14:20:38 PST 2013 com/google/protobuf_spark/Message.class
>>> 
>>> 
 Does your project have other dependencies that might be indirectly pulling 
 in protobuf 2.4.1? It would be helpful if you could list all of your 
 dependencies including the exact Spark version and other libraries.
>>> 
>>> I did have another one, which I moved to the end of the classpath - I even
>>> ran partial code without that dependency, but it still failed whenever I
>>> used the jar with the ScalaBuff dependency.
>>> The Spark version is 0.9.0.
>>> 
>>> 
>>> ~Vipul
>>> 
>>> On Mar 31, 2014, at 4:51 PM, Patrick Wendell  wrote:
>>> 
 Spark now shades its own protobuf dependency, so protobuf 2.4.1 shouldn't be
 getting pulled in unless you are directly using akka yourself. Are you?
 
 Does your project have other dependencies that might be indirectly pulling 
 in protobuf 2.4.1? It would be helpful if you could list all of your 
 dependencies including the exact Spark version and other libraries.
 
 - Patrick
 
 
 On Sun, Mar 30, 2014 at 10:03 PM, Vipul Pandey  wrote:
 I'm using ScalaBuff (which depends on protobuf 2.5) and facing the same
 issue. Any word on this one?
 On Mar 27, 2014, at 6:41 PM, Kanwaldeep  wrote:
 
 > We are using Protocol Buffer 2.5 to send messages to Spark Streaming 0.9
 > with a Kafka stream setup. I have Protocol Buffer 2.5 as part of the uber
 > jar deployed on each of the Spark worker nodes.
 > The message is compiled using 2.5, but at runtime it is being deserialized
 > by 2.4.1, as I'm getting the following exception:
 >
 > java.lang.VerifyError (java.lang.VerifyError: class
 > com.snc.sinet.messages.XServerMessage$XServer overrides final method
 > getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;)
 > java.lang.ClassLoader.defineClass1(Native Method)
 > java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
 > java.lang.ClassLoader.defineClass(ClassLoader

Re: Using ProtoBuf 2.5 for messages with Spark Streaming

2014-04-01 Thread Patrick Wendell
It's this: mvn -Dhadoop.version=2.0.0-cdh4.2.1 -DskipTests clean package


On Tue, Apr 1, 2014 at 5:15 PM, Vipul Pandey  wrote:

> how do you recommend building that - it says
> [ERROR] Failed to execute goal
> org.apache.maven.plugins:maven-assembly-plugin:2.2-beta-5:assembly
> (default-cli) on project spark-0.9.0-incubating: Error reading assemblies:
> No assembly descriptors found. -> [Help 1]
> upon running
> mvn -Dhadoop.version=2.0.0-cdh4.2.1 -DskipTests clean assembly:assembly
>
>
> On Apr 1, 2014, at 4:13 PM, Patrick Wendell  wrote:
>
> Do you get the same problem if you build with maven?
>
>
> On Tue, Apr 1, 2014 at 12:23 PM, Vipul Pandey  wrote:
>
>> SPARK_HADOOP_VERSION=2.0.0-cdh4.2.1 sbt/sbt assembly
>>
>> That's all I do.
>>
>> On Apr 1, 2014, at 11:41 AM, Patrick Wendell  wrote:
>>
>> Vipul - could you show exactly what flags/commands you are using when you
>> build Spark to produce this assembly?
>>
>>
>> On Tue, Apr 1, 2014 at 12:53 AM, Vipul Pandey  wrote:
>>
>>> Spark now shades its own protobuf dependency, so protobuf 2.4.1 shouldn't
>>> be getting pulled in unless you are directly using akka yourself. Are you?
>>>
>>> No, I'm not. Although I see that protobuf libraries are directly pulled
>>> into the 0.9.0 assembly jar - I do see the shaded version as well.
>>> e.g. below for Message.class
>>>
>>> -bash-4.1$ jar -ftv
>>> ./assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-hadoop2.0.0-cdh4.2.1.jar
>>> | grep protobuf | grep /Message.class
>>>478 Thu Jun 30 15:26:12 PDT 2011 com/google/protobuf/Message.class
>>>508 Sat Dec 14 14:20:38 PST 2013
>>> com/google/protobuf_spark/Message.class
>>>
>>>
>>> Does your project have other dependencies that might be indirectly
>>> pulling in protobuf 2.4.1? It would be helpful if you could list all of
>>> your dependencies including the exact Spark version and other libraries.
>>>
>>> I did have another one, which I moved to the end of the classpath - I even
>>> ran partial code without that dependency, but it still failed whenever I
>>> used the jar with the ScalaBuff dependency.
>>> The Spark version is 0.9.0.
>>>
>>>
>>> ~Vipul
>>>
>>> On Mar 31, 2014, at 4:51 PM, Patrick Wendell  wrote:
>>>
>>> Spark now shades its own protobuf dependency, so protobuf 2.4.1 shouldn't
>>> be getting pulled in unless you are directly using akka yourself. Are you?
>>>
>>> Does your project have other dependencies that might be indirectly
>>> pulling in protobuf 2.4.1? It would be helpful if you could list all of
>>> your dependencies including the exact Spark version and other libraries.
>>>
>>> - Patrick
>>>
>>>
>>> On Sun, Mar 30, 2014 at 10:03 PM, Vipul Pandey wrote:
>>>
 I'm using ScalaBuff (which depends on protobuf 2.5) and facing the same
 issue. Any word on this one?
 On Mar 27, 2014, at 6:41 PM, Kanwaldeep  wrote:

 > We are using Protocol Buffer 2.5 to send messages to Spark Streaming 0.9
 > with a Kafka stream setup. I have Protocol Buffer 2.5 as part of the uber
 > jar deployed on each of the Spark worker nodes.
 > The message is compiled using 2.5, but at runtime it is being deserialized
 > by 2.4.1, as I'm getting the following exception:
 >
 > java.lang.VerifyError (java.lang.VerifyError: class
 > com.snc.sinet.messages.XServerMessage$XServer overrides final method
 > getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;)
 > java.lang.ClassLoader.defineClass1(Native Method)
 > java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
 > java.lang.ClassLoader.defineClass(ClassLoader.java:615)
 >
 java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
 >
 > Suggestions on how I could still use ProtoBuf 2.5. Based on the article -
 > https://spark-project.atlassian.net/browse/SPARK-995 we should be able to
 > use a different version of protobuf in the application.
 >
 >
 >
 >
 >
 > --
 > View this message in context:
 http://apache-spark-user-list.1001560.n3.nabble.com/Using-ProtoBuf-2-5-for-messages-with-Spark-Streaming-tp3396.html
 > Sent from the Apache Spark User List mailing list archive at
 Nabble.com .


>>>
>>>
>>
>>
>
>


Re: Using ProtoBuf 2.5 for messages with Spark Streaming

2014-04-01 Thread Vipul Pandey
how do you recommend building that - it says
[ERROR] Failed to execute goal
org.apache.maven.plugins:maven-assembly-plugin:2.2-beta-5:assembly
(default-cli) on project spark-0.9.0-incubating: Error reading assemblies: No
assembly descriptors found. -> [Help 1]
upon running
mvn -Dhadoop.version=2.0.0-cdh4.2.1 -DskipTests clean assembly:assembly


On Apr 1, 2014, at 4:13 PM, Patrick Wendell  wrote:

> Do you get the same problem if you build with maven?
> 
> 
> On Tue, Apr 1, 2014 at 12:23 PM, Vipul Pandey  wrote:
> SPARK_HADOOP_VERSION=2.0.0-cdh4.2.1 sbt/sbt assembly 
> 
> That's all I do. 
> 
> On Apr 1, 2014, at 11:41 AM, Patrick Wendell  wrote:
> 
>> Vipul - could you show exactly what flags/commands you are using when you
>> build Spark to produce this assembly?
>> 
>> 
>> On Tue, Apr 1, 2014 at 12:53 AM, Vipul Pandey  wrote:
>>> Spark now shades its own protobuf dependency, so protobuf 2.4.1 shouldn't
>>> be getting pulled in unless you are directly using akka yourself. Are you?
>> 
>> No, I'm not. Although I see that protobuf libraries are directly pulled into
>> the 0.9.0 assembly jar - I do see the shaded version as well.
>> e.g. below for Message.class
>> 
>> -bash-4.1$ jar -ftv 
>> ./assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-hadoop2.0.0-cdh4.2.1.jar
>>  | grep protobuf | grep /Message.class
>>478 Thu Jun 30 15:26:12 PDT 2011 com/google/protobuf/Message.class
>>508 Sat Dec 14 14:20:38 PST 2013 com/google/protobuf_spark/Message.class
>> 
>> 
>>> Does your project have other dependencies that might be indirectly pulling 
>>> in protobuf 2.4.1? It would be helpful if you could list all of your 
>>> dependencies including the exact Spark version and other libraries.
>> 
>> I did have another one, which I moved to the end of the classpath - I even
>> ran partial code without that dependency, but it still failed whenever I
>> used the jar with the ScalaBuff dependency.
>> The Spark version is 0.9.0.
>> 
>> 
>> ~Vipul
>> 
>> On Mar 31, 2014, at 4:51 PM, Patrick Wendell  wrote:
>> 
>>> Spark now shades its own protobuf dependency, so protobuf 2.4.1 shouldn't
>>> be getting pulled in unless you are directly using akka yourself. Are you?
>>> 
>>> Does your project have other dependencies that might be indirectly pulling 
>>> in protobuf 2.4.1? It would be helpful if you could list all of your 
>>> dependencies including the exact Spark version and other libraries.
>>> 
>>> - Patrick
>>> 
>>> 
>>> On Sun, Mar 30, 2014 at 10:03 PM, Vipul Pandey  wrote:
>>> I'm using ScalaBuff (which depends on protobuf 2.5) and facing the same
>>> issue. Any word on this one?
>>> On Mar 27, 2014, at 6:41 PM, Kanwaldeep  wrote:
>>> 
>>> > We are using Protocol Buffer 2.5 to send messages to Spark Streaming 0.9
>>> > with a Kafka stream setup. I have Protocol Buffer 2.5 as part of the uber
>>> > jar deployed on each of the Spark worker nodes.
>>> > The message is compiled using 2.5, but at runtime it is being deserialized
>>> > by 2.4.1, as I'm getting the following exception:
>>> >
>>> > java.lang.VerifyError (java.lang.VerifyError: class
>>> > com.snc.sinet.messages.XServerMessage$XServer overrides final method
>>> > getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;)
>>> > java.lang.ClassLoader.defineClass1(Native Method)
>>> > java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>> > java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>> > java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>> >
>>> > Suggestions on how I could still use ProtoBuf 2.5. Based on the article -
>>> > https://spark-project.atlassian.net/browse/SPARK-995 we should be able to
>>> > use a different version of protobuf in the application.
>>> >
>>> >
>>> >
>>> >
>>> >
>>> > --
>>> > View this message in context: 
>>> > http://apache-spark-user-list.1001560.n3.nabble.com/Using-ProtoBuf-2-5-for-messages-with-Spark-Streaming-tp3396.html
>>> > Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>> 
>>> 
>> 
>> 
> 
> 



Re: Using ProtoBuf 2.5 for messages with Spark Streaming

2014-04-01 Thread Patrick Wendell
Do you get the same problem if you build with maven?


On Tue, Apr 1, 2014 at 12:23 PM, Vipul Pandey  wrote:

> SPARK_HADOOP_VERSION=2.0.0-cdh4.2.1 sbt/sbt assembly
>
> That's all I do.
>
> On Apr 1, 2014, at 11:41 AM, Patrick Wendell  wrote:
>
> Vipul - could you show exactly what flags/commands you are using when you
> build Spark to produce this assembly?
>
>
> On Tue, Apr 1, 2014 at 12:53 AM, Vipul Pandey  wrote:
>
>> Spark now shades its own protobuf dependency, so protobuf 2.4.1 shouldn't
>> be getting pulled in unless you are directly using akka yourself. Are you?
>>
>> No, I'm not. Although I see that protobuf libraries are directly pulled
>> into the 0.9.0 assembly jar - I do see the shaded version as well.
>> e.g. below for Message.class
>>
>> -bash-4.1$ jar -ftv
>> ./assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-hadoop2.0.0-cdh4.2.1.jar
>> | grep protobuf | grep /Message.class
>>478 Thu Jun 30 15:26:12 PDT 2011 com/google/protobuf/Message.class
>>508 Sat Dec 14 14:20:38 PST 2013
>> com/google/protobuf_spark/Message.class
>>
>>
>> Does your project have other dependencies that might be indirectly
>> pulling in protobuf 2.4.1? It would be helpful if you could list all of
>> your dependencies including the exact Spark version and other libraries.
>>
>> I did have another one, which I moved to the end of the classpath - I even
>> ran partial code without that dependency, but it still failed whenever I
>> used the jar with the ScalaBuff dependency.
>> The Spark version is 0.9.0.
>>
>>
>> ~Vipul
>>
>> On Mar 31, 2014, at 4:51 PM, Patrick Wendell  wrote:
>>
>> Spark now shades its own protobuf dependency, so protobuf 2.4.1 shouldn't
>> be getting pulled in unless you are directly using akka yourself. Are you?
>>
>> Does your project have other dependencies that might be indirectly
>> pulling in protobuf 2.4.1? It would be helpful if you could list all of
>> your dependencies including the exact Spark version and other libraries.
>>
>> - Patrick
>>
>>
>> On Sun, Mar 30, 2014 at 10:03 PM, Vipul Pandey wrote:
>>
>>> I'm using ScalaBuff (which depends on protobuf 2.5) and facing the same
>>> issue. Any word on this one?
>>> On Mar 27, 2014, at 6:41 PM, Kanwaldeep  wrote:
>>>
>>> > We are using Protocol Buffer 2.5 to send messages to Spark Streaming 0.9
>>> > with a Kafka stream setup. I have Protocol Buffer 2.5 as part of the uber
>>> > jar deployed on each of the Spark worker nodes.
>>> > The message is compiled using 2.5, but at runtime it is being deserialized
>>> > by 2.4.1, as I'm getting the following exception:
>>> >
>>> > java.lang.VerifyError (java.lang.VerifyError: class
>>> > com.snc.sinet.messages.XServerMessage$XServer overrides final method
>>> > getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;)
>>> > java.lang.ClassLoader.defineClass1(Native Method)
>>> > java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>> > java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>> > java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>> >
>>> > Suggestions on how I could still use ProtoBuf 2.5. Based on the article -
>>> > https://spark-project.atlassian.net/browse/SPARK-995 we should be able to
>>> > use a different version of protobuf in the application.
>>> >
>>> >
>>> >
>>> >
>>> >
>>> > --
>>> > View this message in context:
>>> http://apache-spark-user-list.1001560.n3.nabble.com/Using-ProtoBuf-2-5-for-messages-with-Spark-Streaming-tp3396.html
>>> > Sent from the Apache Spark User List mailing list archive at
>>> Nabble.com .
>>>
>>>
>>
>>
>
>


Re: Using ProtoBuf 2.5 for messages with Spark Streaming

2014-04-01 Thread Vipul Pandey
SPARK_HADOOP_VERSION=2.0.0-cdh4.2.1 sbt/sbt assembly 

That's all I do. 

On Apr 1, 2014, at 11:41 AM, Patrick Wendell  wrote:

> Vipul - could you show exactly what flags/commands you are using when you
> build Spark to produce this assembly?
> 
> 
> On Tue, Apr 1, 2014 at 12:53 AM, Vipul Pandey  wrote:
>> Spark now shades its own protobuf dependency, so protobuf 2.4.1 shouldn't
>> be getting pulled in unless you are directly using akka yourself. Are you?
> 
> No, I'm not. Although I see that protobuf libraries are directly pulled into
> the 0.9.0 assembly jar - I do see the shaded version as well.
> e.g. below for Message.class
> 
> -bash-4.1$ jar -ftv 
> ./assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-hadoop2.0.0-cdh4.2.1.jar
>  | grep protobuf | grep /Message.class
>478 Thu Jun 30 15:26:12 PDT 2011 com/google/protobuf/Message.class
>508 Sat Dec 14 14:20:38 PST 2013 com/google/protobuf_spark/Message.class
> 
> 
>> Does your project have other dependencies that might be indirectly pulling 
>> in protobuf 2.4.1? It would be helpful if you could list all of your 
>> dependencies including the exact Spark version and other libraries.
> 
> I did have another one, which I moved to the end of the classpath - I even
> ran partial code without that dependency, but it still failed whenever I
> used the jar with the ScalaBuff dependency.
> The Spark version is 0.9.0.
> 
> 
> ~Vipul
> 
> On Mar 31, 2014, at 4:51 PM, Patrick Wendell  wrote:
> 
>> Spark now shades its own protobuf dependency, so protobuf 2.4.1 shouldn't
>> be getting pulled in unless you are directly using akka yourself. Are you?
>> 
>> Does your project have other dependencies that might be indirectly pulling 
>> in protobuf 2.4.1? It would be helpful if you could list all of your 
>> dependencies including the exact Spark version and other libraries.
>> 
>> - Patrick
>> 
>> 
>> On Sun, Mar 30, 2014 at 10:03 PM, Vipul Pandey  wrote:
>> I'm using ScalaBuff (which depends on protobuf 2.5) and facing the same
>> issue. Any word on this one?
>> On Mar 27, 2014, at 6:41 PM, Kanwaldeep  wrote:
>> 
>> > We are using Protocol Buffer 2.5 to send messages to Spark Streaming 0.9
>> > with a Kafka stream setup. I have Protocol Buffer 2.5 as part of the uber
>> > jar deployed on each of the Spark worker nodes.
>> > The message is compiled using 2.5, but at runtime it is being deserialized
>> > by 2.4.1, as I'm getting the following exception:
>> >
>> > java.lang.VerifyError (java.lang.VerifyError: class
>> > com.snc.sinet.messages.XServerMessage$XServer overrides final method
>> > getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;)
>> > java.lang.ClassLoader.defineClass1(Native Method)
>> > java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>> > java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>> > java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>> >
>> > Suggestions on how I could still use ProtoBuf 2.5. Based on the article -
>> > https://spark-project.atlassian.net/browse/SPARK-995 we should be able to
>> > use a different version of protobuf in the application.
>> >
>> >
>> >
>> >
>> >
>> > --
>> > View this message in context: 
>> > http://apache-spark-user-list.1001560.n3.nabble.com/Using-ProtoBuf-2-5-for-messages-with-Spark-Streaming-tp3396.html
>> > Sent from the Apache Spark User List mailing list archive at Nabble.com.
>> 
>> 
> 
> 



Re: Using ProtoBuf 2.5 for messages with Spark Streaming

2014-04-01 Thread Patrick Wendell
Vipul - could you show exactly what flags/commands you are using when you
build Spark to produce this assembly?


On Tue, Apr 1, 2014 at 12:53 AM, Vipul Pandey  wrote:

> Spark now shades its own protobuf dependency, so protobuf 2.4.1 shouldn't
> be getting pulled in unless you are directly using akka yourself. Are you?
>
> No, I'm not. Although I see that protobuf libraries are directly pulled
> into the 0.9.0 assembly jar - I do see the shaded version as well.
> e.g. below for Message.class
>
> -bash-4.1$ jar -ftv
> ./assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-hadoop2.0.0-cdh4.2.1.jar
> | grep protobuf | grep /Message.class
>478 Thu Jun 30 15:26:12 PDT 2011 com/google/protobuf/Message.class
>508 Sat Dec 14 14:20:38 PST 2013 com/google/protobuf_spark/Message.class
>
>
> Does your project have other dependencies that might be indirectly pulling
> in protobuf 2.4.1? It would be helpful if you could list all of your
> dependencies including the exact Spark version and other libraries.
>
> I did have another one, which I moved to the end of the classpath - I even
> ran partial code without that dependency, but it still failed whenever I
> used the jar with the ScalaBuff dependency.
> The Spark version is 0.9.0.
>
>
> ~Vipul
>
> On Mar 31, 2014, at 4:51 PM, Patrick Wendell  wrote:
>
> Spark now shades its own protobuf dependency, so protobuf 2.4.1 shouldn't
> be getting pulled in unless you are directly using akka yourself. Are you?
>
> Does your project have other dependencies that might be indirectly pulling
> in protobuf 2.4.1? It would be helpful if you could list all of your
> dependencies including the exact Spark version and other libraries.
>
> - Patrick
>
>
> On Sun, Mar 30, 2014 at 10:03 PM, Vipul Pandey  wrote:
>
>> I'm using ScalaBuff (which depends on protobuf 2.5) and facing the same
>> issue. Any word on this one?
>> On Mar 27, 2014, at 6:41 PM, Kanwaldeep  wrote:
>>
>> > We are using Protocol Buffer 2.5 to send messages to Spark Streaming 0.9
>> > with a Kafka stream setup. I have Protocol Buffer 2.5 as part of the uber
>> > jar deployed on each of the Spark worker nodes.
>> > The message is compiled using 2.5, but at runtime it is being deserialized
>> > by 2.4.1, as I'm getting the following exception:
>> >
>> > java.lang.VerifyError (java.lang.VerifyError: class
>> > com.snc.sinet.messages.XServerMessage$XServer overrides final method
>> > getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;)
>> > java.lang.ClassLoader.defineClass1(Native Method)
>> > java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>> > java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>> > java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>> >
>> > Suggestions on how I could still use ProtoBuf 2.5. Based on the article -
>> > https://spark-project.atlassian.net/browse/SPARK-995 we should be able to
>> > use a different version of protobuf in the application.
>> >
>> >
>> >
>> >
>> >
>> > --
>> > View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/Using-ProtoBuf-2-5-for-messages-with-Spark-Streaming-tp3396.html
>> > Sent from the Apache Spark User List mailing list archive at Nabble.com
>> .
>>
>>
>
>


Re: Using ProtoBuf 2.5 for messages with Spark Streaming

2014-04-01 Thread Kanwaldeep
I've removed the dependency on akka in a separate project but am still running
into the same error. In the POM dependency hierarchy I do see both 2.4.1
(shaded) and 2.5.0 being included. If there were a conflict with a project
dependency, I would think I should be getting the same error in my local setup
as well.

Here are the dependencies I'm using:



<dependencies>
  <dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-core</artifactId>
    <version>1.0.13</version>
  </dependency>
  <dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-classic</artifactId>
    <version>1.0.13</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>0.9.0-incubating</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>0.9.0-incubating</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka_2.10</artifactId>
    <version>0.9.0-incubating</version>
  </dependency>
  <dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase</artifactId>
    <version>0.94.15-cdh4.6.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.0.0-cdh4.6.0</version>
  </dependency>
  <dependency>
    <groupId>com.google.protobuf</groupId>
    <artifactId>protobuf-java</artifactId>
    <version>2.5.0</version>
  </dependency>
  <dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <version>1.7.5</version>
  </dependency>
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.10.2</version>
  </dependency>
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-actors</artifactId>
    <version>2.10.2</version>
  </dependency>
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-reflect</artifactId>
    <version>2.10.2</version>
  </dependency>
  <dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <version>1.7.5</version>
  </dependency>
</dependencies>
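If the dependency hierarchy shows a second protobuf arriving transitively, one option to try is an explicit exclusion. This is a sketch only; the assumption that hadoop-client is the artifact dragging in the 2.4.x protobuf is illustrative, not confirmed in this thread - check the dependency tree first:

```xml
<!-- Sketch only: keep the transitive protobuf copy off the application's
     classpath. Which artifact actually pulls it in (hadoop-client here) is
     an assumption for illustration. -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>2.0.0-cdh4.6.0</version>
  <exclusions>
    <exclusion>
      <groupId>com.google.protobuf</groupId>
      <artifactId>protobuf-java</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

Note that Hadoop's RPC layer itself needs a protobuf at runtime, so an exclusion alone can simply move the failure; shading the application's protobuf 2.5, as suggested elsewhere in this thread, is generally the safer fix.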







--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Using-ProtoBuf-2-5-for-messages-with-Spark-Streaming-tp3396p3585.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.


Re: Using ProtoBuf 2.5 for messages with Spark Streaming

2014-04-01 Thread Kanwaldeep
Yes, I'm using akka as well. But if that were the problem, then I should have
been facing this issue in my local setup as well. I'm only running into this
error when using the Spark standalone cluster.

But will try out your suggestion and let you know.

Thanks
Kanwal



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Using-ProtoBuf-2-5-for-messages-with-Spark-Streaming-tp3396p3582.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.


Re: Using ProtoBuf 2.5 for messages with Spark Streaming

2014-04-01 Thread Vipul Pandey
btw, this is where it fails:




14/04/01 00:59:32 INFO storage.MemoryStore: ensureFreeSpace(84106) called with curMem=0, maxMem=4939225497
14/04/01 00:59:32 INFO storage.MemoryStore: Block broadcast_0 stored as values to memory (estimated size 82.1 KB, free 4.6 GB)
java.lang.UnsupportedOperationException: This is supposed to be overridden by subclasses.
        at com.google.protobuf.GeneratedMessage.getUnknownFields(GeneratedMessage.java:180)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$GetFileInfoRequestProto.getSerializedSize(ClientNamenodeProtocolProtos.java:30042)
        at com.google.protobuf.AbstractMessageLite.toByteString(AbstractMessageLite.java:49)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.constructRpcRequest(ProtobufRpcEngine.java:149)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:193)
        at $Proxy14.getFileInfo(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
        at $Proxy14.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:628)
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1545)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:805)
        at org.apache.hadoop.fs.FileSystem.globStatusInternal(FileSystem.java:1670)
        at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1616)
        at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:174)
        at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:205)
        at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:140)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:207)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:205)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:205)
        at org.apache.spark.rdd.MappedRDD.getPartitions(MappedRDD.scala:28)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:207)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:205)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:205)
at 
org.apache.spark.rdd.FlatMappedRDD.getPartitions(FlatMappedRDD.scala:30)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:207)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:205)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:205)
at org.apache.spark.rdd.MappedRDD.getPartitions(MappedRDD.scala:28)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:207)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:205)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:205)
at org.apache.spark.rdd.MappedRDD.getPartitions(MappedRDD.scala:28)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:207)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:205)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:205)
at 
org.apache.spark.Partitioner$.defaultPartitioner(Partitioner.scala:58)
at 
org.apache.spark.rdd.PairRDDFunctions.reduceByKey(PairRDDFunctions.scala:354)



On Apr 1, 2014, at 12:53 AM, Vipul Pandey  wrote:

> >> Spark now shades its own protobuf dependency so protobuf 2.4.1 shouldn't be 
>> getting pulled in unless you are directly using akka yourself. Are you?
> 
> No, I'm not. Although I see that protobuf libraries are directly pulled into 
> the 0.9.0 assembly jar - I do see the shaded version as well. 
> e.g. below for Message.class
> 
> -bash-4.1$ jar -ftv 
> ./assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-hadoop2.0.0-cdh4.2.1.jar
>  | grep protobuf | grep /Message.class
>478 Thu Jun 30 15:26:12 PDT 2011 com/google/protobuf/Message.class
>508 Sat Dec 14 14:20:38 PST 2013 com/google/protobuf_spark/Message.class
> 
> 
>> Does your project have other dependencies that might be indirectly pulling 
>> in protobuf 2.4.1? It would be helpful if you could list all of your 
> > dependencies including the exact Spark version and other libraries.

Re: Using ProtoBuf 2.5 for messages with Spark Streaming

2014-04-01 Thread Vipul Pandey
> Spark now shades its own protobuf dependency so protobuf 2.4.1 shouldn't be 
> getting pulled in unless you are directly using akka yourself. Are you?

No, I'm not. Although I see that protobuf libraries are directly pulled into the 
0.9.0 assembly jar - I do see the shaded version as well. 
e.g. below for Message.class

-bash-4.1$ jar -ftv 
./assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-hadoop2.0.0-cdh4.2.1.jar
 | grep protobuf | grep /Message.class
   478 Thu Jun 30 15:26:12 PDT 2011 com/google/protobuf/Message.class
   508 Sat Dec 14 14:20:38 PST 2013 com/google/protobuf_spark/Message.class
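
For the runtime side of this, one way to see which copy of a class the JVM
actually picked up is to ask its ProtectionDomain for the CodeSource. The
sketch below is generic (the class name WhereFrom is illustrative, not from
this thread); inside the Spark job you would pass
com.google.protobuf.Message.class instead of the placeholders.

```java
import java.security.CodeSource;

public class WhereFrom {
    // Returns the location a class was loaded from, or "bootstrap" for
    // JDK core classes, which have no CodeSource.
    static String whereFrom(Class<?> c) {
        CodeSource src = c.getProtectionDomain().getCodeSource();
        return src == null ? "bootstrap" : src.getLocation().toString();
    }

    public static void main(String[] args) {
        // In a Spark job, substitute com.google.protobuf.Message.class here
        // to see whether the shaded or the Hadoop-provided copy won.
        System.out.println(whereFrom(WhereFrom.class)); // e.g. a file: URL
        System.out.println(whereFrom(String.class));    // prints "bootstrap"
    }
}
```

Running this check on both the driver and an executor would show whether they
disagree about where the protobuf classes come from.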


> Does your project have other dependencies that might be indirectly pulling in 
> protobuf 2.4.1? It would be helpful if you could list all of your 
> dependencies including the exact Spark version and other libraries.

I did have another one, which I moved to the end of the classpath. I even ran 
partial code without that dependency, but it still failed whenever I used the 
jar with the ScalaBuff dependency. 
Spark version is 0.9.0


~Vipul

On Mar 31, 2014, at 4:51 PM, Patrick Wendell  wrote:

> Spark now shades its own protobuf dependency so protobuf 2.4.1 shouldn't be 
> getting pulled in unless you are directly using akka yourself. Are you?
> 
> Does your project have other dependencies that might be indirectly pulling in 
> protobuf 2.4.1? It would be helpful if you could list all of your 
> dependencies including the exact Spark version and other libraries.
> 
> - Patrick
> 
> 
> On Sun, Mar 30, 2014 at 10:03 PM, Vipul Pandey  wrote:
I'm using ScalaBuff (which depends on protobuf 2.5) and facing the same issue. 
Any word on this one?
> On Mar 27, 2014, at 6:41 PM, Kanwaldeep  wrote:
> 
> > We are using Protocol Buffer 2.5 to send messages to Spark Streaming 0.9
> > with a Kafka stream setup. I have Protocol Buffer 2.5 as part of the uber
> > jar deployed on each of the Spark worker nodes.
> > The message is compiled using 2.5, but at runtime it is being
> > de-serialized by 2.4.1, as I'm getting the following exception:
> >
> > java.lang.VerifyError (java.lang.VerifyError: class
> > com.snc.sinet.messages.XServerMessage$XServer overrides final method
> > getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;)
> > java.lang.ClassLoader.defineClass1(Native Method)
> > java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
> > java.lang.ClassLoader.defineClass(ClassLoader.java:615)
> > java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
> >
> > Any suggestions on how I could still use ProtoBuf 2.5? Based on
> > https://spark-project.atlassian.net/browse/SPARK-995, we should be able to
> > use a different version of protobuf in the application.
> >
> >
> >
> >
> >
> > --
> > View this message in context: 
> > http://apache-spark-user-list.1001560.n3.nabble.com/Using-ProtoBuf-2-5-for-messages-with-Spark-Streaming-tp3396.html
> > Sent from the Apache Spark User List mailing list archive at Nabble.com.
> 
> 



Re: Using ProtoBuf 2.5 for messages with Spark Streaming

2014-03-31 Thread Patrick Wendell
Spark now shades its own protobuf dependency so protobuf 2.4.1 shouldn't be
getting pulled in unless you are directly using akka yourself. Are you?

Does your project have other dependencies that might be indirectly pulling
in protobuf 2.4.1? It would be helpful if you could list all of your
dependencies including the exact Spark version and other libraries.

- Patrick


On Sun, Mar 30, 2014 at 10:03 PM, Vipul Pandey  wrote:

> I'm using ScalaBuff (which depends on protobuf 2.5) and facing the same
> issue. Any word on this one?
> On Mar 27, 2014, at 6:41 PM, Kanwaldeep  wrote:
>
> > We are using Protocol Buffer 2.5 to send messages to Spark Streaming 0.9
> > with a Kafka stream setup. I have Protocol Buffer 2.5 as part of the uber
> > jar deployed on each of the Spark worker nodes.
> > The message is compiled using 2.5, but at runtime it is being
> > de-serialized by 2.4.1, as I'm getting the following exception:
> >
> > java.lang.VerifyError (java.lang.VerifyError: class
> > com.snc.sinet.messages.XServerMessage$XServer overrides final method
> > getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;)
> > java.lang.ClassLoader.defineClass1(Native Method)
> > java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
> > java.lang.ClassLoader.defineClass(ClassLoader.java:615)
> > java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
> >
> > Any suggestions on how I could still use ProtoBuf 2.5? Based on
> > https://spark-project.atlassian.net/browse/SPARK-995, we should be able to
> > use a different version of protobuf in the application.
> >
> >
> >
> >
> >
> > --
> > View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Using-ProtoBuf-2-5-for-messages-with-Spark-Streaming-tp3396.html
> > Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
>


Re: Using ProtoBuf 2.5 for messages with Spark Streaming

2014-03-30 Thread Vipul Pandey
I'm using ScalaBuff (which depends on protobuf 2.5) and facing the same issue. 
Any word on this one?
On Mar 27, 2014, at 6:41 PM, Kanwaldeep  wrote:

> We are using Protocol Buffer 2.5 to send messages to Spark Streaming 0.9 with
> a Kafka stream setup. I have Protocol Buffer 2.5 as part of the uber jar
> deployed on each of the Spark worker nodes.
> The message is compiled using 2.5, but at runtime it is being
> de-serialized by 2.4.1, as I'm getting the following exception:
> 
> java.lang.VerifyError (java.lang.VerifyError: class
> com.snc.sinet.messages.XServerMessage$XServer overrides final method
> getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;)
> java.lang.ClassLoader.defineClass1(Native Method)
> java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
> java.lang.ClassLoader.defineClass(ClassLoader.java:615)
> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
> 
> Any suggestions on how I could still use ProtoBuf 2.5? Based on
> https://spark-project.atlassian.net/browse/SPARK-995, we should be able to
> use a different version of protobuf in the application.
> 
> 
> 
> 
> 
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/Using-ProtoBuf-2-5-for-messages-with-Spark-Streaming-tp3396.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.