I am happy to report that after setting spark.driver.userClassPathFirst, I can
use protobuf 3 with spark-shell. It looks like the classloading issue was in
the driver, not the executor.
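For reference, a minimal sketch of a launch command that sets both flags discussed
in this thread (the jar name is a placeholder; the flag names are the ones Marcelo
gives below):

```shell
# Sketch: set userClassPathFirst on both the driver and the executors so the
# protobuf 3 classes from the user's jar win over Spark's bundled protobuf 2.5.
# "your_uber_jar.jar" is a placeholder for your assembled application jar.
spark-shell \
  --jars your_uber_jar.jar \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true
```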
Marcelo, thank you very much for the tip!
Lan
> On Sep 15, 2015, at 1:40 PM, Marcelo Vanzin wrote:
Hi,
Just "spark.executor.userClassPathFirst" is not enough. You should
also set "spark.driver.userClassPathFirst". Also note that I don't
think this was really tested with the shell, but it should work with
regular apps started using spark-submit.
If that doesn't work, I'd recommend shading, as
On 15 Sep 2015, at 05:47, Lan Jiang wrote:
Hi, there,
I am using Spark 1.4.1. Protobuf 2.5 is included by Spark 1.4.1 by default.
However, I would like to use Protobuf 3 in my Spark application so that I can
use some new features such as Map support.
> Date: Tue, 15 Sep 2015 09:33:40 -0500
> Subject: Re: Change protobuf version or any other third party library version
> in Spark application
> From: ljia...@gmail.com
> To: java8...@hotmail.com
> CC: ste...@hortonworks.com; user@spark.apache.org
>
> Steve,
>
The setting depends on your deployment mode; check this for the parameter:
https://issues.apache.org/jira/browse/SPARK-2996
If you use Standalone mode, just start spark-shell like the following:
spark-shell --jars your_uber_jar --conf spark.files.userClassPathFirst=true
Yong
Subject: Re: Change protobuf version or any other third party library version
in Spark application
From: ste...@hortonworks.com
To: ljia...@gmail.com
CC: user@spark.apache.org
Date: Tue, 15 Sep 2015 09:19:28 +
On 15 Sep 2015, at 05:47, Lan Jiang wrote:
Hi, there,
I am using Spark 1.4.1. Protobuf 2.5 is included by Spark 1.4.1 by
default. However, I would like to use Protobuf 3 in my Spark application so
that I can use some new features such as Map support. Is there any way to
do that?
Right now if I build an uber jar with dependencies