I had a bunch of library dependencies that were still using Scala 2.10
versions. I updated them to 2.11 and everything has worked fine since.
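[Editor's note: for readers hitting the same mixed-Scala-version problem, here is a minimal build.sbt sketch of the kind of fix described above. The artifact names and versions are illustrative, not the exact dependencies from this thread; the point is that "%%" makes sbt append the Scala binary suffix automatically, while a hard-coded "_2.10" artifact is exactly the stale dependency that breaks a 2.11 build.]

```scala
// build.sbt sketch: cross-built dependencies follow scalaVersion automatically.
scalaVersion := "2.11.7"

libraryDependencies ++= Seq(
  // "%%" appends "_2.11" here, so this resolves spark-core_2.11
  "org.apache.spark" %% "spark-core" % "1.5.2" % "provided",
  // a hard-coded suffix like the commented line below is the kind of
  // leftover 2.10 dependency that causes mixed-Scala-version failures:
  // "org.apache.spark" % "spark-core_2.10" % "1.4.1"
  "org.apache.spark" %% "spark-hive" % "1.5.2" % "provided"
)
```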
On Wed, Dec 16, 2015 at 3:12 AM, Ashwin Sai Shankar wrote:
Hi Bryan,
I see the same issue with 1.5.2. Can you please let me know what the
resolution was?
Thanks,
Ashwin
On Fri, Nov 20, 2015 at 12:07 PM, Bryan Jeffrey wrote:
Never mind. I had a library dependency that still pulled in the old Spark version.
On Fri, Nov 20, 2015 at 2:14 PM, Bryan Jeffrey wrote:
The 1.5.2 Spark was compiled using the following options:

mvn -Dhadoop.version=2.6.1 -Dscala-2.11 -DskipTests -Pyarn -Phive -Phive-thriftserver clean package
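[Editor's note: for Spark 1.5.x, the project's build documentation also has you rewrite the POMs for Scala 2.11 before passing -Dscala-2.11. A sketch of the full sequence, run from the Spark source root (paths per the Spark 1.5.x source tree):]

```shell
# From the Spark 1.5.2 source root: switch the POMs to Scala 2.11,
# then build with the same profiles as in the message above.
./dev/change-scala-version.sh 2.11
mvn -Dhadoop.version=2.6.1 -Dscala-2.11 -DskipTests \
    -Pyarn -Phive -Phive-thriftserver clean package
```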
Regards,
Bryan Jeffrey
On Fri, Nov 20, 2015 at 2:13 PM, Bryan Jeffrey wrote:
Hello.
I'm seeing an error creating a Hive Context moving from Spark 1.4.1 to
1.5.2. Has anyone seen this issue?
I'm invoking the following:
new HiveContext(sc) // sc is a Spark Context
I am seeing the following error:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding i
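[Editor's note: the failing call above, in a self-contained form for readers reproducing the issue. This is a sketch, not code from the thread: the app name and local master are illustrative, and it assumes a Spark 1.5.x build with the Hive profile enabled.]

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

// Minimal Spark 1.5.x driver: constructing the HiveContext is the point
// where a classpath mixing Scala 2.10 and 2.11 artifacts typically fails.
object HiveContextCheck {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("hive-context-check").setMaster("local[*]")
    val sc   = new SparkContext(conf)
    val hive = new HiveContext(sc) // sc is a Spark Context, as in the message above
    sc.stop()
  }
}
```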