Re: Hive error after update from 1.4.1 to 1.5.2

2015-12-16 Thread Bryan Jeffrey
I had a bunch of library dependencies that were still using Scala 2.10
versions. I updated them to 2.11 and everything has worked fine since.
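
As a minimal sketch of the fix (illustrative sbt settings, not my exact
build file), declaring dependencies with %% lets sbt append the Scala
binary version so everything tracks 2.11:

scalaVersion := "2.11.7"

libraryDependencies ++= Seq(
  // %% resolves to spark-core_2.11 / spark-hive_2.11 under Scala 2.11
  "org.apache.spark" %% "spark-core" % "1.5.2" % "provided",
  "org.apache.spark" %% "spark-hive" % "1.5.2" % "provided"
)

A dependency declared with a hard-coded _2.10 suffix (for example
"some.org" % "somelib_2.10" % "1.0", a made-up artifact) pulls Scala
2.10 classes onto the classpath and can fail in exactly this way.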

On Wed, Dec 16, 2015 at 3:12 AM, Ashwin Sai Shankar 
wrote:

> Hi Bryan,
> I see the same issue with 1.5.2, can you please let me know what was the
> resolution?
>
> Thanks,
> Ashwin
>
> On Fri, Nov 20, 2015 at 12:07 PM, Bryan Jeffrey 
> wrote:
>
>> Nevermind. I had a library dependency that still had the old Spark
>> version.
>>
>> [earlier quoted messages with build options and the full stack trace
>> snipped; see the original message below]
>


Re: Hive error after update from 1.4.1 to 1.5.2

2015-12-16 Thread Ashwin Sai Shankar
Hi Bryan,
I see the same issue with 1.5.2, can you please let me know what was the
resolution?

Thanks,
Ashwin

On Fri, Nov 20, 2015 at 12:07 PM, Bryan Jeffrey 
wrote:

> Nevermind. I had a library dependency that still had the old Spark version.
>
> [earlier quoted messages with build options and the full stack trace
> snipped; see the original message below]


Hive error after update from 1.4.1 to 1.5.2

2015-11-20 Thread Bryan Jeffrey
Hello.

I'm seeing an error creating a Hive Context moving from Spark 1.4.1 to
1.5.2.  Has anyone seen this issue?

I'm invoking the following:

new HiveContext(sc) // sc is a Spark Context
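
For completeness, the surrounding setup looks roughly like this (a
condensed sketch of our factory classes; the object name and app name
here are placeholders, not the actual code):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object WriteModel {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WriteModel")
    val sc = new SparkContext(conf)
    // Fails here on 1.5.2; worked on 1.4.1
    val hiveContext = new HiveContext(sc)
  }
}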

I am seeing the following error:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/spark/spark-1.5.2/assembly/target/scala-2.11/spark-assembly-1.5.2-hadoop2.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/hadoop-2.6.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Exception in thread "main" java.lang.NoSuchMethodException: org.apache.hadoop.hive.conf.HiveConf.getTimeVar(org.apache.hadoop.hive.conf.HiveConf$ConfVars, java.util.concurrent.TimeUnit)
        at java.lang.Class.getMethod(Class.java:1786)
        at org.apache.spark.sql.hive.client.Shim.findMethod(HiveShim.scala:114)
        at org.apache.spark.sql.hive.client.Shim_v0_14.getTimeVarMethod$lzycompute(HiveShim.scala:415)
        at org.apache.spark.sql.hive.client.Shim_v0_14.getTimeVarMethod(HiveShim.scala:414)
        at org.apache.spark.sql.hive.client.Shim_v0_14.getMetastoreClientConnectRetryDelayMillis(HiveShim.scala:459)
        at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:198)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
        at org.apache.spark.sql.hive.client.IsolatedClientLoader.liftedTree1$1(IsolatedClientLoader.scala:183)
        at org.apache.spark.sql.hive.client.IsolatedClientLoader.<init>(IsolatedClientLoader.scala:179)
        at org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:226)
        at org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:185)
        at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:392)
        at org.apache.spark.sql.hive.HiveContext.defaultOverrides(HiveContext.scala:174)
        at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:177)
        at Main.Factories.HiveContextSingleton$.createHiveContext(HiveContextSingleton.scala:21)
        at Main.Factories.HiveContextSingleton$.getHiveContext(HiveContextSingleton.scala:14)
        at Main.Factories.SparkStreamingContextFactory$.createSparkContext(SparkStreamingContextFactory.scala:35)
        at Main.WriteModel$.main(WriteModel.scala:16)
        at Main.WriteModel.main(WriteModel.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:674)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
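
From the trace, the failure is inside Spark's Hive shim, which resolves
HiveConf.getTimeVar reflectively; if an older Hive (or otherwise
mismatched) jar sits first on the classpath, Class.getMethod cannot find
that overload. A rough sketch of the lookup (my paraphrase, not the
actual HiveShim source):

import java.util.concurrent.TimeUnit
import org.apache.hadoop.hive.conf.HiveConf

// Approximates the reflective lookup at HiveShim.scala:114; throws
// NoSuchMethodException when the HiveConf class on the classpath
// predates the (ConfVars, TimeUnit) overload of getTimeVar.
val getTimeVar = classOf[HiveConf].getMethod(
  "getTimeVar", classOf[HiveConf.ConfVars], classOf[TimeUnit])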


Re: Hive error after update from 1.4.1 to 1.5.2

2015-11-20 Thread Bryan Jeffrey
The 1.5.2 Spark was compiled using the following options:  mvn
-Dhadoop.version=2.6.1 -Dscala-2.11 -DskipTests -Pyarn -Phive
-Phive-thriftserver clean package
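
A quick way to confirm the job is actually running the Scala version the
assembly was built for (just a sanity check I find useful, not from any
Spark doc):

// Run in spark-shell or from driver code; for this build it should
// print "version 2.11.x" to match the scala-2.11 assembly.
println(scala.util.Properties.versionString)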

Regards,

Bryan Jeffrey

On Fri, Nov 20, 2015 at 2:13 PM, Bryan Jeffrey 
wrote:

> Hello.
>
> I'm seeing an error creating a Hive Context moving from Spark 1.4.1 to
> 1.5.2.  Has anyone seen this issue?
>
> [remainder of the original message snipped; the full stack trace is in
> the original message above]


Re: Hive error after update from 1.4.1 to 1.5.2

2015-11-20 Thread Bryan Jeffrey
Nevermind. I had a library dependency that still had the old Spark version.
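
For anyone else: a dependency graph dump (mvn dependency:tree, or the
sbt equivalent) makes a stale spark-*_2.10 or 1.4.1 artifact easy to
spot. Keeping the Spark version in one variable also prevents that kind
of drift; an illustrative sbt sketch (the module list is an example, not
our exact build):

val sparkVersion = "1.5.2"

libraryDependencies ++= Seq(
  // All Spark modules resolve to the same version and Scala suffix
  "org.apache.spark" %% "spark-core"      % sparkVersion % "provided",
  "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-hive"      % sparkVersion % "provided"
)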

On Fri, Nov 20, 2015 at 2:14 PM, Bryan Jeffrey 
wrote:

> The 1.5.2 Spark was compiled using the following options:  mvn
> -Dhadoop.version=2.6.1 -Dscala-2.11 -DskipTests -Pyarn -Phive
> -Phive-thriftserver clean package
>
> Regards,
>
> Bryan Jeffrey
>
> On Fri, Nov 20, 2015 at 2:13 PM, Bryan Jeffrey 
> wrote:
>
>> [original message and full stack trace snipped; see above]
>
>