I was actually just able to reproduce the issue. I do wonder if this is a
bug -- the docs say "When not configured by the hive-site.xml, the context
automatically creates metastore_db and warehouse in the current directory."
But as you can see from the message, the warehouse is not in the current
directory; it is under /user/hive. In my case this directory was owned by
'root' and no one else had write permissions. Changing the permissions works
if you need to get unblocked quickly, but it does seem like a bug to me.
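
For reference, this is roughly what unblocked me -- a sketch only, since
the path and owner come from my setup (per the file:/ scheme in the error,
the warehouse here is a plain local directory), so adjust to yours:

    # /user/hive was a local directory owned by root, with no write
    # permission for anyone else; <your-user> is a placeholder
    sudo chown -R <your-user> /user/hive
    # or, more bluntly, open it up for everyone:
    sudo chmod -R a+w /user/hive

Alternatively, pointing hive.metastore.warehouse.dir at a directory you
own should also work, though that defeats the purpose of the zero-config
default.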


On Fri, Feb 27, 2015 at 11:21 AM, sandeep vura <sandeepv...@gmail.com>
wrote:

> Hi yana,
>
> I have removed hive-site.xml from the spark/conf directory but am still
> getting the same errors. Is there any other way to work around this?
>
> Regards,
> Sandeep
>
> On Fri, Feb 27, 2015 at 9:38 PM, Yana Kadiyska <yana.kadiy...@gmail.com>
> wrote:
>
>> I think you're mixing two things: the docs say "When *not* configured by
>> the hive-site.xml, the context automatically creates metastore_db and
>> warehouse in the current directory." AFAIK if you want a local
>> metastore, you don't put hive-site.xml anywhere. You only need the file if
>> you're going to point to an external metastore. If you're pointing to an
>> external metastore, in my experience I've also had to copy core-site.xml
>> into conf in order to specify this property: <name>fs.defaultFS</name>
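>>
>> For reference, my core-site.xml under spark/conf looked roughly like the
>> sketch below -- the host and port are placeholders, so substitute your
>> own namenode:
>>
>>     <configuration>
>>       <property>
>>         <!-- placeholder URI; point this at your cluster's namenode -->
>>         <name>fs.defaultFS</name>
>>         <value>hdfs://namenode-host:8020</value>
>>       </property>
>>     </configuration>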
>>
>> On Fri, Feb 27, 2015 at 10:39 AM, sandeep vura <sandeepv...@gmail.com>
>> wrote:
>>
>>> Hi Sparkers,
>>>
>>> I am using Hive 0.13, copied hive-site.xml into spark/conf, and am using
>>> the default Derby local metastore.
>>>
>>> While creating a table in the Spark shell I get the following error. Can
>>> anyone please take a look and suggest a solution?
>>>
>>> sqlContext.sql("CREATE TABLE IF NOT EXISTS sandeep (key INT, value
>>> STRING)")
>>> 15/02/27 23:06:13 ERROR RetryingHMSHandler:
>>> MetaException(message:file:/user/hive/warehouse_1/sandeep is not a
>>> directory or unable to create one)
>>>     at
>>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1239)
>>>     at
>>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1294)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>     at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>     at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>     at java.lang.reflect.Method.invoke(Method.java:622)
>>>     at
>>> org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
>>>     at
>>> com.sun.proxy.$Proxy12.create_table_with_environment_context(Unknown Source)
>>>     at
>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:558)
>>>     at
>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:547)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>     at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>     at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>     at java.lang.reflect.Method.invoke(Method.java:622)
>>>     at
>>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
>>>     at com.sun.proxy.$Proxy13.createTable(Unknown Source)
>>>     at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:613)
>>>     at
>>> org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4189)
>>>     at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:281)
>>>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
>>>     at
>>> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
>>>     at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1503)
>>>     at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1270)
>>>     at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1088)
>>>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
>>>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:901)
>>>     at
>>> org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:305)
>>>     at
>>> org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:276)
>>>     at
>>> org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
>>>     at
>>> org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
>>>     at
>>> org.apache.spark.sql.execution.Command$class.execute(commands.scala:46)
>>>     at
>>> org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:30)
>>>     at
>>> org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:425)
>>>     at
>>> org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:425)
>>>     at
>>> org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
>>>     at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:108)
>>>     at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:94)
>>>     at $line9.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:15)
>>>     at $line9.$read$$iwC$$iwC$$iwC.<init>(<console>:20)
>>>     at $line9.$read$$iwC$$iwC.<init>(<console>:22)
>>>     at $line9.$read$$iwC.<init>(<console>:24)
>>>     at $line9.$read.<init>(<console>:26)
>>>     at $line9.$read$.<init>(<console>:30)
>>>     at $line9.$read$.<clinit>(<console>)
>>>     at $line9.$eval$.<init>(<console>:7)
>>>     at $line9.$eval$.<clinit>(<console>)
>>>     at $line9.$eval.$print(<console>)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>     at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>     at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>     at java.lang.reflect.Method.invoke(Method.java:622)
>>>     at
>>> org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)
>>>     at
>>> org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)
>>>     at
>>> org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)
>>>     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)
>>>     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)
>>>     at
>>> org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:828)
>>>     at
>>> org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:873)
>>>     at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:785)
>>>     at
>>> org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:628)
>>>     at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:636)
>>>     at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:641)
>>>     at
>>> org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:968)
>>>     at
>>> org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
>>>     at
>>> org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
>>>     at
>>> scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>>>     at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:916)
>>>     at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1011)
>>>     at org.apache.spark.repl.Main$.main(Main.scala:31)
>>>     at org.apache.spark.repl.Main.main(Main.scala)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>     at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>     at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>     at java.lang.reflect.Method.invoke(Method.java:622)
>>>     at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
>>>     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>>>     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>>
>>> 15/02/27 23:06:13 ERROR DDLTask:
>>> org.apache.hadoop.hive.ql.metadata.HiveException:
>>> MetaException(message:file:/user/hive/warehouse_1/sandeep is not a
>>> directory or unable to create one)
>>>     at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:619)
>>>     at
>>> org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4189)
>>>     at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:281)
>>>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
>>>     at
>>> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
>>>     at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1503)
>>>     at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1270)
>>>     at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1088)
>>>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
>>>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:901)
>>>     at
>>> org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:305)
>>>     at
>>> org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:276)
>>>     at
>>> org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
>>>     at
>>> org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
>>>     at
>>> org.apache.spark.sql.execution.Command$class.execute(commands.scala:46)
>>>     at
>>> org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:30)
>>>     at
>>> org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:425)
>>>     at
>>> org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:425)
>>>     at
>>> org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
>>>     at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:108)
>>>     at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:94)
>>>     at $line9.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:15)
>>>     at $line9.$read$$iwC$$iwC$$iwC.<init>(<console>:20)
>>>     at $line9.$read$$iwC$$iwC.<init>(<console>:22)
>>>     at $line9.$read$$iwC.<init>(<console>:24)
>>>     at $line9.$read.<init>(<console>:26)
>>>     at $line9.$read$.<init>(<console>:30)
>>>     at $line9.$read$.<clinit>(<console>)
>>>     at $line9.$eval$.<init>(<console>:7)
>>>     at $line9.$eval$.<clinit>(<console>)
>>>     at $line9.$eval.$print(<console>)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>     at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>     at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>     at java.lang.reflect.Method.invoke(Method.java:622)
>>>     at
>>> org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)
>>>     at
>>> org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)
>>>     at
>>> org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)
>>>     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)
>>>     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)
>>>     at
>>> org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:828)
>>>     at
>>> org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:873)
>>>     at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:785)
>>>     at
>>> org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:628)
>>>     at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:636)
>>>     at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:641)
>>>     at
>>> org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:968)
>>>     at
>>> org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
>>>     at
>>> org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
>>>     at
>>> scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>>>     at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:916)
>>>     at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1011)
>>>     at org.apache.spark.repl.Main$.main(Main.scala:31)
>>>     at org.apache.spark.repl.Main.main(Main.scala)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>     at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>     at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>     at java.lang.reflect.Method.invoke(Method.java:622)
>>>     at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
>>>     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>>>     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>> Caused by: MetaException(message:file:/user/hive/warehouse_1/sandeep is
>>> not a directory or unable to create one)
>>>     at
>>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1239)
>>>     at
>>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1294)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>     at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>     at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>     at java.lang.reflect.Method.invoke(Method.java:622)
>>>     at
>>> org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
>>>     at
>>> com.sun.proxy.$Proxy12.create_table_with_environment_context(Unknown Source)
>>>     at
>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:558)
>>>     at
>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:547)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>     at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>     at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>     at java.lang.reflect.Method.invoke(Method.java:622)
>>>     at
>>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
>>>     at com.sun.proxy.$Proxy13.createTable(Unknown Source)
>>>     at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:613)
>>>     ... 60 more
>>>
>>> 15/02/27 23:06:13 ERROR Driver: FAILED: Execution Error, return code 1
>>> from org.apache.hadoop.hive.ql.exec.DDLTask.
>>> MetaException(message:file:/user/hive/warehouse_1/sandeep is not a
>>> directory or unable to create one)
>>> 15/02/27 23:06:13 ERROR HiveContext:
>>> ======================
>>> HIVE FAILURE OUTPUT
>>> ======================
>>> FAILED: Execution Error, return code 1 from
>>> org.apache.hadoop.hive.ql.exec.DDLTask.
>>> MetaException(message:file:/user/hive/warehouse_1/sandeep is not a
>>> directory or unable to create one)
>>>
>>> ======================
>>> END HIVE FAILURE OUTPUT
>>> ======================
>>>
>>> org.apache.spark.sql.execution.QueryExecutionException: FAILED:
>>> Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask.
>>> MetaException(message:file:/user/hive/warehouse_1/sandeep is not a
>>> directory or unable to create one)
>>>     at
>>> org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:309)
>>>     at
>>> org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:276)
>>>     at
>>> org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
>>>     at
>>> org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
>>>     at
>>> org.apache.spark.sql.execution.Command$class.execute(commands.scala:46)
>>>     at
>>> org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:30)
>>>     at
>>> org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:425)
>>>     at
>>> org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:425)
>>>     at
>>> org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
>>>     at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:108)
>>>     at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:94)
>>>     at $iwC$$iwC$$iwC$$iwC.<init>(<console>:15)
>>>     at $iwC$$iwC$$iwC.<init>(<console>:20)
>>>     at $iwC$$iwC.<init>(<console>:22)
>>>     at $iwC.<init>(<console>:24)
>>>     at <init>(<console>:26)
>>>     at .<init>(<console>:30)
>>>     at .<clinit>(<console>)
>>>     at .<init>(<console>:7)
>>>     at .<clinit>(<console>)
>>>     at $print(<console>)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>     at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>     at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>     at java.lang.reflect.Method.invoke(Method.java:622)
>>>     at
>>> org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)
>>>     at
>>> org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)
>>>     at
>>> org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)
>>>     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)
>>>     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)
>>>     at
>>> org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:828)
>>>     at
>>> org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:873)
>>>     at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:785)
>>>     at
>>> org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:628)
>>>     at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:636)
>>>     at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:641)
>>>     at
>>> org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:968)
>>>     at
>>> org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
>>>     at
>>> org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
>>>     at
>>> scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>>>     at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:916)
>>>     at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1011)
>>>     at org.apache.spark.repl.Main$.main(Main.scala:31)
>>>     at org.apache.spark.repl.Main.main(Main.scala)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>     at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>     at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>     at java.lang.reflect.Method.invoke(Method.java:622)
>>>     at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
>>>     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>>>     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>>
>>>
>>> Regards,
>>> Sandeep.v
>>>
>>
>>
>
