It looks like the warehouse path is being interpreted as a local path (note
the "file:/user/hive/warehouse/src" in the error). Are you missing a
core-site.xml file to configure HDFS?
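
For example, a minimal core-site.xml in your Hadoop conf directory (the
NameNode host and port below are placeholders for your setup) would look
something like this:

  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://namenode-host:8020</value>
    <description>default filesystem URI, so unqualified paths resolve
against HDFS rather than the local filesystem</description>
  </property>

With that in place, a relative warehouse dir like /user/hive/warehouse
should resolve against HDFS instead of the local filesystem.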
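
You can also sanity-check which filesystem Spark actually picked up from
the shell:

  scala> sc.hadoopConfiguration.get("fs.defaultFS")

If that prints file:/// (the Hadoop default), your core-site.xml is not
being picked up on the classpath.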
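
Alternatively (an untested sketch), you could fully qualify the warehouse
location in hive-site.xml so it cannot be mistaken for a local path:

  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>hdfs://namenode-host:8020/user/hive/warehouse</value>
  </property>

and then confirm the directory exists in HDFS and is writable by your
user, e.g.:

  hdfs dfs -mkdir -p /user/hive/warehouse
  hdfs dfs -ls /user/hive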

On Tue, Feb 24, 2015 at 10:40 PM, kundan kumar <iitr.kun...@gmail.com>
wrote:

> Hi Denny,
>
> Yes, the user has all the rights to HDFS. I am running all the Spark
> operations with this user.
>
> And my hive-site.xml looks like this:
>
>  <property>
>     <name>hive.metastore.warehouse.dir</name>
>     <value>/user/hive/warehouse</value>
>     <description>location of default database for the
> warehouse</description>
>   </property>
>
> Do I need to do anything explicitly other than placing hive-site.xml in
> the spark/conf directory?
>
> Thanks!!
>
>
>
> On Wed, Feb 25, 2015 at 11:42 AM, Denny Lee <denny.g....@gmail.com> wrote:
>
>> The error message you have is:
>>
>> FAILED: Execution Error, return code 1 from 
>> org.apache.hadoop.hive.ql.exec.DDLTask.
>> MetaException(message:file:/user/hive/warehouse/src is not a directory
>> or unable to create one)
>>
>> Could you verify that you (the user you are running under) have the rights
>> to create the necessary folders within HDFS?
>>
>>
>> On Tue, Feb 24, 2015 at 9:06 PM kundan kumar <iitr.kun...@gmail.com>
>> wrote:
>>
>>> Hi,
>>>
>>> I have placed my hive-site.xml inside spark/conf and I am trying to
>>> execute some Hive queries given in the documentation.
>>>
>>> Can you please suggest what I am doing wrong here?
>>>
>>>
>>>
>>> scala> val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
>>> hiveContext: org.apache.spark.sql.hive.HiveContext =
>>> org.apache.spark.sql.hive.HiveContext@3340a4b8
>>>
>>> scala> hiveContext.hql("CREATE TABLE IF NOT EXISTS src (key INT, value
>>> STRING)")
>>> warning: there were 1 deprecation warning(s); re-run with -deprecation
>>> for details
>>> 15/02/25 10:30:59 INFO ParseDriver: Parsing command: CREATE TABLE IF NOT
>>> EXISTS src (key INT, value STRING)
>>> 15/02/25 10:30:59 INFO ParseDriver: Parse Completed
>>> 15/02/25 10:30:59 INFO HiveMetaStore: 0: Opening raw store with
>>> implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
>>> 15/02/25 10:30:59 INFO ObjectStore: ObjectStore, initialize called
>>> 15/02/25 10:30:59 INFO Persistence: Property datanucleus.cache.level2
>>> unknown - will be ignored
>>> 15/02/25 10:30:59 INFO Persistence: Property
>>> hive.metastore.integral.jdo.pushdown unknown - will be ignored
>>> 15/02/25 10:30:59 WARN Connection: BoneCP specified but not present in
>>> CLASSPATH (or one of dependencies)
>>> 15/02/25 10:30:59 WARN Connection: BoneCP specified but not present in
>>> CLASSPATH (or one of dependencies)
>>> 15/02/25 10:31:08 INFO ObjectStore: Setting MetaStore object pin classes
>>> with
>>> hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
>>> 15/02/25 10:31:08 INFO MetaStoreDirectSql: MySQL check failed, assuming
>>> we are not on mysql: Lexical error at line 1, column 5.  Encountered: "@"
>>> (64), after : "".
>>> 15/02/25 10:31:09 INFO Datastore: The class
>>> "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as
>>> "embedded-only" so does not have its own datastore table.
>>> 15/02/25 10:31:09 INFO Datastore: The class
>>> "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as
>>> "embedded-only" so does not have its own datastore table.
>>> 15/02/25 10:31:15 INFO Datastore: The class
>>> "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as
>>> "embedded-only" so does not have its own datastore table.
>>> 15/02/25 10:31:15 INFO Datastore: The class
>>> "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as
>>> "embedded-only" so does not have its own datastore table.
>>> 15/02/25 10:31:17 INFO ObjectStore: Initialized ObjectStore
>>> 15/02/25 10:31:17 WARN ObjectStore: Version information not found in
>>> metastore. hive.metastore.schema.verification is not enabled so recording
>>> the schema version 0.13.1aa
>>> 15/02/25 10:31:18 INFO HiveMetaStore: Added admin role in metastore
>>> 15/02/25 10:31:18 INFO HiveMetaStore: Added public role in metastore
>>> 15/02/25 10:31:18 INFO HiveMetaStore: No user is added in admin role,
>>> since config is empty
>>> 15/02/25 10:31:18 INFO SessionState: No Tez session required at this
>>> point. hive.execution.engine=mr.
>>> 15/02/25 10:31:18 INFO PerfLogger: <PERFLOG method=Driver.run
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 15/02/25 10:31:18 INFO PerfLogger: <PERFLOG method=TimeToSubmit
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 15/02/25 10:31:18 INFO Driver: Concurrency mode is disabled, not
>>> creating a lock manager
>>> 15/02/25 10:31:18 INFO PerfLogger: <PERFLOG method=compile
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 15/02/25 10:31:18 INFO PerfLogger: <PERFLOG method=parse
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 15/02/25 10:31:18 INFO ParseDriver: Parsing command: CREATE TABLE IF NOT
>>> EXISTS src (key INT, value STRING)
>>> 15/02/25 10:31:18 INFO ParseDriver: Parse Completed
>>> 15/02/25 10:31:18 INFO PerfLogger: </PERFLOG method=parse
>>> start=1424840478985 end=1424840478986 duration=1
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 15/02/25 10:31:18 INFO PerfLogger: <PERFLOG method=semanticAnalyze
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 15/02/25 10:31:19 INFO SemanticAnalyzer: Starting Semantic Analysis
>>> 15/02/25 10:31:19 INFO SemanticAnalyzer: Creating table src position=27
>>> 15/02/25 10:31:19 INFO HiveMetaStore: 0: get_table : db=default tbl=src
>>> 15/02/25 10:31:19 INFO audit: ugi=spuser ip=unknown-ip-addr cmd=get_table
>>> : db=default tbl=src
>>> 15/02/25 10:31:19 INFO HiveMetaStore: 0: get_database: default
>>> 15/02/25 10:31:19 INFO audit: ugi=spuser ip=unknown-ip-addr 
>>> cmd=get_database:
>>> default
>>> 15/02/25 10:31:19 INFO Driver: Semantic Analysis Completed
>>> 15/02/25 10:31:19 INFO PerfLogger: </PERFLOG method=semanticAnalyze
>>> start=1424840478986 end=1424840479063 duration=77
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 15/02/25 10:31:19 INFO Driver: Returning Hive schema:
>>> Schema(fieldSchemas:null, properties:null)
>>> 15/02/25 10:31:19 INFO PerfLogger: </PERFLOG method=compile
>>> start=1424840478970 end=1424840479069 duration=99
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 15/02/25 10:31:19 INFO PerfLogger: <PERFLOG method=Driver.execute
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 15/02/25 10:31:19 INFO Driver: Starting command: CREATE TABLE IF NOT
>>> EXISTS src (key INT, value STRING)
>>> 15/02/25 10:31:19 INFO PerfLogger: </PERFLOG method=TimeToSubmit
>>> start=1424840478968 end=1424840479072 duration=104
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 15/02/25 10:31:19 INFO PerfLogger: <PERFLOG method=runTasks
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 15/02/25 10:31:19 INFO PerfLogger: <PERFLOG method=task.DDL.Stage-0
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 15/02/25 10:31:19 INFO DDLTask: Default to LazySimpleSerDe for table src
>>> 15/02/25 10:31:19 INFO HiveMetaStore: 0: create_table:
>>> Table(tableName:src, dbName:default, owner:spuser, createTime:1424840479,
>>> lastAccessTime:0, retention:0,
>>> sd:StorageDescriptor(cols:[FieldSchema(name:key, type:int, comment:null),
>>> FieldSchema(name:value, type:string, comment:null)], location:null,
>>> inputFormat:org.apache.hadoop.mapred.TextInputFormat,
>>> outputFormat:org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat,
>>> compressed:false, numBuckets:-1, serdeInfo:SerDeInfo(name:null,
>>> serializationLib:org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe,
>>> parameters:{serialization.format=1}), bucketCols:[], sortCols:[],
>>> parameters:{}, skewedInfo:SkewedInfo(skewedColNames:[], skewedColValues:[],
>>> skewedColValueLocationMaps:{}), storedAsSubDirectories:false),
>>> partitionKeys:[], parameters:{}, viewOriginalText:null,
>>> viewExpandedText:null, tableType:MANAGED_TABLE)
>>> 15/02/25 10:31:19 INFO audit: ugi=spuser ip=unknown-ip-addr 
>>> cmd=create_table:
>>> Table(tableName:src, dbName:default, owner:spuser, createTime:1424840479,
>>> lastAccessTime:0, retention:0,
>>> sd:StorageDescriptor(cols:[FieldSchema(name:key, type:int, comment:null),
>>> FieldSchema(name:value, type:string, comment:null)], location:null,
>>> inputFormat:org.apache.hadoop.mapred.TextInputFormat,
>>> outputFormat:org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat,
>>> compressed:false, numBuckets:-1, serdeInfo:SerDeInfo(name:null,
>>> serializationLib:org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe,
>>> parameters:{serialization.format=1}), bucketCols:[], sortCols:[],
>>> parameters:{}, skewedInfo:SkewedInfo(skewedColNames:[], skewedColValues:[],
>>> skewedColValueLocationMaps:{}), storedAsSubDirectories:false),
>>> partitionKeys:[], parameters:{}, viewOriginalText:null,
>>> viewExpandedText:null, tableType:MANAGED_TABLE)
>>> 15/02/25 10:31:19 ERROR RetryingHMSHandler:
>>> MetaException(message:file:/user/hive/warehouse/src is not a directory or
>>> unable to create one)
>>> at
>>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1239)
>>> at
>>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1294)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at
>>> org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
>>> at com.sun.proxy.$Proxy12.create_table_with_environment_context(Unknown
>>> Source)
>>> at
>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:558)
>>> at
>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:547)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at
>>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
>>> at com.sun.proxy.$Proxy13.createTable(Unknown Source)
>>> at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:613)
>>> at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4189)
>>> at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:281)
>>> at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
>>> at
>>> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
>>> at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1503)
>>> at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1270)
>>> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1088)
>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:901)
>>> at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:305)
>>> at
>>> org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:276)
>>> at
>>> org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
>>> at
>>> org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
>>> at
>>> org.apache.spark.sql.execution.Command$class.execute(commands.scala:46)
>>> at
>>> org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:30)
>>> at
>>> org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:425)
>>> at
>>> org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:425)
>>> at
>>> org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
>>> at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:108)
>>> at org.apache.spark.sql.hive.HiveContext.hiveql(HiveContext.scala:102)
>>> at org.apache.spark.sql.hive.HiveContext.hql(HiveContext.scala:106)
>>> at $line9.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:15)
>>> at $line9.$read$$iwC$$iwC$$iwC.<init>(<console>:20)
>>> at $line9.$read$$iwC$$iwC.<init>(<console>:22)
>>> at $line9.$read$$iwC.<init>(<console>:24)
>>> at $line9.$read.<init>(<console>:26)
>>> at $line9.$read$.<init>(<console>:30)
>>> at $line9.$read$.<clinit>(<console>)
>>> at $line9.$eval$.<init>(<console>:7)
>>> at $line9.$eval$.<clinit>(<console>)
>>> at $line9.$eval.$print(<console>)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at
>>> org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)
>>> at
>>> org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)
>>> at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)
>>> at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)
>>> at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)
>>> at
>>> org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:828)
>>> at
>>> org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:873)
>>> at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:785)
>>> at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:628)
>>> at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:636)
>>> at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:641)
>>> at
>>> org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:968)
>>> at
>>> org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
>>> at
>>> org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
>>> at
>>> scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>>> at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:916)
>>> at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1011)
>>> at org.apache.spark.repl.Main$.main(Main.scala:31)
>>> at org.apache.spark.repl.Main.main(Main.scala)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
>>> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>>> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>>
>>> 15/02/25 10:31:19 ERROR DDLTask:
>>> org.apache.hadoop.hive.ql.metadata.HiveException:
>>> MetaException(message:file:/user/hive/warehouse/src is not a directory or
>>> unable to create one)
>>> at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:619)
>>> at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4189)
>>> at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:281)
>>> at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
>>> at
>>> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
>>> at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1503)
>>> at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1270)
>>> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1088)
>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:901)
>>> at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:305)
>>> at
>>> org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:276)
>>> at
>>> org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
>>> at
>>> org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
>>> at
>>> org.apache.spark.sql.execution.Command$class.execute(commands.scala:46)
>>> at
>>> org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:30)
>>> at
>>> org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:425)
>>> at
>>> org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:425)
>>> at
>>> org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
>>> at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:108)
>>> at org.apache.spark.sql.hive.HiveContext.hiveql(HiveContext.scala:102)
>>> at org.apache.spark.sql.hive.HiveContext.hql(HiveContext.scala:106)
>>> at $line9.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:15)
>>> at $line9.$read$$iwC$$iwC$$iwC.<init>(<console>:20)
>>> at $line9.$read$$iwC$$iwC.<init>(<console>:22)
>>> at $line9.$read$$iwC.<init>(<console>:24)
>>> at $line9.$read.<init>(<console>:26)
>>> at $line9.$read$.<init>(<console>:30)
>>> at $line9.$read$.<clinit>(<console>)
>>> at $line9.$eval$.<init>(<console>:7)
>>> at $line9.$eval$.<clinit>(<console>)
>>> at $line9.$eval.$print(<console>)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at
>>> org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)
>>> at
>>> org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)
>>> at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)
>>> at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)
>>> at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)
>>> at
>>> org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:828)
>>> at
>>> org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:873)
>>> at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:785)
>>> at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:628)
>>> at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:636)
>>> at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:641)
>>> at
>>> org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:968)
>>> at
>>> org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
>>> at
>>> org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
>>> at
>>> scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>>> at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:916)
>>> at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1011)
>>> at org.apache.spark.repl.Main$.main(Main.scala:31)
>>> at org.apache.spark.repl.Main.main(Main.scala)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
>>> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>>> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>> Caused by: MetaException(message:file:/user/hive/warehouse/src is not a
>>> directory or unable to create one)
>>> at
>>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1239)
>>> at
>>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1294)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at
>>> org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
>>> at com.sun.proxy.$Proxy12.create_table_with_environment_context(Unknown
>>> Source)
>>> at
>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:558)
>>> at
>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:547)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at
>>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
>>> at com.sun.proxy.$Proxy13.createTable(Unknown Source)
>>> at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:613)
>>> ... 61 more
>>>
>>> 15/02/25 10:31:19 ERROR Driver: FAILED: Execution Error, return code 1
>>> from org.apache.hadoop.hive.ql.exec.DDLTask.
>>> MetaException(message:file:/user/hive/warehouse/src is not a directory or
>>> unable to create one)
>>> 15/02/25 10:31:19 INFO PerfLogger: </PERFLOG method=Driver.execute
>>> start=1424840479069 end=1424840479170 duration=101
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 15/02/25 10:31:19 INFO PerfLogger: <PERFLOG method=releaseLocks
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 15/02/25 10:31:19 INFO PerfLogger: </PERFLOG method=releaseLocks
>>> start=1424840479170 end=1424840479170 duration=0
>>> from=org.apache.hadoop.hive.ql.Driver>
>>> 15/02/25 10:31:19 ERROR HiveContext:
>>> ======================
>>> HIVE FAILURE OUTPUT
>>> ======================
>>> FAILED: Execution Error, return code 1 from
>>> org.apache.hadoop.hive.ql.exec.DDLTask.
>>> MetaException(message:file:/user/hive/warehouse/src is not a directory or
>>> unable to create one)
>>>
>>> ======================
>>> END HIVE FAILURE OUTPUT
>>> ======================
>>>
>>> org.apache.spark.sql.execution.QueryExecutionException: FAILED:
>>> Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask.
>>> MetaException(message:file:/user/hive/warehouse/src is not a directory or
>>> unable to create one)
>>> at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:309)
>>> at
>>> org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:276)
>>> at
>>> org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
>>> at
>>> org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
>>> at
>>> org.apache.spark.sql.execution.Command$class.execute(commands.scala:46)
>>> at
>>> org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:30)
>>> at
>>> org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:425)
>>> at
>>> org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:425)
>>> at
>>> org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
>>> at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:108)
>>> at org.apache.spark.sql.hive.HiveContext.hiveql(HiveContext.scala:102)
>>> at org.apache.spark.sql.hive.HiveContext.hql(HiveContext.scala:106)
>>> at $iwC$$iwC$$iwC$$iwC.<init>(<console>:15)
>>> at $iwC$$iwC$$iwC.<init>(<console>:20)
>>> at $iwC$$iwC.<init>(<console>:22)
>>> at $iwC.<init>(<console>:24)
>>> at <init>(<console>:26)
>>> at .<init>(<console>:30)
>>> at .<clinit>(<console>)
>>> at .<init>(<console>:7)
>>> at .<clinit>(<console>)
>>> at $print(<console>)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at
>>> org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)
>>> at
>>> org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)
>>> at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)
>>> at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)
>>> at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)
>>> at
>>> org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:828)
>>> at
>>> org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:873)
>>> at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:785)
>>> at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:628)
>>> at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:636)
>>> at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:641)
>>> at
>>> org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:968)
>>> at
>>> org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
>>> at
>>> org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
>>> at
>>> scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>>> at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:916)
>>> at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1011)
>>> at org.apache.spark.repl.Main$.main(Main.scala:31)
>>> at org.apache.spark.repl.Main.main(Main.scala)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
>>> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>>> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>>
>>>
>>>
>>>
>>> Regards,
>>> Kundan
>>>
>>
>