Re: Unable to run hive queries inside spark

2015-02-27 Thread sandeep vura
Hi Kundan,

Sorry, I am also facing a similar issue today. How did you resolve this issue?

Regards,
Sandeep.v


Re: Unable to run hive queries inside spark

2015-02-25 Thread Michael Armbrust
It looks like that is getting interpreted as a local path. Are you missing
a core-site.xml file to configure HDFS?
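
For reference, a minimal core-site.xml along these lines is what makes an unqualified path such as /user/hive/warehouse resolve against HDFS rather than the local filesystem. The NameNode host and port below are placeholders and must match your cluster, and the file needs to be visible to Spark (e.g. next to hive-site.xml in spark/conf, or via HADOOP_CONF_DIR):

  <configuration>
    <property>
      <name>fs.defaultFS</name>
      <value>hdfs://namenode.example.com:8020</value>
    </property>
  </configuration>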


Re: Unable to run hive queries inside spark

2015-02-24 Thread kundan kumar
Hi Denny,

Yes, the user has all the rights to HDFS. I am running all the Spark
operations with this user.

And my hive-site.xml looks like this:

  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/user/hive/warehouse</value>
    <description>location of default database for the warehouse</description>
  </property>

Do I need to do anything explicitly other than placing hive-site.xml in
the spark/conf directory?

Thanks !!
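
A quick way to check how that warehouse path is actually being resolved is from the spark-shell itself. This is only a rough sketch using the plain Hadoop FileSystem API (it assumes the shell's sc is available; nothing here is Hive-specific):

  import org.apache.hadoop.fs.{FileSystem, Path}

  // The filesystem Spark resolves unqualified paths against. If core-site.xml
  // is not on the classpath this prints file:///, i.e. the local filesystem,
  // which matches the "not a directory" error on file:/user/hive/warehouse/src.
  val fs = FileSystem.get(sc.hadoopConfiguration)
  println(fs.getUri)

  // Whether the warehouse directory already exists on that filesystem.
  println(fs.exists(new Path("/user/hive/warehouse")))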




Re: Unable to run hive queries inside spark

2015-02-24 Thread Denny Lee
The error message you have is:

FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask.
MetaException(message:file:/user/hive/warehouse/src is not a directory or
unable to create one)

Could you verify that you (the user you are running under) has the rights
to create the necessary folders within HDFS?
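
If it helps, here is a hedged spark-shell sketch for that check, again using only the Hadoop FileSystem API (the warehouse path comes from the hive-site.xml shown earlier in the thread; the probe directory name is arbitrary):

  import org.apache.hadoop.fs.{FileSystem, Path}
  import scala.util.Try

  val fs = FileSystem.get(sc.hadoopConfiguration)
  val warehouse = new Path("/user/hive/warehouse")

  // Owner, group and permission bits of the warehouse directory, if it exists.
  if (fs.exists(warehouse)) {
    val st = fs.getFileStatus(warehouse)
    println(s"owner=${st.getOwner} group=${st.getGroup} perms=${st.getPermission}")
  }

  // Write test: a Failure(AccessControlException ...) here means the current
  // user cannot create directories under the warehouse path.
  val probe = new Path(warehouse, "permission_probe")
  println(Try(fs.mkdirs(probe)))
  Try(fs.delete(probe, true)) // clean up the probe if it was created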



Re: Unable to run hive queries inside spark

2015-02-24 Thread Denny Lee
That's all you should need to do. That said, I did run into a similar
issue when I was switching Spark versions that were tied to different
default Hive versions (e.g. Spark 1.3 by default works with Hive 0.13.1).
I'm wondering if you may be hitting this issue because of that?
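
If the version-mismatch theory is worth checking, one small sketch (assuming a Spark shell recent enough to expose sc.version) is simply:

  // Which Spark build this shell is running; the metastore warning in the log
  // above ("recording the schema version 0.13.1aa") hints at the bundled Hive.
  println(sc.version)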

Unable to run hive queries inside spark

2015-02-24 Thread kundan kumar
Hi,

I have placed my hive-site.xml inside spark/conf and I am trying to execute
some Hive queries given in the documentation.

Can you please suggest what I am doing wrong here?



scala> val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
hiveContext: org.apache.spark.sql.hive.HiveContext =
org.apache.spark.sql.hive.HiveContext@3340a4b8

scala> hiveContext.hql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
warning: there were 1 deprecation warning(s); re-run with -deprecation for
details
15/02/25 10:30:59 INFO ParseDriver: Parsing command: CREATE TABLE IF NOT
EXISTS src (key INT, value STRING)
15/02/25 10:30:59 INFO ParseDriver: Parse Completed
15/02/25 10:30:59 INFO HiveMetaStore: 0: Opening raw store with
implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
15/02/25 10:30:59 INFO ObjectStore: ObjectStore, initialize called
15/02/25 10:30:59 INFO Persistence: Property datanucleus.cache.level2
unknown - will be ignored
15/02/25 10:30:59 INFO Persistence: Property
hive.metastore.integral.jdo.pushdown unknown - will be ignored
15/02/25 10:30:59 WARN Connection: BoneCP specified but not present in
CLASSPATH (or one of dependencies)
15/02/25 10:30:59 WARN Connection: BoneCP specified but not present in
CLASSPATH (or one of dependencies)
15/02/25 10:31:08 INFO ObjectStore: Setting MetaStore object pin classes
with
hive.metastore.cache.pinobjtypes=Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order
15/02/25 10:31:08 INFO MetaStoreDirectSql: MySQL check failed, assuming we
are not on mysql: Lexical error at line 1, column 5.  Encountered: @
(64), after : .
15/02/25 10:31:09 INFO Datastore: The class
org.apache.hadoop.hive.metastore.model.MFieldSchema is tagged as
embedded-only so does not have its own datastore table.
15/02/25 10:31:09 INFO Datastore: The class
org.apache.hadoop.hive.metastore.model.MOrder is tagged as
embedded-only so does not have its own datastore table.
15/02/25 10:31:15 INFO Datastore: The class
org.apache.hadoop.hive.metastore.model.MFieldSchema is tagged as
embedded-only so does not have its own datastore table.
15/02/25 10:31:15 INFO Datastore: The class
org.apache.hadoop.hive.metastore.model.MOrder is tagged as
embedded-only so does not have its own datastore table.
15/02/25 10:31:17 INFO ObjectStore: Initialized ObjectStore
15/02/25 10:31:17 WARN ObjectStore: Version information not found in
metastore. hive.metastore.schema.verification is not enabled so recording
the schema version 0.13.1aa
15/02/25 10:31:18 INFO HiveMetaStore: Added admin role in metastore
15/02/25 10:31:18 INFO HiveMetaStore: Added public role in metastore
15/02/25 10:31:18 INFO HiveMetaStore: No user is added in admin role, since
config is empty
15/02/25 10:31:18 INFO SessionState: No Tez session required at this point.
hive.execution.engine=mr.
15/02/25 10:31:18 INFO PerfLogger: PERFLOG method=Driver.run
from=org.apache.hadoop.hive.ql.Driver
15/02/25 10:31:18 INFO PerfLogger: PERFLOG method=TimeToSubmit
from=org.apache.hadoop.hive.ql.Driver
15/02/25 10:31:18 INFO Driver: Concurrency mode is disabled, not creating a
lock manager
15/02/25 10:31:18 INFO PerfLogger: PERFLOG method=compile
from=org.apache.hadoop.hive.ql.Driver
15/02/25 10:31:18 INFO PerfLogger: PERFLOG method=parse
from=org.apache.hadoop.hive.ql.Driver
15/02/25 10:31:18 INFO ParseDriver: Parsing command: CREATE TABLE IF NOT
EXISTS src (key INT, value STRING)
15/02/25 10:31:18 INFO ParseDriver: Parse Completed
15/02/25 10:31:18 INFO PerfLogger: /PERFLOG method=parse
start=1424840478985 end=1424840478986 duration=1
from=org.apache.hadoop.hive.ql.Driver
15/02/25 10:31:18 INFO PerfLogger: PERFLOG method=semanticAnalyze
from=org.apache.hadoop.hive.ql.Driver
15/02/25 10:31:19 INFO SemanticAnalyzer: Starting Semantic Analysis
15/02/25 10:31:19 INFO SemanticAnalyzer: Creating table src position=27
15/02/25 10:31:19 INFO HiveMetaStore: 0: get_table : db=default tbl=src
15/02/25 10:31:19 INFO audit: ugi=spuser ip=unknown-ip-addr cmd=get_table :
db=default tbl=src
15/02/25 10:31:19 INFO HiveMetaStore: 0: get_database: default
15/02/25 10:31:19 INFO audit: ugi=spuser ip=unknown-ip-addr cmd=get_database:
default
15/02/25 10:31:19 INFO Driver: Semantic Analysis Completed
15/02/25 10:31:19 INFO PerfLogger: /PERFLOG method=semanticAnalyze
start=1424840478986 end=1424840479063 duration=77
from=org.apache.hadoop.hive.ql.Driver
15/02/25 10:31:19 INFO Driver: Returning Hive schema:
Schema(fieldSchemas:null, properties:null)
15/02/25 10:31:19 INFO PerfLogger: /PERFLOG method=compile
start=1424840478970 end=1424840479069 duration=99
from=org.apache.hadoop.hive.ql.Driver
15/02/25 10:31:19 INFO PerfLogger: PERFLOG method=Driver.execute
from=org.apache.hadoop.hive.ql.Driver
15/02/25 10:31:19 INFO Driver: Starting command: CREATE TABLE IF NOT EXISTS
src (key INT, value STRING)
15/02/25 10:31:19 INFO PerfLogger: /PERFLOG method=TimeToSubmit
start=1424840478968 end=1424840479072 duration=104