Hive Web Interface Database connectivity issue

2013-09-11 Thread Biswajit Nayak
Hi All,

I was trying to start the Hive web server. The web server came up, but the
database connection is not being established and it throws the errors below. Any
help would be appreciated.


  File: NucleusJDOHelper.java Line:425 method:
getJDOExceptionForNucleusException
  class: org.datanucleus.jdo.NucleusJDOHelper

  File: JDOPersistenceManagerFactory.java Line:601 method:
freezeConfiguration
  class: org.datanucleus.jdo.JDOPersistenceManagerFactory

  File: JDOPersistenceManagerFactory.java Line:286 method:
createPersistenceManagerFactory
  class: org.datanucleus.jdo.JDOPersistenceManagerFactory

  File: JDOPersistenceManagerFactory.java Line:182 method:
getPersistenceManagerFactory
  class: org.datanucleus.jdo.JDOPersistenceManagerFactory

  File: NativeMethodAccessorImpl.java Line:-2 method: invoke0
  class: sun.reflect.NativeMethodAccessorImpl

  File: NativeMethodAccessorImpl.java Line:39 method: invoke
  class: sun.reflect.NativeMethodAccessorImpl

  File: DelegatingMethodAccessorImpl.java Line:25 method: invoke
  class: sun.reflect.DelegatingMethodAccessorImpl

  File: Method.java Line:597 method: invoke
  class: java.lang.reflect.Method

  File: JDOHelper.java Line:1958 method: run
  class: javax.jdo.JDOHelper$16

  File: AccessController.java Line:-2 method: doPrivileged
  class: java.security.AccessController

  File: JDOHelper.java Line:1953 method: invoke
  class: javax.jdo.JDOHelper

  File: JDOHelper.java Line:1159 method:
invokeGetPersistenceManagerFactoryOnImplementation
  class: javax.jdo.JDOHelper

  File: JDOHelper.java Line:803 method: getPersistenceManagerFactory
  class: javax.jdo.JDOHelper

  File: JDOHelper.java Line:698 method: getPersistenceManagerFactory
  class: javax.jdo.JDOHelper

  File: ObjectStore.java Line:263 method: getPMF
  class: org.apache.hadoop.hive.metastore.ObjectStore

  File: ObjectStore.java Line:292 method: getPersistenceManager
  class: org.apache.hadoop.hive.metastore.ObjectStore

  File: ObjectStore.java Line:225 method: initialize
  class: org.apache.hadoop.hive.metastore.ObjectStore

  File: ObjectStore.java Line:200 method: setConf
  class: org.apache.hadoop.hive.metastore.ObjectStore

  File: ReflectionUtils.java Line:62 method: setConf
  class: org.apache.hadoop.util.ReflectionUtils

  File: ReflectionUtils.java Line:117 method: newInstance
  class: org.apache.hadoop.util.ReflectionUtils

  File: RetryingRawStore.java Line:62 method:
  class: org.apache.hadoop.hive.metastore.RetryingRawStore

  File: RetryingRawStore.java Line:71 method: getProxy
  class: org.apache.hadoop.hive.metastore.RetryingRawStore

  File: HiveMetaStore.java Line:414 method: newRawStore
  class: org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler

  File: HiveMetaStore.java Line:402 method: getMS
  class: org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler

  File: HiveMetaStore.java Line:440 method: createDefaultDB
  class: org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler

  File: HiveMetaStore.java Line:326 method: init
  class: org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler

  File: HiveMetaStore.java Line:286 method:
  class: org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler

  File: RetryingHMSHandler.java Line:54 method:
  class: org.apache.hadoop.hive.metastore.RetryingHMSHandler

  File: RetryingHMSHandler.java Line:59 method: getProxy
  class: org.apache.hadoop.hive.metastore.RetryingHMSHandler

  File: HiveMetaStore.java Line:4183 method: newHMSHandler
  class: org.apache.hadoop.hive.metastore.HiveMetaStore

  File: HiveMetaStoreClient.java Line:121 method:
  class: org.apache.hadoop.hive.metastore.HiveMetaStoreClient

  File: HiveMetaStoreClient.java Line:104 method:
  class: org.apache.hadoop.hive.metastore.HiveMetaStoreClient

  File: org.apache.jsp.show_005fdatabases_jsp Line:54 method:
_jspService
  class: org.apache.jsp.show_005fdatabases_jsp

  File: HttpJspBase.java Line:97 method: service
  class: org.apache.jasper.runtime.HttpJspBase

  File: HttpServlet.java Line:820 method: service
  class: javax.servlet.http.HttpServlet

  File: JspServletWrapper.java Line:322 method: service
  class: org.apache.jasper.servlet.JspServletWrapper

  File: JspServlet.java Line:314 method: serviceJspFile
  class: org.apache.jasper.servlet.JspServlet

  File: JspServlet.java Line:264 method: service
  class: org.apache.jasper.servlet.JspServlet

 

Hive + mongoDB

2013-09-11 Thread Sandeep Nemuri
Hi everyone,
   I am trying to import data from MongoDB to Hive. I have the JAR files
needed to connect Mongo and Hive. How do I import the data from MongoDB into
Hive?

Thanks in advance.

-- 
--Regards
  Sandeep Nemuri


Re: Hive + mongoDB

2013-09-11 Thread Russell Jurney
The docs are at https://github.com/mongodb/mongo-hadoop/tree/master/hive

You need to build mongo-hadoop, and then use the documented syntax to
create BSON tables in Hive.
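
For reference, a minimal sketch of what such a table definition looks like,
based on the mongo-hadoop Hive documentation linked above. The table name,
columns, and HDFS location are hypothetical, and the SerDe/format class names
should be verified against the mongo-hadoop version you build:

  -- hypothetical table over BSON dumps already sitting in HDFS
  CREATE EXTERNAL TABLE mongo_users (id STRING, name STRING, age INT)
  ROW FORMAT SERDE 'com.mongodb.hadoop.hive.BSONSerDe'
  STORED AS
    INPUTFORMAT 'com.mongodb.hadoop.mapred.BSONFileInputFormat'
    OUTPUTFORMAT 'com.mongodb.hadoop.hive.output.HiveBSONFileOutputFormat'
  LOCATION '/user/hive/bson/users/';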


On Wed, Sep 11, 2013 at 11:11 AM, Jitendra Yadav jeetuyadav200...@gmail.com
 wrote:

 Hi,

 1. You may use the Hadoop-MongoDB connector and create a MapReduce program
 to process your data from MongoDB into Hive.

 https://github.com/mongodb/mongo-hadoop


 2. Alternatively, you can use the Pig + MongoDB combination to pull the data
 from MongoDB through Pig, and then create a table in Hive that points to the
 Pig output file on HDFS.

 https://github.com/mongodb/mongo-hadoop/blob/master/pig/README.md

 Regards
 Jitendra
 On 9/11/13, Jérôme Verdier verdier.jerom...@gmail.com wrote:
  Hi,
 
  You can use Talend to import data from MongoDB into Hive.

  More information here: http://www.talend.com/products/big-data
 
 
  2013/9/11 Sandeep Nemuri nhsande...@gmail.com
 
  Hi everyone,
  I am trying to import data from MongoDB to Hive. I have the JAR files
  needed to connect Mongo and Hive. How do I import the data from MongoDB
  into Hive?
 
  Thanks in advance.
 
  --
  --Regards
Sandeep Nemuri
 
 
 
 
  --
  *Jérôme VERDIER*
  06.72.19.17.31
  verdier.jerom...@gmail.com
 




-- 
Russell Jurney twitter.com/rjurney russell.jur...@gmail.com datasyndrome.com


Re: Hive + mongoDB

2013-09-11 Thread Jitendra Yadav
Hi,

1. You may use the Hadoop-MongoDB connector and create a MapReduce program
to process your data from MongoDB into Hive.

https://github.com/mongodb/mongo-hadoop


2. Alternatively, you can use the Pig + MongoDB combination to pull the data
from MongoDB through Pig, and then create a table in Hive that points to the
Pig output file on HDFS (a sketch of the Hive side follows below the README link).

https://github.com/mongodb/mongo-hadoop/blob/master/pig/README.md
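
As an illustration of step 2, the Hive side might look roughly like this; the
HDFS path, column names, and delimiter are hypothetical and depend entirely on
how the Pig job wrote its output:

  -- external table over the (hypothetical) Pig output directory on HDFS
  CREATE EXTERNAL TABLE mongo_export (id STRING, action STRING, ts BIGINT)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  STORED AS TEXTFILE
  LOCATION '/user/hadoop/mongo_export';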

Regards
Jitendra
On 9/11/13, Jérôme Verdier verdier.jerom...@gmail.com wrote:
 Hi,

 You can use Talend to import data from MongoDB into Hive.

 More information here: http://www.talend.com/products/big-data


 2013/9/11 Sandeep Nemuri nhsande...@gmail.com

 Hi everyone,
 I am trying to import data from MongoDB to Hive. I have the JAR files
 needed to connect Mongo and Hive. How do I import the data from MongoDB
 into Hive?

 Thanks in advance.

 --
 --Regards
   Sandeep Nemuri




 --
 *Jérôme VERDIER*
 06.72.19.17.31
 verdier.jerom...@gmail.com



Re: Hive + mongoDB

2013-09-11 Thread Jérôme Verdier
Hi,

You can use Talend to import data from MongoDB into Hive.

More information here: http://www.talend.com/products/big-data


2013/9/11 Sandeep Nemuri nhsande...@gmail.com

 Hi everyone,
 I am trying to import data from MongoDB to Hive. I have the JAR files
 needed to connect Mongo and Hive. How do I import the data from MongoDB
 into Hive?

 Thanks in advance.

 --
 --Regards
   Sandeep Nemuri




-- 
*Jérôme VERDIER*
06.72.19.17.31
verdier.jerom...@gmail.com


Strange hive error

2013-09-11 Thread Siddharth Tiwari
Hi Team,
I am getting the following error when I try to load a CSV file into my Hive table:
FAILED: Parse Error: line 1:71 character 'EOF' not supported here
Can you please explain what this error means and how to resolve it?

Cheers !!!
Siddharth Tiwari
Have a refreshing day !!!
Every duty is holy, and devotion to duty is the highest form of worship of God.
Maybe other people will try to limit me but I don't limit myself

Re: Hive throwing strange error

2013-09-11 Thread Nitin Pawar
Can you provide your query?


On Thu, Sep 12, 2013 at 1:41 AM, Siddharth Tiwari siddharth.tiw...@live.com
 wrote:

 Hi Team

 I am getting the following error when I try to load a CSV file into my Hive
 table:

 FAILED: Parse Error: line 1:71 character 'EOF' not supported here

 Can you please explain what this error means and how to resolve it?


 Cheers !!!
 Siddharth Tiwari
 Have a refreshing day !!!
 Every duty is holy, and devotion to duty is the highest form of worship of God.
 Maybe other people will try to limit me but I don't limit myself




-- 
Nitin Pawar


Re: error executing the tutorial - https://cwiki.apache.org/confluence/display/Hive/HiveAws+HivingS3nRemotely

2013-09-11 Thread Nitin Pawar
Did you try putting those values inside core-site.xml for hdfs?


On Tue, Sep 10, 2013 at 8:11 PM, shouvanik.hal...@accenture.com wrote:

  Hi Nitin,

 Latest update: I am able to start the metastore and the table is getting
 created. But when I execute the command “hive select * from table_name”, Hive
 complains:


 Failed with exception
 java.io.IOException:java.lang.IllegalArgumentException: AWS Access Key ID
 and Secret Access Key must be specified as the username or password
 (respectively) of a s3n URL, or by setting the fs.s3n.awsAccessKeyId or
 fs.s3n.awsSecretAccessKey properties (respectively).

 Time taken: 1.158 seconds
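
 (For reference, the “credentials in the URL” form that the error message
 mentions would look roughly like this in a table definition; the bucket, path,
 and column are hypothetical and the keys are placeholders:)

   -- placeholder credentials and hypothetical bucket/path
   CREATE EXTERNAL TABLE s3_logs (line STRING)
   LOCATION 's3n://ACCESS_KEY:SECRET_KEY@my-bucket/logs/';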


 Thanks,

 Shouvanik


 From: Haldar, Shouvanik
 Sent: Tuesday, September 10, 2013 5:04 PM
 To: Haldar, Shouvanik; user@hive.apache.org
 Subject: RE: error executing the tutorial -
 https://cwiki.apache.org/confluence/display/Hive/HiveAws+HivingS3nRemotely

 Sorry again. The file system will be Hadoop.

 Sent from my Windows Phone
 --
 From: Haldar, Shouvanik shouvanik.hal...@accenture.com
 Sent: 10-09-2013 16:59
 To: user@hive.apache.org
 Subject: RE: error executing the tutorial -
 https://cwiki.apache.org/confluence/display/Hive/HiveAws+HivingS3nRemotely

 No. The file system will be s3n, the native file system, or Hadoop. But using
 Hive we are storing data that already exists inside S3.

  

 From: Nitin Pawar [mailto:nitinpawar...@gmail.com]
 Sent: Tuesday, September 10, 2013 4:56 PM
 To: user@hive.apache.org
 Subject: Re: error executing the tutorial -
 https://cwiki.apache.org/confluence/display/Hive/HiveAws+HivingS3nRemotely

  

 OK, so back to basics.

 Correct me if I am wrong:

 You are trying to set up YARN using S3 as the file system?

 Then you want to use Hive on top of it.

  

  

 On Tue, Sep 10, 2013 at 4:48 PM, shouvanik.hal...@accenture.com wrote:

 The issue is: if I paste it into the core-site.xml of Hadoop on the node in
 question, will starting the DataNode alone be enough? Actually, I am a bit
 confused about which daemons to start. I have a YARN architecture running one
 separate ResourceManager, one NameNode, and four DataNodes.

  

 From: Nitin Pawar [mailto:nitinpawar...@gmail.com]
 Sent: Tuesday, September 10, 2013 4:43 PM
 To: user@hive.apache.org
 Subject: Re: error executing the tutorial -
 https://cwiki.apache.org/confluence/display/Hive/HiveAws+HivingS3nRemotely

  

 There is nothing more to start. If Hive were not starting properly, it would
 not start at all, so you are doing OK.

  

 The problem is that when you create a table, Hive tries to create a directory
 with that name if it does not already exist. Just try putting those two
 settings inside core-site.xml and see whether HDFS can use them to create the
 directory.
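
 (As a quick experiment, the same two properties can also be set per-session
 from the Hive CLI before running the query; this does not always take effect
 for file-system access, so treat it as a test rather than a fix. The values
 below are placeholders:)

   SET fs.s3n.awsAccessKeyId=YOUR_ACCESS_KEY;
   SET fs.s3n.awsSecretAccessKey=YOUR_SECRET_KEY;
   SELECT * FROM table_name LIMIT 10;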

  

 On Tue, Sep 10, 2013 at 4:06 PM, shouvanik.hal...@accenture.com wrote:

 The values are showing when I run the “hive set;” command, but they are not
 being picked up. I guess I am not starting Hive properly. Do you know what
 the steps are?

  

 From: Nitin Pawar [mailto:nitinpawar...@gmail.com]
 Sent: Tuesday, September 10, 2013 4:04 PM
 To: user@hive.apache.org
 Subject: Re: error executing the tutorial -
 https://cwiki.apache.org/confluence/display/Hive/HiveAws+HivingS3nRemotely

  

 From the config file it looks OK. Not really sure why it is not picked up
 when you run it via the Hive CLI.

  

 Can you try putting these in the core-site.xml of HDFS?

  

  

 <property>
   <name>fs.s3n.awsAccessKeyId</name>
   <value>ID</value>
 </property>
 <property>
   <name>fs.s3n.awsSecretAccessKey</name>
   <value>SECRET</value>
 </property>



 On Tue, Sep 10, 2013 at 3:11 PM, shouvanik.hal...@accenture.com wrote:

 Hi Nitin,

  

 PFA hive-site.xml. I wanted to start Hortonworks Hive. Can you please help?

  

 Thanks,

 Shouvanik

  

 From: Nitin Pawar [mailto:nitinpawar...@gmail.com]
 Sent: Tuesday, September 10, 2013 11:15 AM
 To: user@hive.apache.org
 Subject: Re: error executing the tutorial -
 https://cwiki.apache.org/confluence/display/Hive/HiveAws+HivingS3nRemotely

  

 If you have set these values inside hive-site.xml and restarted the Hive
 service, it should work.

  

 Share your hive-site.xml (please remove the values for the s3n access key and
 secret).

  

 On Tue, Sep 10, 2013 at 3:17 AM, shouvanik.hal...@accenture.com wrote:

 Please help. I am badly stuck on this. Please help.

  

 From: Haldar, Shouvanik
 Sent: Tuesday, September 10, 2013 1:03 AM
 To: 'user@hive.apache.org'
 Subject: RE: error executing the tutorial -
 https://cwiki.apache.org/confluence/display/Hive/HiveAws+HivingS3nRemotely

  

 When I set values 

Re: Question for ORCFileFormat

2013-09-11 Thread Owen O'Malley
The easiest way to use it is to use HCatalog, which enables you to read or
write ORC files from MapReduce or Pig.

-- Owen


On Mon, Sep 9, 2013 at 11:14 AM, Saptarshi Guha saptarshi.g...@gmail.com wrote:

 Hello,

 Are there any examples of using ORC as a FileOutputFormat (and then as a
 FileInputFormat) in MapReduce jobs? I was looking at the source of
 ORCFileInput/OutputFormat but couldn't quite grok how to compose the
 ORC SerDes.

 Cheers
 Saptarshi




Hive throwing strange error

2013-09-11 Thread Siddharth Tiwari
Hi Team
I am getting the following error when I try to load a CSV file into my Hive
table:
FAILED: Parse Error: line 1:71 character 'EOF' not supported here
Can you please explain what this error means and how to resolve it?

Cheers !!!
Siddharth Tiwari
Have a refreshing day !!!
Every duty is holy, and devotion to duty is the highest form of worship of God.
Maybe other people will try to limit me but I don't limit myself

Re: Strange hive error

2013-09-11 Thread Mohammad Tariq
Could you please show us your query?

Warm Regards,
Tariq
cloudfront.blogspot.com


On Thu, Sep 12, 2013 at 1:49 AM, Siddharth Tiwari siddharth.tiw...@live.com
 wrote:

 Hi Team

 I am getting the following error when I try to load a CSV file into my Hive
 table:

 FAILED: Parse Error: line 1:71 character 'EOF' not supported here

 Can you please explain what this error means and how to resolve it?


 Cheers !!!
 Siddharth Tiwari
 Have a refreshing day !!!
 Every duty is holy, and devotion to duty is the highest form of worship of God.
 Maybe other people will try to limit me but I don't limit myself



Re: Question for ORCFileFormat

2013-09-11 Thread Saptarshi Guha
Hi,
Thanks, but assuming I can't use HCatalog, or integrating it is difficult,
is there an example of using ORC as an output format in a MapReduce job?

Regards
Saptarshi



On Wed, Sep 11, 2013 at 1:36 PM, Owen O'Malley omal...@apache.org wrote:

 The easiest way to use it is to use HCatalog, which enables you to read or
 write ORC files from MapReduce or Pig.

 -- Owen


  On Mon, Sep 9, 2013 at 11:14 AM, Saptarshi Guha saptarshi.g...@gmail.com wrote:

 Hello,

  Are there any examples of using ORC as a FileOutputFormat (and then as a
  FileInputFormat) in MapReduce jobs? I was looking at the source of
  ORCFileInput/OutputFormat but couldn't quite grok how to compose the
  ORC SerDes.

 Cheers
 Saptarshi





Dynamic hbase columns

2013-09-11 Thread Marcos Sousa
Hi,

I'm tracking user actions. My column family holds the list of object IDs
that received the action: the object ID is the column qualifier and the
timestamp is the column value.

How should I map this table in Hive, and how can I filter by these two fields?
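
(One common way to map a column family with dynamic qualifiers is a Hive MAP
column bound to the whole family via the HBase storage handler. A sketch with
hypothetical table, row-key, and column-family names, since none are given
above; the value type may need to change depending on how the timestamps were
serialized in HBase:)

  -- user_id holds the HBase row key; objects maps column qualifier (object id)
  -- to cell value (timestamp)
  CREATE EXTERNAL TABLE user_actions (
    user_id STRING,
    objects MAP<STRING, BIGINT>
  )
  STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
  WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,actions:")
  TBLPROPERTIES ("hbase.table.name" = "user_actions");

  -- filter on both fields by exploding the map
  SELECT user_id, object_id, ts
  FROM user_actions
  LATERAL VIEW explode(objects) o AS object_id, ts
  WHERE object_id = 'some_object_id' AND ts > 1378857600000;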

Thanks

Marcos Sousa