Hive over HBase Integration Issue
Hi all,

In the Hive-over-HBase case, how do I configure Hive, acting as an HBase client, to reference hbase-site.xml (for example the one in /etc/hbase/conf)? This is necessary because there are some specific and important client-side configurations that apply to all HBase clients, including Hive, and such configuration shouldn't have to be replicated into hive-site.xml, right?

Thanks for your info.

Regards,
Kai
Re: Hive over HBase Integration Issue
You could replicate the configs, or place the client-side hbase-site.xml in the same directory as hive-site.xml for it to get picked up via the classpath by the HBase-centric classes.

On Sun, Feb 17, 2013 at 2:41 PM, Zheng, Kai kai.zh...@intel.com wrote:
> Hi all,
> In the Hive-over-HBase case, how do I configure Hive, acting as an HBase client, to reference hbase-site.xml (for example the one in /etc/hbase/conf)? This is necessary because there are some specific and important client-side configurations that apply to all HBase clients, including Hive, and such configuration shouldn't have to be replicated into hive-site.xml, right?
> Thanks for your info.
> Regards, Kai

--
Harsh J
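[Editor's sketch] The suggestion above can be expressed as a couple of shell commands; the paths /etc/hbase/conf and /etc/hive/conf are assumptions for a typical packaged install and may differ in your layout:

```shell
# Option 1: make the HBase client config live alongside hive-site.xml,
# so anything that scans Hive's conf directory also sees it.
ln -s /etc/hbase/conf/hbase-site.xml /etc/hive/conf/hbase-site.xml

# Option 2: prepend the HBase conf directory to the classpath used when
# launching the Hive CLI, so HBaseConfiguration can locate hbase-site.xml.
export HADOOP_CLASSPATH=/etc/hbase/conf:$HADOOP_CLASSPATH
hive
```

Either way the goal is the same: the directory containing hbase-site.xml must be on the classpath of the JVM that runs the HBase client code.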
Re: Hive Queries
Dude, sorry for the off-topic, but having a rocketmail account is awesome. I wish I still had mine.

On Sat, Feb 16, 2013 at 9:16 PM, manishbh...@rocketmail.com wrote:
> When you want to move data from an external system to Hive, this means moving the data to HDFS first and then pointing the Hive table at the file in HDFS where you have exported the data. So you have a couple of commands, like -copyFromLocal and -put, which move the file to HDFS. If you intend to move data in real-time fashion, try Flume. But at the end of the day the data movement happens in HDFS first, and then the Hive table can be loaded using the LOAD DATA command.
>
> Regards,
> Manish Bhoge
> Sent by HTC device. Excuse typo.
>
> ----- Reply message -----
> From: Cyrille Djoko c...@agnik.com
> To: user@hive.apache.org
> Subject: Hive Queries
> Date: Sat, Feb 16, 2013 1:50 AM
>
> Hi Jarcec,
> I did try Sqoop. I am running Sqoop 1.4.2 (hadoop1.0.0 build) along with Hadoop 1.0.4, but I keep running into the following exception:
>
> Exception in thread "main" java.lang.IncompatibleClassChangeError: Found class org.apache.hadoop.mapreduce.JobContext, but interface was expected
>
> So I wrote a small program, but all I can do is send queries to the server.
>
>> Hi Cyrille,
>> I'm not exactly sure what exactly you mean, so I'm more or less blindly shooting, but maybe Apache Sqoop [1] might help you?
>> Jarcec
>>
>> Links: 1: http://sqoop.apache.org/
>>
>> On Fri, Feb 15, 2013 at 01:44:45PM -0500, Cyrille Djoko wrote:
>>> I am looking for a relatively efficient way of transferring data between a remote server and Hive without going through the hassle of storing the data locally first before loading it into Hive. From what I have read so far there is no such command, but it would not hurt to ask. Is it possible to insert data through an INSERT query in Hive (the equivalent of INSERT INTO table_name VALUES (...) in SQL)?
>>> Thank you in advance for an answer.
>>>
>>> Cyrille Djoko
>>> Agnik LLC
>>> Data Mining Developer Intern
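[Editor's sketch] Manish's two-step flow (stage the file in HDFS, then load it into the table) might look like the following; the file name, HDFS paths, and table name are made up for illustration:

```shell
# 1. Stage a local file in HDFS (hypothetical paths).
hadoop fs -copyFromLocal /tmp/events.csv /user/hive/staging/events.csv

# 2. Load it into an existing Hive table. LOAD DATA INPATH moves the
#    HDFS file into the table's warehouse directory rather than copying it.
hive -e "LOAD DATA INPATH '/user/hive/staging/events.csv' INTO TABLE events;"
```

For continuous, near-real-time ingestion the manual copy step would be replaced by a Flume agent writing directly to an HDFS sink, but the end state is the same: data lands in HDFS and Hive reads it from there.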
RE: Hive over HBase Integration Issue
Hi Harsh J,

Thanks for your reply. Placing the client-side hbase-site.xml in the same directory as hive-site.xml looks good to me. I already tried this approach before, but it didn't work. I will investigate this further, wondering if there are some gaps here, or if it was possibly just caused by my environment (like the Hive or HBase version).

Regards,
Kai

-----Original Message-----
From: Harsh J [mailto:ha...@cloudera.com]
Sent: Sunday, February 17, 2013 10:10 PM
To: hive request
Cc: u...@hbase.apache.org; Andrew Purtell apurt...@apache.org
Subject: Re: Hive over HBase Integration Issue

> You could replicate the configs, or place the client-side hbase-site.xml in the same directory as hive-site.xml for it to get picked up via the classpath by the HBase-centric classes.
>
> On Sun, Feb 17, 2013 at 2:41 PM, Zheng, Kai kai.zh...@intel.com wrote:
>> Hi all,
>> In the Hive-over-HBase case, how do I configure Hive, acting as an HBase client, to reference hbase-site.xml (for example the one in /etc/hbase/conf)? This is necessary because there are some specific and important client-side configurations that apply to all HBase clients, including Hive, and such configuration shouldn't have to be replicated into hive-site.xml, right?
>> Thanks for your info.
>> Regards, Kai
>
> --
> Harsh J
Re: 0.8.0 - 0.9.0 mysql schema upgrade
Hi Sam William:

Check this issue: https://issues.apache.org/jira/browse/HIVE-3649 and the upgrade script: http://svn.apache.org/repos/asf/hive/tags/release-0.10.0/metastore/scripts/upgrade/mysql/upgrade-0.9.0-to-0.10.0.mysql.sql

2013/1/5 Sam William sa...@stumbleupon.com:
> Looks like this column is not even there in the 0.8/0.9 schema files. I have no idea how I have it in my schema. I just set a default 'false' value and I'm fine now.
> Sam
>
> On Jan 4, 2013, at 2:22 PM, Sam William sa...@stumbleupon.com wrote:
>> When I upgraded to 0.9.0, I'm getting an exception when I try to create tables:
>>
>> FAILED: Error in metadata: javax.jdo.JDODataStoreException: Insert of object org.apache.hadoop.hive.metastore.model.MStorageDescriptor@4774e78a using statement INSERT INTO `SDS` (`SD_ID`,`NUM_BUCKETS`,`LOCATION`,`INPUT_FORMAT`,`CD_ID`,`OUTPUT_FORMAT`,`SERDE_ID`,`IS_COMPRESSED`) VALUES (?,?,?,?,?,?,?,?) failed : Field 'IS_STOREDASSUBDIRECTORIES' doesn't have a default value
>> NestedThrowables: java.sql.SQLException: Field 'IS_STOREDASSUBDIRECTORIES' doesn't have a default value
>>
>> The upgrade script from 0.8 to 0.9 doesn't have anything? What am I missing?
>>
>> Sam William sa...@stumbleupon.com

--
Best wishes!
Fangkun.Cao
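[Editor's sketch] Sam's workaround — giving the column a default — can be done with a single statement against the metastore database. The database name `metastore` is an assumption; the column definition mirrors the one introduced by HIVE-3649 in the 0.9.0-to-0.10.0 upgrade script linked above:

```shell
# Add the column the 0.8/0.9 schemas lack, with a 'false' default so
# inserts that omit it (as 0.9 does) succeed. 'metastore' is an assumed
# database name; point this at your actual metastore DB.
mysql metastore -e \
  "ALTER TABLE SDS ADD COLUMN IS_STOREDASSUBDIRECTORIES BIT(1) NOT NULL DEFAULT 0;"
```

Running the official upgrade script for your target release is the safer route when upgrading properly; this one-liner only patches the single column the error message complains about.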