Re: Alter table is giving error

2012-11-06 Thread Mark Grover
Chunky, I just tried it myself. It turns out that the directory you are adding as a partition has to be empty for msck repair to work. This is obviously sub-optimal and there is a JIRA in place (https://issues.apache.org/jira/browse/HIVE-3231) to fix it. So, I'd suggest you keep an eye out for the
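For readers following the thread, the recovery flow being discussed looks roughly like this; the table name, column, and path below are invented for illustration:

```sql
-- Hypothetical external table partitioned by dt; names are illustrative only.
CREATE EXTERNAL TABLE logs (line STRING)
PARTITIONED BY (dt STRING)
LOCATION '/data/logs';

-- After new directories such as /data/logs/dt=2012-11-06/ appear on HDFS,
-- ask the metastore to discover and register them:
MSCK REPAIR TABLE logs;

-- Verify that the recovered partitions are now known to the metastore:
SHOW PARTITIONS logs;
```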

Re: Alter table is giving error

2012-11-06 Thread Chunky Gupta
Hi Mark, Sorry, I forgot to mention. I have also tried msck repair table ; and got the same output that I got from msck alone. Do I need to do any other settings for this to work, because I have prepared the Hadoop and Hive setup from scratch on EC2. Thanks, Chunky. On Wed, Nov 7, 2012

Re: Alter table is giving error

2012-11-06 Thread Mark Grover
Chunky, You should have run: msck repair table ; Sorry, I should have made it clear in my last reply. I have added an entry to the Hive wiki for the benefit of others: https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL#LanguageManualDDL-Recoverpartitions Mark On Tue, Nov 6, 2012 at 9:5

Re: Hbase connection closed when query multiple complicated hql with hive+hbase integration

2012-11-06 Thread Cheng Su
The exceptions seem to be another problem. They all happened on one node, and after the task attempts failed at that node, they were retried on other nodes with no exceptions. So the exception may have nothing to do with the performance issue. On Wed, Nov 7, 2012 at 11:07 AM, Cheng Su wrote: > Hi, a

Re: TRANSFORM + LATERAL VIEW?

2012-11-06 Thread Mark Grover
Jamie, Not that I know of. Assuming you will be using LATERAL VIEW for exploding the data, I can think of 2 options off the top of my head: 1. Pass the 'id' column to your transform script. You will have to take care of exploding the data in your transform script. It would no longer be a simple 'cat'.
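One way option 1 can be combined with a LATERAL VIEW is sketched below; the table, script name, and columns are all invented, and the script is assumed to emit the id alongside a comma-delimited list that is then split and exploded:

```sql
-- Illustrative only: my_script.py is a hypothetical transform script that
-- reads (id, payload) and emits "id<TAB>a,b,c" per input row.
SELECT t.id, x.item
FROM (
  SELECT TRANSFORM (id, payload)
         USING 'my_script.py'
         AS (id STRING, items STRING)
  FROM src
) t
LATERAL VIEW explode(split(t.items, ',')) x AS item;
```

Keeping the transform output as a delimited string and splitting it in HiveQL avoids having the script deal with complex output types.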

Re: Alter table is giving error

2012-11-06 Thread Chunky Gupta
Hi Mark, I didn't get any error. I ran this on the hive console: "msck table Table_Name;" It said Ok and showed the execution time as 1.050 sec. But when I checked the partitions for the table using "show partitions Table_Name;" it didn't show me any partitions. Thanks, Chunky. On Tue, No
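For context on the confusion in this exchange: without the REPAIR keyword, MSCK only reports mismatches between the metastore and the filesystem; with it, the missing partitions are actually added. A minimal illustration with an invented table name:

```sql
MSCK TABLE my_table;          -- diagnose only: reports partitions missing from the metastore
MSCK REPAIR TABLE my_table;   -- diagnose AND add the missing partitions
SHOW PARTITIONS my_table;     -- confirm the partitions are now registered
```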

RE: Which is the postgres version is compatible for hive-trunk..?

2012-11-06 Thread rohithsharma
Thanks for your reply. I will use the 9.0.x version. -Original Message- From: Alexander Lorenz [mailto:wget.n...@gmail.com] Sent: Tuesday, November 06, 2012 3:34 PM To: rohithsharm...@huawei.com Cc: user@hive.apache.org Subject: Re: Which is the postgres version is compatible for hive-trunk..

Re: hive integrate with hbase, map to existed hbase table report column family not exist

2012-11-06 Thread Mark Grover
Indeed. https://issues.apache.org/jira/browse/HIVE-3243 Sorry you found out about it the hard way! On Tue, Nov 6, 2012 at 5:46 PM, Chris Gong wrote: > ** > i got the reason, the column mapping section can't have any white space, > including \r\n > > -- > Chris Gong

Re: ClassNotFoundException when use hive java client of hive + hbase integration

2012-11-06 Thread Cheng Su
This doesn't work. In CLI mode you can export the environment variable to avoid adding the jar every time. I did this, but I still encounter the error when I access from the java client. And I can't even specify the --auxpath param when starting a hive thrift service. So at least in my situation, I have to a
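A sketch of the workarounds being discussed, with placeholder paths: for the CLI, the auxiliary jars can go in the --auxpath flag or an environment variable; since the Thrift service takes no --auxpath, one common fallback is to put the jars on the classpath before starting it.

```shell
# Illustrative paths only.
# CLI: either the flag or the environment variable works.
hive --auxpath /path/to/hive-hbase-handler.jar,/path/to/hbase.jar
export HIVE_AUX_JARS_PATH=/path/to/hive-hbase-handler.jar,/path/to/hbase.jar

# Thrift service: no --auxpath flag, so extend the classpath instead
# before launching the (2012-era) hiveserver service.
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/path/to/hive-hbase-handler.jar:/path/to/hbase.jar
hive --service hiveserver
```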

Re: hive integrate with hbase, map to existed hbase table report column family not exist

2012-11-06 Thread Chris Gong
I got the reason: the column mapping section can't have any white space, including \r\n Chris Gong From: Chris Gong Sent: 2012-11-06 10:56 To: user-hive Subject: hive integrate with hbase, map to existed hbase table report column family not exist hi all: now, I'm mapping to an existed hbase tabl
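To illustrate the pitfall found here (table and column names invented): the hbase.columns.mapping value is parsed literally, so any whitespace or line break inside it corrupts the column-family lookup.

```sql
-- Works: the mapping string contains no whitespace.
CREATE EXTERNAL TABLE hbase_t (key STRING, val STRING)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val")
TBLPROPERTIES ("hbase.table.name" = "existing_table");

-- Breaks: a space (or \r\n) inside the mapping makes Hive look for a
-- column family literally named " cf1", which does not exist:
--   "hbase.columns.mapping" = ":key, cf1:val"
```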

Re: Hive compression with external table

2012-11-06 Thread Bejoy KS
Hi Krishna Sequence Files + Snappy compression would be my recommendation as well. They can be processed by managed as well as external tables; there is no difference in storage formats between managed and external tables. Also, this can be consumed by mapred or pig directly. Regards Bejoy KS Sent
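A sketch of that setup, with invented table names; the codec is selected per session via these settings before writing:

```sql
-- Enable compressed output and pick Snappy, block-compressed
-- (2012-era mapred property names).
SET hive.exec.compress.output=true;
SET mapred.output.compression.type=BLOCK;
SET mapred.output.compression.codec=org.apache.hadoop.io.compress.SnappyCodec;

CREATE TABLE events (line STRING)
STORED AS SEQUENCEFILE;

-- Rows written by this insert land as Snappy-compressed sequence files.
INSERT OVERWRITE TABLE events
SELECT line FROM raw_events;
```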

Re: Alter table is giving error

2012-11-06 Thread Mark Grover
Glad to hear, Chunky. Out of curiosity, what errors did you get when using msck? On Tue, Nov 6, 2012 at 5:14 AM, Chunky Gupta wrote: > Hi Mark, > I tried msck, but it is not working for me. I have written a python script > to partition the data individually. > > Thank you Edward, Mark and Dean.

Re: Alter table is giving error

2012-11-06 Thread Chunky Gupta
Hi Mark, I tried msck, but it is not working for me. I have written a python script to partition the data individually. Thank you Edward, Mark and Dean. Chunky. On Mon, Nov 5, 2012 at 11:08 PM, Mark Grover wrote: > Chunky, > I have used "recover partitions" command on EMR, and that worked fine.

Re: Which is the postgres version is compatible for hive-trunk..?

2012-11-06 Thread t...@postgresql.hk
Hi, Some major differences between these versions are: PostgreSQL 9.0 - Streaming Replication and Hot Standby (latest is 9.0.10) PostgreSQL 9.1 - Synchronous Replication (latest is 9.1.6) PostgreSQL 9.2 - Cascading Replication (latest is 9.2.1) PostgreSQL 9.3 - (not yet released: Multiple

Re: Which is the postgres version is compatible for hive-trunk..?

2012-11-06 Thread Alexander Lorenz
Hi, I don't know exactly what the major changes were between 9.0.x and 9.1.x, but if you've had a good experience with 9.0.x, use it :) For 9.1.x, it would be great to get a log or stack trace to see what's going wrong there. best, Alex On Nov 6, 2012, at 9:53 AM, rohithsharma wrote: > Hi Loren

Re: Hive compression with external table

2012-11-06 Thread Krishna Rao
Thanks for the reply. Sequence files with compression might work. However, it's not clear to me whether it's possible to read sequence files using an external table. On 5 November 2012 16:04, Edward Capriolo wrote: > Compression is a confusing issue. Sequence files that are in block > form
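For what it's worth, an external table can declare STORED AS SEQUENCEFILE exactly like a managed one; a sketch with an invented path and column:

```sql
-- External table over an existing directory of sequence files.
CREATE EXTERNAL TABLE ext_events (line STRING)
STORED AS SEQUENCEFILE
LOCATION '/data/ext_events';

-- Compression inside the sequence files is handled transparently on read.
SELECT COUNT(*) FROM ext_events;
```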

RE: Which is the postgres version is compatible for hive-trunk..?

2012-11-06 Thread rohithsharma
Hi Lorenz, Thanks for your fast reply :-) >> Postgres 9x I personally wouldn't choose. Can you please tell me what the problem is with using a 9.x version with hive? I tried with 9.0.7; it was successfully integrated and the functionality was also fine. But when I changed to 9.1, a basic "show table" in hive failed
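For reference, a metastore pointed at Postgres is typically configured in hive-site.xml along these lines; host, database name, and credentials below are placeholders:

```xml
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:postgresql://dbhost:5432/metastore</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>org.postgresql.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hiveuser</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>secret</value>
</property>
```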