Hive Installation Issue

2013-07-05 Thread Ranjitha Chandrashekar
Hi

I'm trying to install HIVE on Hadoop version 1.0.2.

I am using hive-0.11.0.tar.gz. Hive installs successfully, but when I try to run a HIVE command, say SHOW TABLES;, I get the following error:

hive> show tables;
Exception in thread "main" java.lang.NoSuchFieldError: type
at org.apache.hadoop.hive.ql.parse.HiveLexer.mKW_SHOW(HiveLexer.java:1305)
at org.apache.hadoop.hive.ql.parse.HiveLexer.mTokens(HiveLexer.java:6439)
at org.antlr.runtime.Lexer.nextToken(Lexer.java:84)
at org.antlr.runtime.CommonTokenStream.fillBuffer(CommonTokenStream.java:95)
at org.antlr.runtime.CommonTokenStream.LT(CommonTokenStream.java:238)
at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:573)
at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:439)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:416)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:335)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:893)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:412)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:755)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Also, there is no other antlr jar on the classpath except the one provided by Hive. I am not sure where exactly the issue is.

Please suggest.

Thanks
Ranjitha.



RE: Hive Installation Issue

2013-07-05 Thread Ranjitha Chandrashekar
Hi Rekha

Thanks for the quick response.

Figured out the issue: it was a conflicting antlr jar.

The antlr runtime was being resolved from HADOOP_HOME/lib/mahout-examples-0.7-job.jar instead of from HIVE_HOME/lib/antlr*.jar.

With the conflicting jar removed from the classpath, it works fine now.
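For anyone who runs into the same NoSuchFieldError, one quick way to spot this kind of conflict (a rough sketch; it assumes the system: variable namespace is available in this Hive build) is to print the JVM classpath from the Hive prompt and look for any jar besides HIVE_HOME/lib/antlr*.jar that bundles ANTLR classes:

hive> set system:java.class.path;

The mahout-examples job jar mentioned above is exactly the kind of entry to look for in that list.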

Thanks
Ranjitha.
From: Joshi, Rekha [mailto:rekha_jo...@intuit.com]
Sent: 05 July 2013 16:18
To: user@hive.apache.org
Subject: Re: Hive Installation Issue

It's not just antlr; there must be no conflicting jars on HADOOP_CLASSPATH versus those required by your Hive version.

Thanks
Rekha




Complex DataTypes in Hive

2013-04-10 Thread Ranjitha Chandrashekar
Hi

How does Hive interpret complex datatypes (my own class) in a sequence file? Say I have my own complex datatype (my own class) written as the value in the Sequence File.
How do I make Hive read this and map it to the corresponding columns?

For example, I have this sequence file written as Hadoop Writables on HDFS, with these records:

100  One, two, buckle my shoe
99  Three, four, shut the door
98  Five, six, pick up sticks

Now in Hive, I create an external table:

create external table seqExampleMap (number map<STRING,STRING>) row format delimited fields terminated by ' ' STORED AS SEQUENCEFILE location '/user/hiveFiles/';

How do I say that "One" is the key and "two, buckle my shoe" is the value in the map column "number" created above?
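What I am picturing, if the value is really just delimited text, is something like the following; the comma and colon separators here are only assumptions about the file layout and would have to match whatever is actually inside the value (with a value like "two, buckle my shoe" the commas would clash, so a different entry separator would be needed):

create external table seqExampleMap (number map<STRING,STRING>)
row format delimited
  collection items terminated by ','
  map keys terminated by ':'
STORED AS SEQUENCEFILE
location '/user/hiveFiles/';

If the value is actually my own Writable class rather than delimited text, I assume the delimited row format cannot parse it and a custom SerDe would be needed instead.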

Please suggest

Thanks
Ranjitha.









Problem creating external table to a sequence file on HDFS

2013-04-08 Thread Ranjitha Chandrashekar
Hi

I am trying to create an external table over a sequence file on HDFS.

I have my own input format and a SerDe, compiled into a jar and added to Hive. In spite of doing this, I get the following error:

Please Suggest.

hive> create table seq (key STRING, value STRING) ROW FORMAT SERDE 'com.org.SequenceFileKeyRecordReader' STORED AS INPUTFORMAT 'com.org.SequenceFileKeyInputFormat' OUTPUTFORMAT 'org.apache.hadoop.mapred.SequenceFileOutputFormat' location '/user/hiveFiles/';
FAILED: Error in metadata: Cannot validate serde: com.org.SequenceFileKeyRecordReader
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
hive>
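Looking at the statement again, I suspect the problem is that ROW FORMAT SERDE is being given 'com.org.SequenceFileKeyRecordReader', which (going by its name) is a record reader rather than a SerDe, and that seems to be what "Cannot validate serde" complains about. The shape I believe Hive expects is roughly the following; the jar path and the SerDe class name are placeholders for my own code:

ADD JAR /path/to/my-serde.jar;

create external table seq (key STRING, value STRING)
ROW FORMAT SERDE 'com.org.SequenceFileKeySerDe'
STORED AS
  INPUTFORMAT 'com.org.SequenceFileKeyInputFormat'
  OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat'
location '/user/hiveFiles/';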

Thanks
Ranjitha.






Error Creating External Table

2013-04-05 Thread Ranjitha Chandrashekar
Hi

When I try creating an external table over a text file on HDFS, I get the following error. Could someone please let me know where I am going wrong?

hive> create external table seq4 (item STRING) row format delimited fields terminated by '' STORED as TEXTFILE location 'hdfs://host:54310/user/myfolder/items';
FAILED: Error in metadata: MetaException(message:Got exception: org.apache.hadoop.ipc.RemoteException java.io.FileNotFoundException: Parent path is not a directory: /user/myfolder/items
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.mkdirs(FSDirectory.java:944)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:2068)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:2029)
at org.apache.hadoop.hdfs.server.namenode.NameNode.mkdirs(NameNode.java:817)
at sun.reflect.GeneratedMethodAccessor109.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:616)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:416)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
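For what it is worth, the "Parent path is not a directory" message from the NameNode usually means that some component of /user/myfolder/items already exists as a plain file, so it cannot be used (or created) as a directory. That can be checked from within the Hive CLI itself, since dfs runs the ordinary HDFS shell commands:

hive> dfs -ls /user/myfolder;
hive> dfs -ls /user/myfolder/items;

The LOCATION of an external table should point at a directory holding the data files, not at a single file.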

Thanks
Ranjitha.






RE: External Table to Sequence File on HDFS

2013-04-05 Thread Ranjitha Chandrashekar
Hi Sanjay

Thank you for the quick response.

I got the input format part from the link that you sent. But in order to read that table in Hive, I need to specify the SerDe; where exactly do I specify this class file?

Is it something like,

create table seq10 (key STRING, value STRING) ROW FORMAT SERDE 
'com.org.SequenceFileKeyRecordReader' STORED AS INPUTFORMAT 
'com.org.SequenceFileKeyInputFormat' OUTPUTFORMAT 
'org.apache.hadoop.hive.ql.io.IgnoreKeyTextOutputFormat' LOCATION hdfs path;

Is this the right approach, where I specify the input file format and also write a custom SerDe?

Please correct me if I am wrong
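Concretely, my plan is to register the jar in the session first and then run the create statement above, roughly like this (the jar path is a placeholder for wherever my classes are packaged):

hive> ADD JAR /path/to/my-serde.jar;
hive> LIST JARS;

where LIST JARS is just to confirm that the jar was picked up in the current session.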

Thanks
Ranjitha.

From: Sanjay Subramanian [mailto:sanjay.subraman...@wizecommerce.com]
Sent: 04 April 2013 11:41
To: user@hive.apache.org
Subject: Re: External Table to Sequence File on HDFS

Check this out
http://stackoverflow.com/questions/13203770/reading-hadoop-sequencefiles-with-hive



RE: Error Creating External Table

2013-04-05 Thread Ranjitha Chandrashekar
Hi Piyush

That was a problem with the path. There were other incompatible files in that 
directory.

Thanks anyway.. :)

From: Piyush Srivastava [mailto:piyush.srivast...@wizecommerce.com]
Sent: 05 April 2013 15:23
To: user@hive.apache.org
Subject: RE: Error Creating External Table

When you give the location, just give it as '/user/myfolder/items'; Hive knows it needs to be stored on HDFS, which is defined in $HADOOP_CONF_DIR/hdfs-site.xml.
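Something like this should be enough (the tab delimiter here is only an example; use whatever the file actually contains):

create external table seq4 (item STRING)
row format delimited fields terminated by '\t'
STORED AS TEXTFILE
location '/user/myfolder/items';

Also make sure the location is a directory that contains only data files meant for this table.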

Thanks,
./Piyush



External Table to Sequence File on HDFS

2013-04-03 Thread Ranjitha Chandrashekar
Hi



I want to create an external Hive table over a sequence file (each record is a key-value pair) on HDFS. How will the field names be mapped to the column names?



Please Suggest.
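For context, Hive's default SequenceFile tables read only the value of each record (the key is ignored) and parse that value with the declared row format, so the column names come purely from the CREATE TABLE statement and are mapped positionally onto the parsed fields. A minimal sketch, assuming the value is tab-delimited text with two fields (the table name, column names and delimiter are only illustrative):

create external table seq_kv (col1 STRING, col2 STRING)
row format delimited fields terminated by '\t'
STORED AS SEQUENCEFILE
location '/user/hiveFiles/';

Reading the key itself, or a value that is a custom Writable class, needs a custom InputFormat and/or SerDe, which is what the other threads above are about.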



Thanks

Ranjitha.







External hive table to a Sequence file

2013-04-01 Thread Ranjitha Chandrashekar
Hi



I want to create an external Hive table over a sequence file (each record is a key-value pair) on HDFS. How will the field names be mapped to the column names?



Please Suggest.



Thanks

Ranjitha.


