Re: Hive Query Error

2015-07-09 Thread Nitin Pawar
Can you check your config?
The host appears twice: 01hw357381.tcsgegdc.com: 01hw357381.tcsgegdc.com
It should be hostname:port.

Also, once you correct this, run nslookup on the host to make sure it is
resolvable from the Hive client.
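
For example, something along these lines (the config paths and property names
below are only illustrative and depend on your setup):

  # Look for any address value that ends up as "host: host" instead of
  # host:port in the Hadoop/Hive client configuration:
  grep -A1 -E 'fs.defaultFS|fs.default.name|mapred.job.tracker' \
      /etc/hadoop/conf/*-site.xml
  # Expected form: hdfs://01hw357381.tcsgegdc.com:8020 (hostname:port),
  # not 01hw357381.tcsgegdc.com:01hw357381.tcsgegdc.com

  # Then confirm the host resolves on the machine running the Hive client:
  nslookup 01hw357381.tcsgegdc.com
  getent hosts 01hw357381.tcsgegdc.com    # also consults /etc/hosts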

On Thu, Jul 9, 2015 at 7:19 PM, Ajeet O ajee...@tcs.com wrote:

 Hi All, I have installed Hadoop 2.0 and Hive 0.12 on CentOS 7.

 When I run the query select count(*) from u_data; in Hive, it gives the
 following errors. However, I can run select * from u_data; without any
 problem. Please help.

 hive> select count(*) from u_data;
 Total MapReduce jobs = 1
 Launching Job 1 out of 1
 Number of reduce tasks determined at compile time: 1
 In order to change the average load for a reducer (in bytes):
   set hive.exec.reducers.bytes.per.reducer=number
 In order to limit the maximum number of reducers:
   set hive.exec.reducers.max=number
 In order to set a constant number of reducers:
   set mapred.reduce.tasks=number
 java.net.UnknownHostException: 01hw357381.tcsgegdc.com:
 01hw357381.tcsgegdc.com: unknown error
 at java.net.InetAddress.getLocalHost(InetAddress.java:1484)
 at
 org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:439)
 at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1296)
 at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1293)
 at java.security.AccessController.doPrivileged(Native Method)
 at javax.security.auth.Subject.doAs(Subject.java:422)
 at
 org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
 at org.apache.hadoop.mapreduce.Job.submit(Job.java:1293)
 at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
 at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
 at java.security.AccessController.doPrivileged(Native Method)
 at javax.security.auth.Subject.doAs(Subject.java:422)
 at
 org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
 at
 org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
 at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
 at
 org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:425)
 at
 org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:144)
 at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:151)
 at
 org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:65)
 at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1414)
 at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1192)
 at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1020)
 at org.apache.hadoop.hive.ql.Driver.run(Driver.java:888)
 at
 org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
 at
 org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
 at
 org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
 at
 org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:781)
 at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
 at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at
 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
 at
 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:497)
 at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
 at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
 Caused by: java.net.UnknownHostException: 01hw357381.tcsgegdc.com:
 unknown error
 at java.net.Inet4AddressImpl.lookupAllHostAddr(Native Method)
 at java.net.InetAddress$2.lookupAllHostAddr(InetAddress.java:907)
 at
 java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1302)
 at java.net.InetAddress.getLocalHost(InetAddress.java:1479)
 ... 34 more
 Job Submission failed with exception 'java.net.UnknownHostException(
 01hw357381.tcsgegdc.com: 01hw357381.tcsgegdc.com: unknown error)'
 FAILED: Execution Error, return code 1 from
 org.apache.hadoop.hive.ql.exec.mr.MapRedTask

 Thanks
 Ajeet





-- 
Nitin Pawar


Re: Hive Query Error

2015-07-09 Thread Ajeet O
Hi Nitin, how do I check this? Do you mean checking hive-site.xml? Please
let me know how to check this.
 




From:   Nitin Pawar nitinpawar...@gmail.com
To: user@hive.apache.org user@hive.apache.org
Date:   07/09/2015 07:35 PM
Subject:Re: Hive Query Error



Can you check your config?
The host appears twice: 01hw357381.tcsgegdc.com: 01hw357381.tcsgegdc.com
It should be hostname:port.

Also, once you correct this, run nslookup on the host to make sure it is
resolvable from the Hive client.


Hive Query Error

2015-07-09 Thread Ajeet O
Hi All, I have installed Hadoop 2.0 and Hive 0.12 on CentOS 7.

When I run the query select count(*) from u_data; in Hive, it gives the
following errors. However, I can run select * from u_data; without any
problem. Please help.

hive> select count(*) from u_data;
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=number
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=number
In order to set a constant number of reducers:
  set mapred.reduce.tasks=number
java.net.UnknownHostException: 01hw357381.tcsgegdc.com: 
01hw357381.tcsgegdc.com: unknown error
at java.net.InetAddress.getLocalHost(InetAddress.java:1484)
at 
org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:439)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1296)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1293)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1293)
at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at 
org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
at 
org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
at 
org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:425)
at 
org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:144)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:151)
at 
org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:65)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1414)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1192)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1020)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:888)
at 
org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
at 
org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
at 
org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:781)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.net.UnknownHostException: 01hw357381.tcsgegdc.com: unknown 
error
at java.net.Inet4AddressImpl.lookupAllHostAddr(Native Method)
at java.net.InetAddress$2.lookupAllHostAddr(InetAddress.java:907)
at 
java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1302)
at java.net.InetAddress.getLocalHost(InetAddress.java:1479)
... 34 more
Job Submission failed with exception 
'java.net.UnknownHostException(01hw357381.tcsgegdc.com: 
01hw357381.tcsgegdc.com: unknown error)'
FAILED: Execution Error, return code 1 from 
org.apache.hadoop.hive.ql.exec.mr.MapRedTask

Thanks
Ajeet





Getting KryoException

2015-07-09 Thread Anupam sinha
Hi Guys,

I am getting an exception thrown during table creation in Hive:


Error: java.lang.RuntimeException:
org.apache.hive.com.esotericsoftware.kryo.KryoException:
Encountered unregistered class ID: 380

Used: CDH 5.2.1
Hive: 0.13.1-cdh5.2.1

Could you please help or suggest how to overcome this KryoException?

Thanks


RE: Add file/jar location for jdbc clients

2015-07-09 Thread Lonikar, Kiran
Never mind... It works for HDFS. I can copy to HDFS and specify the HDFS file 
path as hdfs://namenode/path/to/file.
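
For reference, a rough sketch of that flow (host names and paths below are
placeholders, not taken from the original setup):

  # Copy the resource to HDFS first:
  hdfs dfs -mkdir -p /user/kiran/jars
  hdfs dfs -put /local/path/myudfs.jar /user/kiran/jars/
  # Then reference the HDFS path from the remote JDBC session, e.g. via beeline:
  beeline -u jdbc:hive2://hiveserver2-host:10000 \
    -e "ADD JAR hdfs://namenode/user/kiran/jars/myudfs.jar;"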

-Kiran

-Original Message-
From: Lonikar, Kiran [mailto:kloni...@informatica.com] 
Sent: Wednesday, July 08, 2015 9:01 PM
To: user@hive.apache.org
Subject: Add file/jar location for jdbc clients

Hi,

The ADD JAR/FILE syntax stated in the Hive language manual is as below:

ADD { FILE[S] | JAR[S] | ARCHIVE[S] } filepath1 [filepath2]*

I have an application which connects remotely to HiveServer2 over JDBC. It
needs to add local resources before executing any Hive queries. When it tries
to execute the above ADD JAR or ADD FILE commands with a filepath on the JDBC
client's local filesystem, it gets an error. But when it gives a path local to
the HiveServer2 filesystem, it works.

What are my options here? One obvious one is to scp the files to the
HiveServer2 host's filesystem, say to /tmp, and then do ADD JAR.
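
Roughly like this (host and file names are placeholders):

  # Copy the generated jar to the HiveServer2 host ...
  scp /local/path/myudfs.jar hiveserver2-host:/tmp/myudfs.jar
  # ... and then, in the JDBC session, add it with the server-side path:
  #   ADD JAR /tmp/myudfs.jar;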

Hive 1.2.0 allows Ivy URLs. I don't think I can use that, since my application
generates the files and I don't want to set up an Ivy repository to which I
would need to copy the files before running ADD JAR.

Isn't there any way to specify paths on the client's filesystem?

-Kiran


RE: Urgent : Issue with hive installation on Redhat linux 64bit

2015-07-09 Thread Payal Radheshamji Agrawal
Hi Ravi

Run “hive -hiveconf hive.root.logger=DEBUG,console” to start hive debug mode.

Thanks
Payal

From: Ravi Kumar Jain 03 [mailto:ravi_jai...@infosys.com]
Sent: Thursday, July 09, 2015 11:29 AM
To: user@hive.apache.org
Subject: RE: Urgent : Issue with hive installation on Redhat linux 64bit

Hi,

We are using only JDK 1.7 with Hive 1.2.0. As per the link below, Hive 1.2 is
supported on JDK 1.7.
https://cwiki.apache.org/confluence/display/Hive/GettingStarted#GettingStarted-Requirements

[attached screenshot: image001.png]


Following is the output while executing hive in debug mode; nothing further
appears on the prompt:
[attached screenshot: image002.png]


Regards,
Ravi Jain
Infosys Limited

From: Owen O'Malley [mailto:omal...@apache.org]
Sent: Thursday, July 09, 2015 10:47 AM
To: user@hive.apache.org
Subject: Re: Urgent : Issue with hive installation on Redhat linux 64bit

Based on the answer here:

http://stackoverflow.com/a/1096159/2301201

You must be trying to use a jdk older than java 1.7. Run the hive script with 
bash debugging turned on to see which jdk it is using.
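
For example (a sketch; the exact script location and output depend on your
install):

  # Trace the hive wrapper script with bash debugging to see how it picks java:
  bash -x $(which hive) 2>&1 | grep -i -m5 java
  # And check which JVM is the default on that machine:
  java -version
  echo $JAVA_HOME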

.. Owen

On Wed, Jul 8, 2015 at 9:56 PM, Ravi Kumar Jain 03 
ravi_jai...@infosys.com wrote:
Hello All,

We are facing the following issue while running Hive on a Linux operating system:

Exception in thread main java.lang.UnsupportedClassVersionError: 
org/apache/hadoop/hive/cli/CliDriver : Unsupported major.minor version 51.0
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClassCond(ClassLoader.java:632)
at java.lang.ClassLoader.defineClass(ClassLoader.java:616)
at 
java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:296)
at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:247)
at org.apache.hadoop.util.RunJar.run(RunJar.java:214)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)


Environment Details :
Java : Jdk1.7.0_11
Hadoop : 2.6
Hive : 1.2.0

Kindly provide us pointers to resolve the issue as soon as possible.

Regards,
Ravi Jain
Infosys Limited







RE: Urgent : Issue with hive installation on Redhat linux 64bit

2015-07-09 Thread Ravi Kumar Jain 03
Hi,

We are using only JDK 1.7 with Hive 1.2.0. As per the link below, Hive 1.2 is
supported on JDK 1.7.
https://cwiki.apache.org/confluence/display/Hive/GettingStarted#GettingStarted-Requirements

[attached screenshot: image001.png]


Following is the output while executing hive in debug mode; nothing further
appears on the prompt:
[attached screenshot: image002.png]


Regards,
Ravi Jain
Infosys Limited

From: Owen O'Malley [mailto:omal...@apache.org]
Sent: Thursday, July 09, 2015 10:47 AM
To: user@hive.apache.org
Subject: Re: Urgent : Issue with hive installation on Redhat linux 64bit

Based on the answer here:

http://stackoverflow.com/a/1096159/2301201

You must be trying to use a jdk older than java 1.7. Run the hive script with 
bash debugging turned on to see which jdk it is using.

.. Owen

On Wed, Jul 8, 2015 at 9:56 PM, Ravi Kumar Jain 03 
ravi_jai...@infosys.com wrote:
Hello All,

We are facing the following issue while running Hive on a Linux operating system:

Exception in thread main java.lang.UnsupportedClassVersionError: 
org/apache/hadoop/hive/cli/CliDriver : Unsupported major.minor version 51.0
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClassCond(ClassLoader.java:632)
at java.lang.ClassLoader.defineClass(ClassLoader.java:616)
at 
java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:296)
at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:247)
at org.apache.hadoop.util.RunJar.run(RunJar.java:214)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)


Environment Details :
Java : Jdk1.7.0_11
Hadoop : 2.6
Hive : 1.2.0

Kindly provide us pointers to resolve the issue as soon as possible.

Regards,
Ravi Jain
Infosys Limited







Re: alter table add column

2015-07-09 Thread Noam Hasson
If you want to add data to already existing rows, you'll have to insert the
data again using INSERT OVERWRITE; perhaps it's better to insert into a new
table.
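
A rough sketch of what that can look like for a partitioned table (table,
column and partition names here are made up):

  # Add the column, then rewrite the affected partition so the new column is
  # populated for the existing rows:
  hive -e "
    ALTER TABLE my_table ADD COLUMNS (new_col STRING);
    INSERT OVERWRITE TABLE my_table PARTITION (dt='2015-07-09')
    SELECT col1, col2, 'backfilled value' AS new_col
    FROM my_table
    WHERE dt='2015-07-09';
  "

Depending on the Hive version, pre-existing partitions may also need their
metadata refreshed before the new column shows up for them, which is part of
why writing into a fresh table can be the simpler route.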

On Wed, Jul 8, 2015 at 11:57 PM, Mona Meena dr@hotmail.com wrote:

 Hi,

 I have a partitioned table. Is it possible to alter this table by adding a
 new column and also to update the table by inserting data into the new
 column? I know how to add a new column but have no idea how to insert data
 into the new column. Any suggestions, please?

 BR,
 Mona




RE: Urgent : Issue with hive installation on Redhat linux 64bit

2015-07-09 Thread Ravi Kumar Jain 03
Hello Payal,

While executing the following command for debug mode, I am still getting the
same error ("java.lang.UnsupportedClassVersionError").

When I look at the manifest file of hive-cli-1.2.0.jar, it contains version
1.7.0_79, and I am using Java version 1.7.0_11. This might be the reason for
the error, but I think this error is thrown only when the JDK major version
is mismatched, not due to an update-level mismatch.
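
A rough way to cross-check this (the jar path is only illustrative):
"major.minor version 51.0" corresponds to Java 7 class files, so the update
level (_11 vs _79) should not matter as long as a 1.7 JVM is actually the one
being used.

  # Show the class file version that CliDriver was compiled for:
  javap -verbose -cp hive-cli-1.2.0.jar org.apache.hadoop.hive.cli.CliDriver \
    | grep 'major version'
  # Confirm the JVM that the hive script actually ends up invoking:
  java -version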

Regards,
Ravi Jain

From: Payal Radheshamji Agrawal [mailto:payal.agra...@datametica.com]
Sent: Thursday, July 09, 2015 12:37 PM
To: user@hive.apache.org
Subject: RE: Urgent : Issue with hive installation on Redhat linux 64bit

Hi Ravi

Run “hive -hiveconf hive.root.logger=DEBUG,console” to start hive debug mode.

Thanks
Payal

From: Ravi Kumar Jain 03 [mailto:ravi_jai...@infosys.com]
Sent: Thursday, July 09, 2015 11:29 AM
To: user@hive.apache.org
Subject: RE: Urgent : Issue with hive installation on Redhat linux 64bit

Hi,

We are using only JDK 1.7 with Hive 1.2.0. As per the link below, Hive 1.2 is
supported on JDK 1.7.
https://cwiki.apache.org/confluence/display/Hive/GettingStarted#GettingStarted-Requirements

[attached screenshot: image001.png]


Following is the output while executing hive in debug mode; nothing further
appears on the prompt:
[attached screenshot: image002.png]


Regards,
Ravi Jain
Infosys Limited

From: Owen O'Malley [mailto:omal...@apache.org]
Sent: Thursday, July 09, 2015 10:47 AM
To: user@hive.apache.org
Subject: Re: Urgent : Issue with hive installation on Redhat linux 64bit

Based on the answer here:

http://stackoverflow.com/a/1096159/2301201

You must be trying to use a jdk older than java 1.7. Run the hive script with 
bash debugging turned on to see which jdk it is using.

.. Owen

On Wed, Jul 8, 2015 at 9:56 PM, Ravi Kumar Jain 03 
ravi_jai...@infosys.com wrote:
Hello All,

We are facing the following issue while running Hive on a Linux operating system:

Exception in thread main java.lang.UnsupportedClassVersionError: 
org/apache/hadoop/hive/cli/CliDriver : Unsupported major.minor version 51.0
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClassCond(ClassLoader.java:632)
at java.lang.ClassLoader.defineClass(ClassLoader.java:616)
at 
java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:296)
at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:247)
at org.apache.hadoop.util.RunJar.run(RunJar.java:214)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)


Environment Details :
Java : Jdk1.7.0_11
Hadoop : 2.6
Hive : 1.2.0

Kindly provide us pointers to resolve the issue as soon as possible.

Regards,
Ravi Jain
Infosys Limited







RE: Add file/jar location for jdbc clients

2015-07-09 Thread Ajeet O
Hi All,
When I am trying to run select count(*) from table_name; I am getting the
error given below. What could be the reason? Please help.

hive> select count(*) from u_data;
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=number
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=number
In order to set a constant number of reducers:
  set mapred.reduce.tasks=number
java.net.UnknownHostException: 01hw357381.tcsgegdc.com: 
01hw357381.tcsgegdc.com: unknown error
at java.net.InetAddress.getLocalHost(InetAddress.java:1484)
at 
org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:439)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1296)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1293)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1293)
at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at 
org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
at 
org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
at 
org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:425)
at 
org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:144)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:151)
at 
org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:65)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1414)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1192)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1020)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:888)
at 
org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
at 
org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
at 
org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:781)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.net.UnknownHostException: 01hw357381.tcsgegdc.com: unknown 
error
at java.net.Inet4AddressImpl.lookupAllHostAddr(Native Method)
at java.net.InetAddress$2.lookupAllHostAddr(InetAddress.java:907)
at 
java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1302)
at java.net.InetAddress.getLocalHost(InetAddress.java:1479)
... 34 more
Job Submission failed with exception 
'java.net.UnknownHostException(01hw357381.tcsgegdc.com: 
01hw357381.tcsgegdc.com: unknown error)'
FAILED: Execution Error, return code 1 from 
org.apache.hadoop.hive.ql.exec.mr.MapRedTask
hive



Regards
Ajeet Ojha
Tata Consultancy Services
Ph:- 0120 6744953
Buzz:- 412-4953
Cell:- 9811220828
Mailto: ajee...@tcs.com
Website: http://www.tcs.com






From:   Lonikar, Kiran kloni...@informatica.com
To: user@hive.apache.org user@hive.apache.org
Date:   07/09/2015 02:36 PM
Subject:RE: Add file/jar location for jdbc clients



Never mind... It works for HDFS. I can copy to HDFS and specify the HDFS 
file path as hdfs://namenode/path/to/file.

-Kiran

-Original Message-
From: Lonikar, Kiran [mailto:kloni...@informatica.com] 
Sent: Wednesday, July 08, 2015 9:01 PM
To: user@hive.apache.org
Subject: Add file/jar location for jdbc clients

Hi,

The add jar/file syntax stated on hive language manual is as below:

ADD { FILE[S] | JAR[S] | ARCHIVE[S] } filepath1 [filepath2]*

I have an application which remotely connects to