RE: Add file/jar location for jdbc clients

2015-07-09 Thread Lonikar, Kiran
Never mind... It works for HDFS. I can copy to HDFS and specify the HDFS file 
path as hdfs://namenode/path/to/file.
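A minimal sketch of this workaround (the jar name and staging path below are hypothetical; the file is first copied to HDFS, e.g. with `hdfs dfs -put my-udf.jar /tmp/`):

```sql
-- Reference an HDFS path instead of a client-local path, so that
-- HiveServer2 can fetch the resource regardless of where the JDBC client runs:
ADD JAR hdfs://namenode/tmp/my-udf.jar;
-- Confirm the resource was registered:
LIST JARS;
```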

-Kiran

-Original Message-
From: Lonikar, Kiran [mailto:kloni...@informatica.com] 
Sent: Wednesday, July 08, 2015 9:01 PM
To: user@hive.apache.org
Subject: Add file/jar location for jdbc clients

Hi,

The ADD JAR/FILE syntax stated in the Hive language manual is as follows:

ADD { FILE[S] | JAR[S] | ARCHIVE[S] } filepath1 [filepath2]*
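For example (hypothetical file names), as run from the Hive CLI on the server:

```sql
ADD JAR /tmp/my-udf.jar;
ADD FILES /tmp/lookup1.txt /tmp/lookup2.txt;
```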

I have an application which connects remotely to HiveServer2 over JDBC. It 
needs to add local resources before executing any Hive queries. When it 
tries to execute the ADD JAR or ADD FILE commands above, with the filepath 
being on the JDBC client's local filesystem, it gets an error. But when it 
gives a path local to the HiveServer2 filesystem, it works.

What are my options here? One obvious option is to scp the files to the 
HiveServer2 host's filesystem, say to /tmp, and then do ADD JAR.

Hive 1.2.0 allows Ivy URLs. I don't think I can use that, since my 
application generates the files and I don't want to set up an Ivy repo to 
which I would need to copy the files before running ADD JAR.

Isn't there any way to specify paths on the client's filesystem?

-Kiran


RE: Add file/jar location for jdbc clients

2015-07-09 Thread Ajeet O
Hi All,
When I try to run a select count(*) from table_name; I get the error 
given below. What could be the reason? Please help.

hive> select count(*) from u_data;
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=number
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=number
In order to set a constant number of reducers:
  set mapred.reduce.tasks=number
java.net.UnknownHostException: 01hw357381.tcsgegdc.com: 01hw357381.tcsgegdc.com: unknown error
    at java.net.InetAddress.getLocalHost(InetAddress.java:1484)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:439)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1296)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1293)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1293)
    at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
    at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
    at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
    at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:425)
    at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:144)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:151)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:65)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1414)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1192)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1020)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:888)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:781)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.net.UnknownHostException: 01hw357381.tcsgegdc.com: unknown error
    at java.net.Inet4AddressImpl.lookupAllHostAddr(Native Method)
    at java.net.InetAddress$2.lookupAllHostAddr(InetAddress.java:907)
    at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1302)
    at java.net.InetAddress.getLocalHost(InetAddress.java:1479)
    ... 34 more
Job Submission failed with exception 'java.net.UnknownHostException(01hw357381.tcsgegdc.com: 01hw357381.tcsgegdc.com: unknown error)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
hive>
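For what it's worth, the UnknownHostException above usually means the machine cannot resolve its own hostname (01hw357381.tcsgegdc.com). A common fix, assuming DNS is not available for the host (an assumption, not confirmed in this thread), is an /etc/hosts mapping, where the IP below is a placeholder for the host's actual address:

```
192.168.1.10    01hw357381.tcsgegdc.com    01hw357381
```

Afterwards, `getent hosts 01hw357381.tcsgegdc.com` should return the mapping before re-running the query.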



Regards
Ajeet Ojha
Tata Consultancy Services
Ph:- 0120 6744953
Buzz:- 412-4953
Cell:- 9811220828
Mailto: ajee...@tcs.com
Website: http://www.tcs.com






From:   Lonikar, Kiran kloni...@informatica.com
To: user@hive.apache.org user@hive.apache.org
Date:   07/09/2015 02:36 PM
Subject:RE: Add file/jar location for jdbc clients



Never mind... It works for HDFS. I can copy to HDFS and specify the HDFS 
file path as hdfs://namenode/path/to/file.

-Kiran

-Original Message-
From: Lonikar, Kiran [mailto:kloni...@informatica.com] 
Sent: Wednesday, July 08, 2015 9:01 PM
To: user@hive.apache.org
Subject: Add file/jar location for jdbc clients

Hi,

The add jar/file syntax stated on hive language manual is as below:

ADD { FILE[S] | JAR[S] | ARCHIVE[S] } filepath1 [filepath2]*

I have an application which remotely connects