Re: LDAPS (Secure LDAP) Hive configuration

2016-06-15 Thread Anurag Tangri

Hey Joze,
LDAPS uses a different port, typically 636; the default LDAP port does not work, as 
far as I remember. 

Could you check something along those lines?

Thanks,
Anurag Tangri

Sent from my iPhone

> On Jun 15, 2016, at 3:01 PM, Jose Rozanec  
> wrote:
> 
> Hi, 
> 
> We upgraded to 2.1.0, but we still cannot get it working: we get "LDAP: error 
> code 34 - invalid DN". We double-checked the DN configuration, and the LDAP 
> team agrees it is ok. 
> We then configured the SSL parameters as well (hive.server2.use.SSL, 
> hive.server2.keystore.path, hive.server2.keystore.password) so that Hive 
> would know where the truststore is located and its password, but in that case 
> we get the following error: "SSLException: Unrecognized SSL message, 
> plaintext connection". Our LDAP server does not expose the SSL certificate on 
> the default port (443), but on the one where LDAPS is configured. Could that 
> cause trouble?
> 
> We would value any insight or guidance from those who already worked on this.
> 
> Thanks!
> 
> Joze.
> 
> 
> 
>  
> 
> 2016-06-13 9:45 GMT-03:00 Jose Rozanec :
>> Thank you for the quick response. Will try upgrading to version 2.1.0
>> 
>> Thanks!
>> 
>> 2016-06-13 4:34 GMT-03:00 Oleksiy S :
>>> This issue is fixed here https://issues.apache.org/jira/browse/HIVE-12885 
>>> 
>>>> On Fri, Jun 10, 2016 at 10:41 PM, Jose Rozanec 
>>>>  wrote:
>>>> Hello, 
>>>> 
>>>> We are working on a Hive 2.0.0 cluster, to configure LDAPS authentication, 
>>>> but I get some errors preventing a successful authentication.
>>>> Does anyone have some insight on how to solve this?
>>>> 
>>>> The problem
>>>> The errors we get are (first is most frequent):
>>>> - sun.security.provider.certpath.SunCertPathBuilderException: unable to 
>>>> find valid certification path to requested target
>>>> - javax.naming.InvalidNameException: [LDAP: error code 34 - invalid DN]
>>>> 
>>>> Our config
>>>> We configure the certificate obtaining a jssecacerts file and overriding 
>>>> Java's default at master, as specified in this post.
>>>> 
>>>> hive-site.xml has the following properties:
>>>>   
>>>>  hive.server2.authentication
>>>>  LDAP
>>>>   
>>>>   
>>>> hive.server2.authentication.ldap.url
>>>> ldaps://ip:port
>>>>   
>>>>   
>>>> hive.server2.authentication.ldap.baseDN
>>>> dc=net,dc=com
>>>>   
>>>> 
>>>> Thanks!
>>>> 
>>>> Joze.
>>> 
>>> 
>>> 
>>> -- 
>>> Oleksiy
>> 
> 


Re: "show table" throwing strange error

2013-06-19 Thread Anurag Tangri
Looks like you use MySQL.

Can you check if your MySQL is still up?

And the permissions on your Hive metastore db?

Sent from my iPhone

On Jun 19, 2013, at 6:44 PM, Mohammad Tariq  wrote:

> It actually seems to be ignoring hive-site.xml. No effect of the properties 
> set in hive-site.xml file.
> 
> Warm Regards,
> Tariq
> cloudfront.blogspot.com
> 
> 
> On Thu, Jun 20, 2013 at 7:12 AM, Mohammad Tariq  wrote:
>> It looks OK to me,
>> 
>> 
>> 
>> 
>>   javax.jdo.option.ConnectionURL
>>   
>> jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true
>> 
>> 
>> 
>>   javax.jdo.option.ConnectionDriverName
>>   com.mysql.jdbc.Driver
>> 
>> 
>> 
>>   javax.jdo.option.ConnectionUserName
>>   apache
>> 
>> 
>> 
>>   javax.jdo.option.ConnectionPassword
>>   password
>> 
>> 
>> 
>>   hive.metastore.local
>>   true
>> 
>> 
>> 
>> hive.exec.scratchdir
>> /hadoop/hive-tmp
>> Scratch space for Hive jobs
>> 
>> 
>> 
>> 
>> Anything wrong here?
>> 
>> Thank you.
>> 
>> Warm Regards,
>> Tariq
>> cloudfront.blogspot.com
>> 
>> 
>> On Thu, Jun 20, 2013 at 7:06 AM, Mapred Learn  wrote:
>>> Can you also check your hive-site.xml?
>>> Is it properly formatted, and are the connection strings correct?
>>> 
>>> Sent from my iPhone
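One quick way to rule out a malformed hive-site.xml is to run it through an XML parser. A sketch; the file written below is a stand-in, so point the parser at your real hive-site.xml:

```shell
# Write a minimal stand-in hive-site.xml; substitute your real file
cat > hive-site.xml <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
  </property>
</configuration>
EOF

# A parse failure here (non-zero exit) means the config file itself is broken
python3 -c "import xml.etree.ElementTree as ET; ET.parse('hive-site.xml'); print('well-formed')"
```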
>>> 
>>> On Jun 19, 2013, at 6:30 PM, Mohammad Tariq  wrote:
>>> 
>>>> Hello Anurag,
>>>> 
>>>> Thank you for the quick response. The log file is full of such lines, along 
>>>> with a trace that says it is some parsing-related issue. But the strange 
>>>> thing is that here I can see '\00' but on the CLI it was just ' '. I am 
>>>> wondering what's wrong with show tables;
>>>> 
>>>> line 1:79 character '\00' not supported here
>>>> line 1:80 character '\00' not supported here
>>>> 
>>>>at 
>>>> org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:446)
>>>>at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:416)
>>>>at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:336)
>>>>at org.apache.hadoop.hive.ql.Driver.run(Driver.java:909)
>>>>at 
>>>> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:258)
>>>>at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:215)
>>>>at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:406)
>>>>at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:689)
>>>>at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:557)
>>>>at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>at 
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>at 
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>at java.lang.reflect.Method.invoke(Method.java:601)
>>>>at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>>>> 
>>>> Thanks again.
>>>> 
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
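The '\00' in the log points at NUL bytes that sneaked into the statement, often from copy-pasting out of an editor or document. A coreutils-only sketch of reproducing, detecting, and stripping them; the file names are illustrative:

```shell
# Reproduce: a query file with two trailing NUL bytes
printf 'show tables;\000\000' > query.hql

# Detect: od -c renders NUL bytes as \0
od -c query.hql | grep -qF '\0' && echo "NUL bytes present"

# Strip: tr removes them; the cleaned file is 2 bytes shorter
tr -d '\0' < query.hql > clean.hql
wc -c < query.hql   # 14
wc -c < clean.hql   # 12
```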
>>>> 
>>>> 
>>>> On Thu, Jun 20, 2013 at 6:53 AM, Anurag Tangri  
>>>> wrote:
>>>>> Did you check your hive query log under /tmp to see if it says 
>>>>> anything?
>>>>> 
>>>>> 
>>>>> Sent from my iPhone
>>>>> 
>>>>> On Jun 19, 2013, at 5:53 PM, Mohammad Tariq  wrote:
>>>>> 
>>>>>> Hello list,
>>>>>> 
>>>>>>  I have a Hive (0.9.0) setup on my Ubuntu box running 
>>>>>> hadoop-1.0.4. Everything was going smoothly till now. But today when I 
>>>>>> issued show tables I got a strange error on the CLI. Here is the 
>>>>>> error:
>>>>>> 
>>>>>> hive> show tables;
>>>>>> FAILED: Parse Error: line 1:0 character '' not supported here
>>>>>> line 1:1 character '' not supported here
>>>>>> line 1:2 character '' not supported here
>>>>>> line 1:3 character '' not supported here
>>>>>> line 1:4 character

Re: "show table" throwing strange error

2013-06-19 Thread Anurag Tangri
Did you check your hive query log under /tmp to see if it says anything?


Sent from my iPhone

On Jun 19, 2013, at 5:53 PM, Mohammad Tariq  wrote:

> Hello list,
> 
>  I have a Hive (0.9.0) setup on my Ubuntu box running hadoop-1.0.4. 
> Everything was going smoothly till now. But today when I issued show tables I 
> got a strange error on the CLI. Here is the error:
> 
> hive> show tables;
> FAILED: Parse Error: line 1:0 character '' not supported here
> line 1:1 character '' not supported here
> line 1:2 character '' not supported here
> line 1:3 character '' not supported here
> line 1:4 character '' not supported here
> line 1:5 character '' not supported here
> line 1:6 character '' not supported here
> .
> .
> .
> .
> .
> .
> line 1:378 character '' not supported here
> line 1:379 character '' not supported here
> line 1:380 character '' not supported here
> line 1:381 character '' not supported here
> 
> Strangely, other queries like select foo from pokes where bar = 'tariq'; are 
> working fine. I tried to search over the net but could not find anything 
> useful. Need some help.
> 
> Thank you so much for your time.
> 
> Warm Regards,
> Tariq
> cloudfront.blogspot.com


Re: Hive tmp logs

2013-05-22 Thread Anurag Tangri
Hi,
You can set the query log location property (hive.querylog.location) in your 
hive-site.xml and point it at the directory you want.

Thanks,
Anurag Tangri

Sent from my iPhone
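The relevant property is hive.querylog.location, which by default writes to a per-user directory under /tmp (which matches the /tmp/hadoop path in the question). A sketch of the hive-site.xml fragment, with a placeholder path, plus a quick check that the property landed in the file:

```shell
# Write the query-log property to a snippet file; the value path is a placeholder
cat > querylog-snippet.xml <<'EOF'
<property>
  <name>hive.querylog.location</name>
  <value>/home/hadoop/hive-logs</value>
</property>
EOF

# Verify the property name is present
grep -o '<name>[^<]*</name>' querylog-snippet.xml
```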

On May 22, 2013, at 11:53 AM, Raj Hadoop  wrote:

> Hi,
>  
> My hive job logs are being written to the /tmp/hadoop directory. I want to change 
> it to a different location, i.e. a subdirectory somewhere under the 'hadoop' 
> user home directory.
> How do I change it?
>  
> Thanks,
> Ra


Re: Hive Queries

2013-02-18 Thread Anurag Tangri
Hi Manish,

If you have data on your local file system,
you can also do something like the following, without 
doing a put or copyFromLocal first:

$ hive -e "load data local inpath 'path on local file system' into table ;"


Thanks,
Anurag Tangri
Sent from my iPhone

On Feb 16, 2013, at 6:16 PM, "manishbh...@rocketmail.com" 
 wrote:

> 
> When you want to move data from an external system to Hive, this means moving the 
> data to HDFS first and then pointing the Hive table at the file in HDFS where 
> you have exported the data.
> So, you have a couple of commands, like -copyFromLocal and -put, which move the 
> file to HDFS. If you intend to move it in real-time fashion, try Flume. But at the 
> end of the day the data movement first happens in HDFS, and then the Hive table 
> can be loaded using the LOAD DATA command.
> 
> Regards,
> Manish Bhoge
> sent by HTC device. Excuse typo.
> 
> - Reply message -
> From: "Cyrille Djoko" 
> To: 
> Subject: Hive Queries
> Date: Sat, Feb 16, 2013 1:50 AM
> 
> 
> Hi Jarcec,
> I did try Sqoop. I am running Sqoop 1.4.2 (the hadoop-1.0.0 build) along with Hadoop
> 1.0.4, but I keep running into the following exception.
> 
> Exception in thread "main" java.lang.IncompatibleClassChangeError: Found
> class org.apache.hadoop.mapreduce.JobContext, but interface was expected
> 
> So I wrote a small program but all I can do is send queries to the server.
> > Hi Cyrille,
> > I'm not exactly sure what exactly you mean, so I'm more or less blindly
> > shooting, but maybe Apache Sqoop [1] might help you?
> >
> > Jarcec
> >
> > Links:
> > 1: http://sqoop.apache.org/
> >
> > On Fri, Feb 15, 2013 at 01:44:45PM -0500, Cyrille Djoko wrote:
> >> I am looking for a relatively efficient way of transferring data between a
> >> remote server and Hive without going through the hassle of storing the
> >> data first in memory before loading it into Hive.
> >> From what I have read so far there is no such command, but it would not
> >> hurt to ask.
> >> Is it possible to insert data through an insert query in hive? (The
> >> equivalent to insert into table_name
> >> values (...) in xSQLx)
> >>
> >> Thank you in advance for an answer.
> >>
> >>
> >> Cyrille Djoko
> >> Data Mining Developer Intern
> >>
> >
> 
> 
> Cyrille Djoko
> 
> Agnik LLC
> Data Mining Developer Intern
> 


Re: create a hive table: always a tab space before each line

2013-01-09 Thread Anurag Tangri
Hi Richard,
You should set the format in the create external table command based on the format 
of your data on HDFS.

Is your data a text file or a seq file on HDFS?

Thanks,
Anurag Tangri

Sent from my iPhone
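One plausible explanation for the leading tab, not confirmed in the thread: hadoop fs -text renders each SequenceFile record as key, a tab, then the value, so a record written with an empty key shows up with a tab at the head of the line. A coreutils sketch of what that rendering looks like and how to strip it; the sample data is made up:

```shell
# Simulate the `hadoop fs -text` rendering: empty key, tab, then \001-delimited fields
printf '\tf1\001f2\001f3\n' > sample.txt

# The leading byte really is a tab
od -c sample.txt | head -1

# cut on the default tab delimiter, keeping field 2 onward, recovers just the value
cut -f2- sample.txt
```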

On Jan 9, 2013, at 12:49 AM, Richard   wrote:

> more information:
> 
> if I set the format as textfile, there is no tab space. 
> if I set the format as sequencefile and view the content via hadoop fs -text, 
> I see a tab space at the head of each line.
> 
> At 2013-01-09 15:44:00,Richard  wrote:
> hi there
> 
> I have a problem with creating a hive table.
> no matter what field delimiter I used, I always got a tab space at the head 
> of each line (a line is a record).
> something like this:
> \t f1 \001 f2 \001 f3 ...
> where f1 , f2 , f3 denotes the field value and \001 is the field separator.
> 
> here is the clause I used:
> create external table if not exists ${HIVETBL_my_table}
> (
>   nid string,
>   userid string,
>   spv bigint,
>   sipv bigint,
>   pay bigint,
>   spay bigint,
>   ipv bigint,
>   sellerid string,
>   cate string
> )
> partitioned by(ds string)
> row format delimited fields terminated by '\001' lines terminated by '\n'
> stored as sequencefile
> location '${HADOOP_PATH_4_MY_HIVE}/${HIVETBL_my_table}';
> 
> thanks for help.
> 
> Richard
> 
> 
> 
> 


Re: Latest Pig vs Hive comparisons

2012-09-14 Thread Anurag Tangri
Knowing performance statistics would be good too.

Sent from my iPhone

On Sep 14, 2012, at 10:34 AM, Bharath Mundlapudi  wrote:

> Hello Community,
> 
> Is there any document/blog comparing different features offered by Pig 0.8 
> (0.9, 0.10) or greater and Hive 0.8 (0.9)?
> 
> -Bharath


Re: Hive job fails on hive client even though all map-red stages finish but succeeds on hive server

2012-08-11 Thread Anurag Tangri
I see an exception like:

Moving data to: hdfs://../hive/atangri_test_1
FAILED: Error in metadata: org.apache.thrift.transport.TTransportException:
java.net.SocketException: Connection timed out
FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask



There is enough space in /user and /tmp

Thanks,
Anurag Tangri



On Sat, Aug 11, 2012 at 12:49 AM, Jagat Singh  wrote:

> Hi Anurag,
>
> How much space is there for the /user and /tmp directories on the client?
>
> Did you check that part? Anything that might stop the move task from
> finishing?
>
> ---
> Sent from Mobile , short and crisp.
> On 11-Aug-2012 1:37 PM, "Anurag Tangri"  wrote:
>
>> Hi,
>> We are facing an issue where we run a Hive job over huge input data, about ~6
>> TB.
>>
>> We run this from hive client and hive metastore server is on another
>> machine.
>>
>>
>> If we have smaller input, this job succeeds but for above input size, it
>> fails with error :
>>
>> 2012-08-11 01:34:01,722 Stage-1 map = 100%,  reduce = 100%
>>
>> 2012-08-11 01:35:02,195 Stage-1 map = 100%,  reduce = 100%
>>
>> 2012-08-11 01:36:02,682 Stage-1 map = 100%,  reduce = 100%
>>
>> 2012-08-11 01:37:03,215 Stage-1 map = 100%,  reduce = 100%
>>
>> 2012-08-11 01:38:03,719 Stage-1 map = 100%,  reduce = 100%
>>
>> 2012-08-11 01:39:04,311 Stage-1 map = 100%,  reduce = 100%
>>
>> Ended Job = job_201207072204_34432
>>
>> Loading data to table default.atangri_test_1
>>
>> Failed with exception Unable to fetch table atangri_test_1
>>
>> FAILED: Execution Error, return code 1 from
>> org.apache.hadoop.hive.ql.exec.MoveTask
>>
>>
>> If we have a smaller input (~2 TB), this job succeeds, but for the above input
>> size it fails. We have set
>> hive.metastore.client.socket.timeout to a big value like 86400, but it still
>> fails after about 8-9 hours.
>>
>> Does anyone face the same issue or any pointers ?
>>
>> The job succeeds if it is directly run on hive server.
>>
>> Thanks,
>> Anurag Tangri
>>
>


Re: Best Report Generating tools for hive/hadoop file system

2012-08-01 Thread Anurag Tangri
Cloudera has connectors for MicroStrategy and Tableau.

Looks like Cloudera might have better working versions in the 4.x releases. Worth 
checking.

Datameer is another tool that connects to Hive in their new release and lets you 
analyse data and generate reports and graphs.

Thanks,
Anurag Tangri

Sent from my iPhone

On Aug 1, 2012, at 6:35 AM, "Artem Ervits"  wrote:

> Latest eclipse birt release has Hive and Hadoop connector. 
> 
> 
> Artem Ervits 
> Data Analyst 
> New York Presbyterian Hospital
>  
> From: Techy Teck [mailto:comptechge...@gmail.com] 
> Sent: Tuesday, July 31, 2012 08:46 PM
> To: user@hive.apache.org  
> Subject: Best Report Generating tools for hive/hadoop file system 
>  
> I am looking for open-source report-generating tools for the Hive/Hadoop file 
> system. Can anyone suggest which tool I should use that can connect to 
> Hive tables? 
> This electronic message is intended to be for the use only of the named 
> recipient, and may contain information that is confidential or privileged. If 
> you are not the intended recipient, you are hereby notified that any 
> disclosure, copying, distribution or use of the contents of this message is 
> strictly prohibited. If you have received this message in error or are not 
> the named recipient, please notify us immediately by contacting the sender at 
> the electronic mail address noted above, and delete and destroy all copies of 
> this message. Thank you.
>  
> 