Re: For Apache Hive HS2, what is the largest heap size setting that works well?

2017-11-28 Thread Jörn Franke
I also recommend it; you will also get performance improvements with JDK8 in general (use the latest version). Keep in mind as well that more and more big data libraries etc. will drop JDK7 support soon (aside from the fact that JDK7 is no longer maintained anyway). > On 29. Nov 2017, at 01:31, Johannes

Re: For Apache Hive HS2, what is the largest heap size setting that works well?

2017-11-28 Thread Johannes Alberti
Yes, I would recommend going to Java 8, giving it a shot with G1, and reporting back :) Sent from my iPhone > On Nov 28, 2017, at 3:30 PM, Sharanya Santhanam > wrote: > > Hi Johannes, > > We are running on Java version jdk1.7.0_67. We are using >

"IS [NOT] NULL for a complex type"

2017-11-28 Thread Jin Chul Kim
Hi, May I ask about the meaning of IS [NOT] NULL for a complex type such as STRUCT? As far as I know, we cannot assign NULL to a struct directly, so I expected one of these: 1) NULL is returned if any of the elements in the struct is NULL 2) NULL is returned if all of the elements in the struct are NULL By the way, my
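
A minimal repro sketch of the question, run through the Hive CLI (the table t and its columns are hypothetical):

  hive -e "
    CREATE TABLE t (s STRUCT<a:INT, b:STRING>);
    INSERT INTO TABLE t SELECT named_struct('a', 1, 'b', CAST(NULL AS STRING));
    -- Does a struct with a NULL field count as NULL itself?
    SELECT s IS NULL, s IS NOT NULL FROM t;
  "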

Re: For Apache Hive HS2, what is the largest heap size setting that works well?

2017-11-28 Thread Sharanya Santhanam
Hi Johannes, We are running on Java version jdk1.7.0_67. We are using ConcurrentMarkSweep (CMS). Would you recommend using G1GC? These are our current settings: -XX:NewRatio=8 -XX:+UseParNewGC -XX:-UseGCOverheadLimit -XX:PermSize=256m -Xloggc:<> -XX:HeapDumpPath=oom -XX:+PrintGCDetails
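
For comparison, a hedged sketch of what a G1 configuration on JDK8 might look like in hive-env.sh, replacing the ParNew/CMS flags above (the values and log path are illustrative placeholders, not tuned recommendations):

  # -XX:PermSize is ignored on JDK8, where Metaspace replaces PermGen.
  export HADOOP_OPTS="$HADOOP_OPTS -XX:+UseG1GC \
    -XX:MaxGCPauseMillis=200 \
    -XX:MaxMetaspaceSize=256m \
    -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=oom \
    -XX:+PrintGCDetails -Xloggc:/var/log/hive/hs2-gc.log"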

Re: For Apache Hive HS2, what is the largest heap size setting that works well?

2017-11-28 Thread Johannes Alberti
Hi Sharanya, Can you share your current GC settings and Java version? Are you using Java 8/9 with G1 already? Regards, Johannes Sent from my iPhone > On Nov 28, 2017, at 12:57 PM, Sharanya Santhanam > wrote: > > Hello, > > I am currently trying to upgrade Hive

[no subject]

2017-11-28 Thread Angel Francisco orta
Unsubscribe

For Apache Hive HS2, what is the largest heap size setting that works well?

2017-11-28 Thread Sharanya Santhanam
Hello, I am currently trying to upgrade the Hive version on our prod clusters from v1.2 to v2.1. We also want to adopt HS2 on the newly upgraded cluster. Earlier, all queries were submitted via the Hive CLI. I would like to understand how large a single HS2 heap can be. And is there any formula to
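
A related data point: heaps below roughly 32 GB keep compressed oops enabled on 64-bit JVMs, which is one common reason to cap a single HS2 heap around there and scale out with multiple HS2 instances instead. A hedged sketch of setting the HS2 heap in hive-env.sh (the sizes are illustrative, not recommendations; the $SERVICE check follows the pattern in hive-env.sh.template):

  if [ "$SERVICE" = "hiveserver2" ]; then
    export HADOOP_OPTS="$HADOOP_OPTS -Xms8g -Xmx24g"
  fi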

RE: Can't have Hive running with Spark

2017-11-28 Thread stephane.davy
So, I think I’ve made some progress, but it is still not working. - I’ve fixed the RPC issue by putting my hive-site.xml file on all Spark nodes in the spark/conf directory - I’ve downgraded to Spark 2.0.2 - But I’m getting this error in the Hive server logs: Query
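
A minimal sketch of the hive-site.xml distribution step described above (hostnames and paths are illustrative):

  for node in spark-node-1 spark-node-2 spark-node-3; do
    scp /etc/hive/conf/hive-site.xml "${node}:/opt/spark/conf/"
  done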

Re: migrate hive cli to beeline

2017-11-28 Thread eric wong
Sounds reasonable to me. You should use as new a Hive version as possible; the memory leak issue is tracked in Hive's JIRA. In our production, HiveServer2 2.1.1 can run for a long lifetime. 2017-11-22 15:04 GMT+08:00 游垂凯: > Hello everyone: > Recently, I want to
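
For anyone following the same migration, a hedged sketch of replacing a Hive CLI invocation with its beeline equivalent (host, database, and user are illustrative; 10000 is the default HS2 binary port):

  # Before: hive -e "SHOW TABLES;" via the deprecated CLI
  # After: beeline connecting through HiveServer2
  beeline -u "jdbc:hive2://hs2-host:10000/default" -n hive_user -e "SHOW TABLES;"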

RE: Hive+Tez+LLAP does not show an obvious performance improvement over Hive+Tez

2017-11-28 Thread Jia, Ke A
Hi Gopal, > I have upgraded the Hive version to 3.0, and the somaxconn value of the shuffle > port (15551) is now 16384, not 50. Thank you very much. > But I encounter the following problem when running LLAP, and it is the same as > https://issues.apache.org/jira/browse/HIVE-10693 . Whether it is a bug of >
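
A diagnostic sketch for checking the kernel listen backlog behind that shuffle port (generic Linux commands, not LLAP-specific tooling):

  sysctl net.core.somaxconn
  # Raise it persistently if it is still at a low default:
  echo 'net.core.somaxconn=16384' | sudo tee -a /etc/sysctl.conf
  sudo sysctl -p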

RE: Can't have Hive running with Spark

2017-11-28 Thread stephane.davy
Thanks Takiar, So, do you suggest I stick with Spark 1.6 or Spark 2.0.0? Which Hive version is the most appropriate? Stéphane From: Sahil Takiar [mailto:takiar.sa...@gmail.com] Sent: Monday, November 27, 2017 18:20 To: user@hive.apache.org Subject: Re: Can't have Hive running with Spark