I also recommend it; you will also get general performance improvements with
JDK8 (use the latest version).
Also keep in mind that more and more big data libraries will drop JDK7
support soon (aside from the fact that JDK7 is no longer maintained anyway).
> On 29. Nov 2017, at 01:31, Johannes
Yes, I would recommend moving to Java 8, giving it a shot with G1, and
reporting back :)
Sent from my iPhone
> On Nov 28, 2017, at 3:30 PM, Sharanya Santhanam
> wrote:
>
> Hi Johannes,
>
> We are running on Java version jdk1.7.0_67. We are using
>
Hi,
May I know the meaning of IS [NOT] NULL for a complex type such as STRUCT?
As far as I know, we cannot assign NULL to a struct directly.
So I would expect one of the following behaviors:
1) NULL is returned if any of the elements in the struct is NULL
2) NULL is returned if all of the elements in the struct are NULL
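One way to pin down the actual behavior is to experiment directly. A minimal sketch, assuming a hypothetical table `t` (I have not verified what Hive returns here, so no expected results are shown):

```sql
-- Hypothetical table with a STRUCT column for probing IS [NOT] NULL.
CREATE TABLE t (s STRUCT<a: INT, b: STRING>);

-- A row whose struct has one NULL field (named_struct builds the value).
INSERT INTO TABLE t SELECT named_struct('a', 1, 'b', CAST(NULL AS STRING));

-- Compare the predicate on the struct itself vs. on its fields.
SELECT s IS NULL, s.a IS NULL, s.b IS NULL FROM t;
```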
By the way, my
Hi Johannes,
We are running on Java version jdk1.7.0_67. We are using
ConcurrentMarkSweep (CMS). Would you recommend using G1GC?
These are our current settings
-XX:NewRatio=8 -XX:+UseParNewGC -XX:-UseGCOverheadLimit -XX:PermSize=256m
-Xloggc:<> -XX:HeapDumpPath=oom -XX:+PrintGCDetails
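For comparison, a G1 configuration on Java 8 might look like the sketch below. The heap size, pause target, and the `HIVE_JVM_OPTS` variable name are placeholders and assumptions, not recommendations:

```shell
# Illustrative G1 settings for a Java 8 HiveServer2; heap and pause-time
# values are placeholders. G1 replaces the ParNew/CMS pair (so UseParNewGC
# and NewRatio are dropped), and PermGen is gone in Java 8, so PermSize
# becomes MaxMetaspaceSize.
HIVE_JVM_OPTS="-Xms8g -Xmx8g -XX:+UseG1GC -XX:MaxGCPauseMillis=200 \
-XX:MaxMetaspaceSize=256m -XX:+PrintGCDetails -Xloggc:/var/log/hive/gc.log"
```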
Hi Sharanya,
Can you share your current GC settings and Java version? Are you using Java 8/9
with G1 already?
Regards,
Johannes
Sent from my iPhone
> On Nov 28, 2017, at 12:57 PM, Sharanya Santhanam
> wrote:
>
> Hello,
>
> I am currently trying to upgrade hive
Hello,
I am currently trying to upgrade the Hive version on our prod clusters from
v1.2 to v2.1.
We also want to adopt HS2 on the new upgraded cluster. Earlier, all queries
were submitted via the Hive CLI.
I would like to understand how large a single HS2 heap can be. And is
there any formula to
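I'm not aware of an official sizing formula in the Hive docs. As a purely illustrative back-of-envelope sketch (the baseline and per-query constants below are assumptions, not Hive guidance), one might reason:

```java
// Purely illustrative heap estimate for HiveServer2: a fixed baseline
// plus a per-concurrent-query allowance. The constants are assumptions,
// not values from Hive documentation; measure with GC logs in practice.
public class Hs2HeapSketch {
    static long estimateHeapMb(int concurrentQueries, long perQueryMb, long baselineMb) {
        return baselineMb + (long) concurrentQueries * perQueryMb;
    }

    public static void main(String[] args) {
        // e.g. 50 concurrent queries at ~200 MB each on a 2 GB baseline
        System.out.println(estimateHeapMb(50, 200, 2048) + " MB");
    }
}
```

The real driver is workload shape (result-set buffering, number of concurrent sessions, plan sizes), so any such number is only a starting point for tuning.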
So, I think I’ve made some progress, but it is still not working.
- I’ve fixed the RPC issue by putting my hive-site.xml file on all
spark nodes in the spark/conf directory
- I’ve downgraded to Spark 2.0.2
- But I’m getting this error in Hive server logs:
Query
That should be reasonable for me.
You should use as new a Hive version as possible; memory leak fixes are
tracked in Hive's JIRA.
In our production, HiveServer2 2.1.1 can run for a long lifetime.
2017-11-22 15:04 GMT+08:00 游垂凯 :
> Hello everyone:
> Recently, I want to
Hi Gopal,
> I have upgraded the Hive version to 3.0, and the somaxconn value of the shuffle
> port (15551) is now 16384 instead of 50. Thank you very much.
> But I encounter the following problem when running LLAP, and it is the same as
> https://issues.apache.org/jira/browse/HIVE-10693. Is it a bug of
>
Thanks Takiar,
So, do you suggest I stick with Spark 1.6 or Spark 2.0.0? Which Hive version is
the most appropriate?
Stéphane
From: Sahil Takiar [mailto:takiar.sa...@gmail.com]
Sent: Monday, November 27, 2017 18:20
To: user@hive.apache.org
Subject: Re: Can't have Hive running with Spark