Well, partly. As long as Druid doesn't actually create off-heap Memory, it
works OK with JDK 11. Compiling with JDK 11 currently shows errors, but
those errors can be ignored as long as the affected code paths don't create
off-heap Memory, which they do not.
This is an issue we will begin addressing soon.
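For context, "off-heap Memory" here means direct (native) memory allocated outside the JVM heap. A minimal illustration of the concept (my own sketch using plain `java.nio.ByteBuffer`, not Druid's or DataSketches' actual code path, which uses lower-level internal JDK APIs and is what runs into trouble on JDK 11):

```java
import java.nio.ByteBuffer;

public class OffHeapExample {
    public static void main(String[] args) {
        // Allocate 1 KiB of off-heap (direct) memory, outside the JVM heap.
        // Libraries like DataSketches Memory go further and touch internal
        // JDK APIs, which is what triggers the JDK 11 compile errors above.
        ByteBuffer direct = ByteBuffer.allocateDirect(1024);
        direct.putLong(0, 42L);
        System.out.println(direct.isDirect());   // true
        System.out.println(direct.getLong(0));   // 42
    }
}
```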
My understanding was that the Apache DataSketches project was still having
issues with Java 11 and direct-memory support in their sketch
implementations. That would impact Druid's use of DataSketches, and possibly
Druid's use of direct memory as well. Does anyone know whether the
DataSketches issues
Build Update for apache/druid
-
Build: #29230
Status: Fixed
Duration: 4 mins and 25 secs
Commit: 8366717 (master)
Author: Lucas Capistrant
Message: Add missing coordinator dynamic config to the web-console dialog for
dynamic coordinator config (#10545)
* Add
There are many projects that integrate with Druid and use the jars produced
by Druid releases. Druid itself has a Hadoop-based batch ingestion feature
where it ships its own jars over to Hadoop and the code is executed on the
Hadoop cluster. For these integrations to continue working, it is essential that
I have no objections to dropping support for JDK 8, but why support only
JDK 11? What would it take to support JDK 11 and all newer JDKs?
There's an analogy with starting and stopping a train: once you have
stopped a train (settled on a particular JDK release for a number of years),
starting it is
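On the question of supporting JDK 11 and all newer JDKs: one common approach (a sketch only, not Druid's actual build configuration) is javac's `--release` flag, which Maven exposes as a property, so the build compiles against a fixed API baseline regardless of which newer JDK runs the build:

```xml
<!-- Hypothetical pom.xml fragment: compile against the JDK 11 API
     baseline even when building with a newer JDK, so newer JDKs
     keep working without separate build profiles. -->
<properties>
  <maven.compiler.release>11</maven.compiler.release>
</properties>
```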
Hi Druids!
I've been thinking about what it would take for us to remove support for
JDK 8 in the 0.21 release and officially add support for JDK 11.
I see that unit tests for JDK 11 were added over a year ago, in Aug 2019:
https://github.com/apache/druid/pull/8400
And integration tests were
Hello,
This info about q0 and q1 is good to know; I will use it, thank you!
As a user, in order to plot the histogram I would be glad to get the split
points alongside the histogram values.
It would be just as useful to derive them from `numBins` or `splitPoints` for
consistency: when I need to
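To make the request concrete, here is a hypothetical helper (my own sketch; `splitPoints`, `min`, `max`, and `numBins` are illustrative names, not an existing Druid or DataSketches API) that derives the equally spaced split points a histogram would use from `numBins`, so they could be returned alongside the bin counts for plotting:

```java
public class SplitPoints {
    // Given the value range [min, max] and a bin count, return the
    // numBins - 1 interior bin boundaries of an equal-width histogram.
    static double[] splitPoints(double min, double max, int numBins) {
        double[] points = new double[numBins - 1];
        double width = (max - min) / numBins;
        for (int i = 0; i < points.length; i++) {
            points[i] = min + width * (i + 1);
        }
        return points;
    }

    public static void main(String[] args) {
        // 4 bins over [0, 100] -> boundaries at 25, 50, 75
        for (double p : splitPoints(0.0, 100.0, 4)) {
            System.out.println(p);
        }
    }
}
```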