I tried testing Hadoop ingestion with Java 11 using the quickstart tutorial, but
ran into the following issue on the peon before it even got to Hadoop:


2019-08-28T22:14:56,959 ERROR [task-runner-0-priority-0] org.apache.druid.indexing.common.task.HadoopIndexTask - Got invocation target exception in run(), cause:
java.lang.NoClassDefFoundError: javax/script/ScriptEngineManager
	at org.apache.logging.log4j.core.script.ScriptManager.<init>(ScriptManager.java:49) ~[log4j-core-2.5.jar:2.5]
	at org.apache.logging.log4j.core.config.AbstractConfiguration.initialize(AbstractConfiguration.java:179) ~[log4j-core-2.5.jar:2.5]
	at org.apache.logging.log4j.core.config.AbstractConfiguration.start(AbstractConfiguration.java:209) ~[log4j-core-2.5.jar:2.5]
	at org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:492) ~[log4j-core-2.5.jar:2.5]
	at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:562) ~[log4j-core-2.5.jar:2.5]
	at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:578) ~[log4j-core-2.5.jar:2.5]
	at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:214) ~[log4j-core-2.5.jar:2.5]
	at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:145) ~[log4j-core-2.5.jar:2.5]
	at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:41) ~[log4j-core-2.5.jar:2.5]
	at org.apache.logging.log4j.LogManager.getContext(LogManager.java:182) ~[log4j-api-2.5.jar:2.5]
	at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:103) ~[log4j-api-2.5.jar:2.5]
	at org.apache.logging.slf4j.Log4jLoggerFactory.getContext(Log4jLoggerFactory.java:43) ~[log4j-slf4j-impl-2.5.jar:2.5]
	at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:42) ~[log4j-api-2.5.jar:2.5]
	at org.apache.logging.slf4j.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:29) ~[log4j-slf4j-impl-2.5.jar:2.5]
	at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:358) ~[slf4j-api-1.7.25.jar:1.7.25]
	at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:383) ~[slf4j-api-1.7.25.jar:1.7.25]
	at org.apache.druid.java.util.common.logger.Logger.<init>(Logger.java:38) [druid-core-0.16.0-incubating-SNAPSHOT.jar:0.16.0-incubating-SNAPSHOT]
	at org.apache.druid.guice.PropertiesModule.<clinit>(PropertiesModule.java:42) ~[druid-processing-0.16.0-incubating-SNAPSHOT.jar:0.16.0-incubating-SNAPSHOT]
	at org.apache.druid.guice.GuiceInjectors.makeDefaultStartupModules(GuiceInjectors.java:39) ~[druid-processing-0.16.0-incubating-SNAPSHOT.jar:0.16.0-incubating-SNAPSHOT]
	at org.apache.druid.guice.GuiceInjectors.makeStartupInjector(GuiceInjectors.java:56) ~[druid-processing-0.16.0-incubating-SNAPSHOT.jar:0.16.0-incubating-SNAPSHOT]
	at org.apache.druid.indexer.HadoopDruidIndexerConfig.<clinit>(HadoopDruidIndexerConfig.java:103) ~[druid-indexing-hadoop-0.16.0-incubating-SNAPSHOT.jar:0.16.0-incubating-SNAPSHOT]
	at org.apache.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessingRunner.runTask(HadoopIndexTask.java:644) ~[druid-indexing-service-0.16.0-incubating-SNAPSHOT.jar:0.16.0-incubating-SNAPSHOT]
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
	at java.base/java.lang.reflect.Method.invoke(Method.java:566) ~[?:?]
	at org.apache.druid.indexing.common.task.HadoopIndexTask.runInternal(HadoopIndexTask.java:353) ~[druid-indexing-service-0.16.0-incubating-SNAPSHOT.jar:0.16.0-incubating-SNAPSHOT]
	at org.apache.druid.indexing.common.task.HadoopIndexTask.runTask(HadoopIndexTask.java:287) [druid-indexing-service-0.16.0-incubating-SNAPSHOT.jar:0.16.0-incubating-SNAPSHOT]
	at org.apache.druid.indexing.common.task.AbstractBatchIndexTask.run(AbstractBatchIndexTask.java:137) [druid-indexing-service-0.16.0-incubating-SNAPSHOT.jar:0.16.0-incubating-SNAPSHOT]
	at org.apache.druid.indexing.overlord.SingleTaskBackgroundRunner$SingleTaskBackgroundRunnerCallable.call(SingleTaskBackgroundRunner.java:419) [druid-indexing-service-0.16.0-incubating-SNAPSHOT.jar:0.16.0-incubating-SNAPSHOT]
	at org.apache.druid.indexing.overlord.SingleTaskBackgroundRunner$SingleTaskBackgroundRunnerCallable.call(SingleTaskBackgroundRunner.java:391) [druid-indexing-service-0.16.0-incubating-SNAPSHOT.jar:0.16.0-incubating-SNAPSHOT]
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
	at java.base/java.lang.Thread.run(Thread.java:834) [?:?]
Caused by: java.lang.ClassNotFoundException: javax.script.ScriptEngineManager
	at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:471) ~[?:?]
	at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:588) ~[?:?]
	at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521) ~[?:?]
	... 35 more
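For context on the ClassNotFoundException: javax.script.ScriptEngineManager lives in the java.scripting module, which on Java 9+ is served by the platform class loader rather than the boot class path. Below is a minimal sketch of how an isolated classloader (such as one a forked task might construct) can lose sight of that class. The JDK class names are real; the parenting choice is my assumption about the failure mode, not something this trace proves:

```java
import java.net.URL;
import java.net.URLClassLoader;

public class ScriptModuleCheck {
    public static void main(String[] args) throws Exception {
        // Parent = null means the bootstrap loader, which on Java 11 cannot
        // see the java.scripting platform module, so this lookup fails with
        // ClassNotFoundException, matching the cause in the trace above.
        ClassLoader isolated = new URLClassLoader(new URL[0], null);
        boolean failed = false;
        try {
            isolated.loadClass("javax.script.ScriptEngineManager");
        } catch (ClassNotFoundException e) {
            failed = true;
        }
        System.out.println("isolated loader failed: " + failed);

        // Parenting to the platform class loader restores visibility.
        ClassLoader parented = new URLClassLoader(
                new URL[0], ClassLoader.getPlatformClassLoader());
        Class<?> c = parented.loadClass("javax.script.ScriptEngineManager");
        System.out.println("platform-parented loader found: " + c.getName());
    }
}
```

If that reading is right, no amount of `--add-modules` on the launching JVM would help a child classloader that does not delegate to the platform loader.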


https://stackoverflow.com/questions/53714010/log4j2-slf4j-and-java-11 suggests
adding '--add-modules java.scripting', but it did not appear to help when added
to the peons' java opts. The problem is not that the java opts fail to accept
module arguments: I was able to successfully pass the option that makes JVM GC
metrics work. Running the task with the new 'indexer' service type, where tasks
are threads instead of forked processes, was not successful with this option
set either. I haven't dug in any further yet, but I wouldn't personally
consider this a deal-breaker for treating Java 9+ support as experimental;
it's just a known issue.
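For reference, a sketch of how the flag can be supplied to forked peons. The property name is from Druid's documented middleManager configuration; the surrounding opts are illustrative placeholders, not the exact values used in this test:

```properties
# middleManager runtime.properties (illustrative values)
# javaOptsArray entries are appended to the peon command line when tasks fork
druid.indexer.runner.javaOptsArray=["-server","-Xmx1g","--add-modules","java.scripting"]
```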


On Tue, Aug 27, 2019 at 1:13 PM Xavier Léauté <xav...@confluent.io> wrote:

> >
> > - the level of testing that has been done so far by community members
> > (including of various extensions and ingestion methods); related, it
> would
> > be helpful to know if anyone has been running Druid in production on Java
> > 11
> >
>
> I have not actually run Druid with Java 11 yet.
> Most of my efforts so far have been to get unit tests to pass.
>
>
> > - if there are any known remaining issues that need to be addressed
> >
>
> The next step would be to upgrade our integration test framework,
> and run integration tests with Java 11.
>
> - in which release does the community feel comfortable declaring official
> > Java 11 support
> >
>
> If we get integration tests to pass before 0.16, we can ask people to try
> it out and probably mark Java 11 support as experimental.
>
>
> > Xavier, do you know
> > of any significant areas of the codebase that need to be looked at before
> > we can declare Java 11 support? What about issues/restrictions around
> > Hadoop integration?
>
>
> On the Hadoop integration side we would need help from someone running
> Hadoop. We should at least verify that Druid running Java 11 can
> successfully submit tasks to a Hadoop cluster, and make sure there aren't
> any odd classpath issues. Hadoop itself doesn't support Java 11 yet, so the
> jobs themselves would still run with Java 8.
>
