[ https://issues.apache.org/jira/browse/HADOOP-19785?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=18052015#comment-18052015 ]
ASF GitHub Bot commented on HADOOP-19785:
-----------------------------------------
aajisaka commented on PR #8182:
URL: https://github.com/apache/hadoop/pull/8182#issuecomment-3753598322
Hi @zhtttylz, you implemented a new custom Doclet to support JDK 17 in
https://github.com/apache/hadoop/pull/8038, but unfortunately `mvn site`
fails in the Doclet part. The details are below:
1. RootDocProcessor.process(env) creates a Proxy object that implements the
DocletEnvironment interface
2. This proxy is passed to StandardDoclet.run(filtered)
3. Inside the StandardDoclet, the javadoc internals (specifically
WorkArounds.<init>()) try to cast this proxy to the concrete implementation
class jdk.javadoc.internal.tool.DocEnvImpl:
https://github.com/jonathan-gibbons/jdk/blob/bc1e60c1bf91678ef18652a00aa2ce55b0446caa/src/jdk.javadoc/share/classes/jdk/javadoc/internal/doclets/toolkit/WorkArounds.java#L112
4. The cast fails because a dynamic proxy implements only the interface, not
the concrete implementation class (see the sketch after this list)
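To make step 4 concrete, here is a minimal, self-contained sketch. The class
and method names are mine, and the wrapping is much simpler than what
RootDocProcessor actually does, but it shows why a dynamic proxy over
DocletEnvironment can never be cast back to DocEnvImpl:

```java
import java.lang.reflect.Proxy;
import jdk.javadoc.doclet.DocletEnvironment;

public class ProxyCastSketch {

  // Simplified stand-in for RootDocProcessor.process(env): wrap the real
  // environment in a dynamic proxy that implements only the interface.
  static DocletEnvironment filter(DocletEnvironment env) {
    return (DocletEnvironment) Proxy.newProxyInstance(
        DocletEnvironment.class.getClassLoader(),
        new Class<?>[] { DocletEnvironment.class },
        (proxy, method, args) -> method.invoke(env, args)); // delegate as-is
  }

  public static void main(String[] args) {
    // A null delegate is enough here; we never call through the proxy.
    DocletEnvironment filtered = filter(null);
    // Prints something like "class jdk.proxy1.$Proxy0": the proxy's class is
    // generated at runtime and is unrelated to DocEnvImpl, so the cast that
    // WorkArounds.<init>() performs,
    //   DocEnvImpl toolEnv = (DocEnvImpl) filtered;
    // throws exactly the ClassCastException quoted in the issue below.
    System.out.println(filtered.getClass());
  }
}
```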
It seems our custom doclet approach has effectively been prohibited since
https://bugs.openjdk.org/browse/JDK-8253736:
> One particularly annoying wart is the cast on DocletEnvironment to
> DocEnvImpl, which effectively prevents using subtypes to carry additional
> info. It is not clear (even now) what the best way is to replace that logic.
Now I feel it's becoming really hard to maintain Hadoop's custom Doclets, so
I would like to drop the custom implementation. The primary change is that
the Hadoop JavaDoc would then also cover `@LimitedPrivate`, `@Private`, and
`@Unstable` classes, which are currently excluded by our custom Doclets (see
the example below).
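For illustration, a class like this hypothetical one (the class name is made
up, but the annotations are the real ones from hadoop-annotations backing
`@Private` and `@Unstable`) is dropped from the generated JavaDoc today and
would be documented like any public class once we switch to the plain
StandardDoclet:

```java
import org.apache.hadoop.classification.InterfaceAudience;
import org.apache.hadoop.classification.InterfaceStability;

// Hypothetical internal helper: the custom Doclet currently filters it out
// of the JavaDoc because of these annotations; the StandardDoclet would
// include it.
@InterfaceAudience.Private
@InterfaceStability.Unstable
public class SomeInternalHelper {
}
```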
@slfan1989 @cnauroth @zhtttylz What do you think?
> mvn site fails in JDK17
> -----------------------
>
> Key: HADOOP-19785
> URL: https://issues.apache.org/jira/browse/HADOOP-19785
> Project: Hadoop Common
> Issue Type: Sub-task
> Components: build, site
> Reporter: Akira Ajisaka
> Assignee: Akira Ajisaka
> Priority: Major
> Labels: pull-request-available
>
> mvn site fails in JDK17:
> https://github.com/apache/hadoop/actions/runs/20970062036/job/60270782439
> {noformat}
> Error: /home/runner/work/hadoop/hadoop/hadoop-common-project/hadoop-kms/src/main/java/org/apache/hadoop/crypto/key/kms/server/KMSWebApp.java:26: error: cannot find symbol
> Error: import com.codahale.metrics.JmxReporter;
> Error: ^
> Error: symbol: class JmxReporter
> Error: location: package com.codahale.metrics
> Error: /home/runner/work/hadoop/hadoop/hadoop-common-project/hadoop-kms/src/main/java/org/apache/hadoop/crypto/key/kms/server/KMSWebApp.java:69: error: cannot find symbol
> Error: private JmxReporter jmxReporter;
> Error: ^
> Error: symbol: class JmxReporter
> Error: location: class KMSWebApp
> Error: /home/runner/work/hadoop/hadoop/hadoop-tools/hadoop-dynamometer/hadoop-dynamometer-infra/src/main/java/org/apache/hadoop/tools/dynamometer/SimulatedDataNodes.java:36: error: cannot find symbol
> Error: import org.apache.hadoop.hdfs.MiniDFSCluster;
> Error: ^
> Error: symbol: class MiniDFSCluster
> Error: location: package org.apache.hadoop.hdfs
> Error: /home/runner/work/hadoop/hadoop/hadoop-tools/hadoop-dynamometer/hadoop-dynamometer-infra/src/main/java/org/apache/hadoop/tools/dynamometer/SimulatedDataNodes.java:41: error: cannot find symbol
> Error: import org.apache.hadoop.hdfs.server.datanode.DataNodeTestUtils;
> Error: ^
> Error: symbol: class DataNodeTestUtils
> Error: location: package org.apache.hadoop.hdfs.server.datanode
> Error: /home/runner/work/hadoop/hadoop/hadoop-tools/hadoop-dynamometer/hadoop-dynamometer-infra/src/main/java/org/apache/hadoop/tools/dynamometer/SimulatedDataNodes.java:42: error: cannot find symbol
> Error: import org.apache.hadoop.hdfs.server.datanode.SimulatedFSDataset;
> Error: ^
> Error: symbol: class SimulatedFSDataset
> Error: location: package org.apache.hadoop.hdfs.server.datanode
> Error: [ERROR] 5 errors
> {noformat}
> Even after fixing the above errors by adding dependencies and upgrading
> plugins, I'm facing the error below in Hadoop's custom Doclet:
> {noformat}
> [ERROR] java.lang.ClassCastException: class jdk.proxy1.$Proxy0 cannot be cast to class jdk.javadoc.internal.tool.DocEnvImpl (jdk.proxy1.$Proxy0 is in module jdk.proxy1 of loader 'app'; jdk.javadoc.internal.tool.DocEnvImpl is in module jdk.javadoc of loader 'app')
> [ERROR] at jdk.javadoc/jdk.javadoc.internal.doclets.toolkit.WorkArounds.<init>(WorkArounds.java:104)
> [ERROR] at jdk.javadoc/jdk.javadoc.internal.doclets.toolkit.BaseConfiguration.initConfiguration(BaseConfiguration.java:251)
> [ERROR] at jdk.javadoc/jdk.javadoc.internal.doclets.formats.html.HtmlConfiguration.initConfiguration(HtmlConfiguration.java:220)
> [ERROR] at jdk.javadoc/jdk.javadoc.internal.doclets.toolkit.AbstractDoclet.run(AbstractDoclet.java:104)
> [ERROR] at jdk.javadoc/jdk.javadoc.doclet.StandardDoclet.run(StandardDoclet.java:103)
> [ERROR] at org.apache.hadoop.classification.tools.IncludePublicAnnotationsStandardDoclet.run(IncludePublicAnnotationsStandardDoclet.java:152)
> [ERROR] at jdk.javadoc/jdk.javadoc.internal.tool.Start.parseAndExecute(Start.java:556)
> [ERROR] at jdk.javadoc/jdk.javadoc.internal.tool.Start.begin(Start.java:393)
> [ERROR] at jdk.javadoc/jdk.javadoc.internal.tool.Start.begin(Start.java:342)
> [ERROR] at jdk.javadoc/jdk.javadoc.internal.tool.Main.execute(Main.java:63)
> [ERROR] at jdk.javadoc/jdk.javadoc.internal.tool.Main.main(Main.java:52)
> [ERROR] 2 errors
> {noformat}