[
https://issues.apache.org/jira/browse/HADOOP-19785?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=18052770#comment-18052770
]
ASF GitHub Bot commented on HADOOP-19785:
-----------------------------------------
slfan1989 commented on PR #8182:
URL: https://github.com/apache/hadoop/pull/8182#issuecomment-3766811692
> Hi @zhtttylz, you implemented a new custom Doclet in #8038 to support
JDK17, but unfortunately `mvn site` fails in the Doclet part. The details
are below:
>
> 1. RootDocProcessor.process(env) creates a Proxy object that implements
DocletEnvironment interface
> 2. This proxy is passed to StandardDoclet.run(filtered)
> 3. Inside the StandardDoclet, the javadoc internals (specifically
WorkArounds.<init>()) try to cast this proxy to the concrete implementation
class jdk.javadoc.internal.tool.DocEnvImpl:
https://github.com/jonathan-gibbons/jdk/blob/bc1e60c1bf91678ef18652a00aa2ce55b0446caa/src/jdk.javadoc/share/classes/jdk/javadoc/internal/doclets/toolkit/WorkArounds.java#L112
> 4. The cast fails because a proxy object cannot be cast to a concrete
implementation class
>
> It seems our custom doclet implementation is prohibited after
https://bugs.openjdk.org/browse/JDK-8253736:
>
> > One particularly annoying wart is the cast on DocletEnvironment to
DocEnvImpl, which effectively prevents using subtypes to carry additional info.
It is not clear (even now) what the best way is to replace that logic.
>
> Now I feel it's becoming really hard to maintain Hadoop's custom Doclets,
and therefore I would like to drop the custom implementation. The primary
change is that we will build Hadoop's JavaDoc including the `@LimitedPrivate`,
`@Private`, and `@Unstable` classes that are currently excluded by our custom
Doclets.
>
> @slfan1989 @cnauroth @zhtttylz What do you think?
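
The cast failure in the quoted steps can be reproduced outside javadoc with a
minimal sketch. The types below are stand-ins (DocletEnvironment is an
interface, DocEnvImpl is the concrete class inside the jdk.javadoc module);
the point is that a java.lang.reflect.Proxy only implements interfaces, so it
can never pass a cast to a concrete implementation class:

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

public class ProxyCastDemo {
    // Stand-ins: Env plays DocletEnvironment, EnvImpl plays DocEnvImpl.
    interface Env { }
    static class EnvImpl implements Env { }

    public static void main(String[] args) {
        Env real = new EnvImpl();
        // RootDocProcessor.process(env) builds a filtering proxy like this one.
        InvocationHandler h = (proxy, method, a) -> method.invoke(real, a);
        Env filtered = (Env) Proxy.newProxyInstance(
                Env.class.getClassLoader(), new Class<?>[] { Env.class }, h);

        System.out.println(filtered instanceof Env);      // true: proxy implements the interface
        System.out.println(filtered instanceof EnvImpl);  // false: proxy does NOT extend the concrete class
        try {
            EnvImpl impl = (EnvImpl) filtered;            // same cast WorkArounds performs
        } catch (ClassCastException e) {
            System.out.println("ClassCastException, as in the build log");
        }
    }
}
```

This is why the failure is structural rather than a bug in the custom Doclet:
as long as javadoc's internals cast DocletEnvironment to DocEnvImpl, no
proxy-based wrapper can survive the cast.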

@aajisaka Thanks for the detailed analysis. After reading through it, I
fully agree that the cost of maintaining this custom Doclet has become
unreasonably high. With OpenJDK continuing to clean up its internal APIs
(the trend starting from JDK-8253736 is only getting stronger), future
compatibility will only get worse, and the next LTS might break it completely.
I'm in favor of dropping the custom Doclet and switching to the standard
doclet to generate complete JavaDocs (including all classes annotated with
@Private / @Unstable / @LimitedPrivate). The main reasons are:
- The maintenance burden is too heavy and takes away energy from more
valuable work;
- The visibility of these annotated classes has very limited impact on most
downstream users — they shouldn't be depending on @Private APIs anyway;
- On the positive side, having the full picture can actually help
developers/contributors who want to dig into the implementation details.
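For illustration, switching to the standard doclet would roughly mean removing
the `<doclet>`/`<docletArtifact>` elements from the maven-javadoc-plugin
configuration so javadoc falls back to its built-in StandardDoclet. This is
only a sketch of the idea, not Hadoop's actual pom layout:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-javadoc-plugin</artifactId>
  <configuration>
    <!-- Previously, elements along these lines forced the custom doclet:
           <doclet>org.apache.hadoop.classification.tools.IncludePublicAnnotationsStandardDoclet</doclet>
           <docletArtifact>(the artifact providing that class)</docletArtifact>
         With them removed, javadoc uses the standard doclet, so classes
         annotated @Private / @LimitedPrivate / @Unstable are documented too. -->
  </configuration>
</plugin>
```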
cc: @zhtttylz @cnauroth
> mvn site fails in JDK17
> -----------------------
>
> Key: HADOOP-19785
> URL: https://issues.apache.org/jira/browse/HADOOP-19785
> Project: Hadoop Common
> Issue Type: Sub-task
> Components: build, site
> Reporter: Akira Ajisaka
> Assignee: Akira Ajisaka
> Priority: Major
> Labels: pull-request-available
>
> mvn site fails in JDK17:
> https://github.com/apache/hadoop/actions/runs/20970062036/job/60270782439
> {noformat}
> Error:
> /home/runner/work/hadoop/hadoop/hadoop-common-project/hadoop-kms/src/main/java/org/apache/hadoop/crypto/key/kms/server/KMSWebApp.java:26:
> error: cannot find symbol
> Error: import com.codahale.metrics.JmxReporter;
> Error: ^
> Error: symbol: class JmxReporter
> Error: location: package com.codahale.metrics
> Error:
> /home/runner/work/hadoop/hadoop/hadoop-common-project/hadoop-kms/src/main/java/org/apache/hadoop/crypto/key/kms/server/KMSWebApp.java:69:
> error: cannot find symbol
> Error: private JmxReporter jmxReporter;
> Error: ^
> Error: symbol: class JmxReporter
> Error: location: class KMSWebApp
> Error:
> /home/runner/work/hadoop/hadoop/hadoop-tools/hadoop-dynamometer/hadoop-dynamometer-infra/src/main/java/org/apache/hadoop/tools/dynamometer/SimulatedDataNodes.java:36:
> error: cannot find symbol
> Error: import org.apache.hadoop.hdfs.MiniDFSCluster;
> Error: ^
> Error: symbol: class MiniDFSCluster
> Error: location: package org.apache.hadoop.hdfs
> Error:
> /home/runner/work/hadoop/hadoop/hadoop-tools/hadoop-dynamometer/hadoop-dynamometer-infra/src/main/java/org/apache/hadoop/tools/dynamometer/SimulatedDataNodes.java:41:
> error: cannot find symbol
> Error: import org.apache.hadoop.hdfs.server.datanode.DataNodeTestUtils;
> Error: ^
> Error: symbol: class DataNodeTestUtils
> Error: location: package org.apache.hadoop.hdfs.server.datanode
> Error:
> /home/runner/work/hadoop/hadoop/hadoop-tools/hadoop-dynamometer/hadoop-dynamometer-infra/src/main/java/org/apache/hadoop/tools/dynamometer/SimulatedDataNodes.java:42:
> error: cannot find symbol
> Error: import org.apache.hadoop.hdfs.server.datanode.SimulatedFSDataset;
> Error: ^
> Error: symbol: class SimulatedFSDataset
> Error: location: package org.apache.hadoop.hdfs.server.datanode
> Error: [ERROR] 5 errors
> {noformat}
> Even after fixing the above errors by adding dependencies and upgrading
> plugins, I'm facing the error below in Hadoop's custom Doclet:
> {noformat}
> [ERROR] java.lang.ClassCastException: class jdk.proxy1.$Proxy0 cannot be cast
> to
> class jdk.javadoc.internal.tool.DocEnvImpl (jdk.proxy1.$Proxy0 is in module
> jdk.proxy1 of loader 'app'; jdk.javadoc.internal.tool.DocEnvImpl is in module
> jdk.javadoc of
> loader 'app')
> [ERROR] at
> jdk.javadoc/jdk.javadoc.internal.doclets.toolkit.WorkArounds.<init>(WorkArounds.java:104)
> [ERROR] at
> jdk.javadoc/jdk.javadoc.internal.doclets.toolkit.BaseConfiguration.initConfiguration(BaseConfiguration.java:251)
> [ERROR] at
> jdk.javadoc/jdk.javadoc.internal.doclets.formats.html.HtmlConfiguration.initConfiguration(HtmlConfiguration.java:220)
> [ERROR] at
> jdk.javadoc/jdk.javadoc.internal.doclets.toolkit.AbstractDoclet.run(AbstractDoclet.java:104)
> [ERROR] at
> jdk.javadoc/jdk.javadoc.doclet.StandardDoclet.run(StandardDoclet.java:103)
> [ERROR] at
> org.apache.hadoop.classification.tools.IncludePublicAnnotationsStandardDoclet.run(IncludePublicAnnotationsStandardDoclet.java:152)
> [ERROR] at
> jdk.javadoc/jdk.javadoc.internal.tool.Start.parseAndExecute(Start.java:556)
> [ERROR] at
> jdk.javadoc/jdk.javadoc.internal.tool.Start.begin(Start.java:393)
> [ERROR] at
> jdk.javadoc/jdk.javadoc.internal.tool.Start.begin(Start.java:342)
> [ERROR] at
> jdk.javadoc/jdk.javadoc.internal.tool.Main.execute(Main.java:63)
> [ERROR] at jdk.javadoc/jdk.javadoc.internal.tool.Main.main(Main.java:52)
> [ERROR] 2 errors
> {noformat}
--
This message was sent by Atlassian Jira
(v8.20.10#820010)