[ https://issues.apache.org/jira/browse/HDDS-1333?focusedWorklogId=221405&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-221405 ]

ASF GitHub Bot logged work on HDDS-1333:
----------------------------------------

                Author: ASF GitHub Bot
            Created on: 01/Apr/19 18:33
            Start Date: 01/Apr/19 18:33
    Worklog Time Spent: 10m 
      Work Description: hadoop-yetus commented on issue #653: HDDS-1333. OzoneFileSystem can't work with spark/hadoop2.7 because incompatible security classes
URL: https://github.com/apache/hadoop/pull/653#issuecomment-478693613
 
 
   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |:----:|----------:|--------:|:--------|
   | 0 | reexec | 29 | Docker mode activated. |
   ||| _ Prechecks _ |
   | 0 | yamllint | 0 | yamllint was not available. |
   | +1 | @author | 1 | The patch does not contain any @author tags. |
   | +1 | test4tests | 0 | The patch appears to include 1 new or modified test files. |
   ||| _ ozone-0.4 Compile Tests _ |
   | 0 | mvndep | 36 | Maven dependency ordering for branch |
   | +1 | mvninstall | 1283 | ozone-0.4 passed |
   | +1 | compile | 1185 | ozone-0.4 passed |
   | +1 | checkstyle | 231 | ozone-0.4 passed |
   | -1 | mvnsite | 66 | common in ozone-0.4 failed. |
   | -1 | mvnsite | 46 | ozonefs in ozone-0.4 failed. |
   | +1 | shadedclient | 796 | branch has no errors when building and testing our client artifacts. |
   | 0 | findbugs | 0 | Skipped patched modules with no Java source: hadoop-hdds/docs hadoop-ozone/dist |
   | -1 | findbugs | 38 | common in ozone-0.4 failed. |
   | -1 | findbugs | 35 | ozonefs in ozone-0.4 failed. |
   | +1 | javadoc | 145 | ozone-0.4 passed |
   ||| _ Patch Compile Tests _ |
   | 0 | mvndep | 25 | Maven dependency ordering for patch |
   | -1 | mvninstall | 28 | common in the patch failed. |
   | -1 | mvninstall | 24 | dist in the patch failed. |
   | -1 | mvninstall | 26 | ozonefs in the patch failed. |
   | +1 | compile | 956 | the patch passed |
   | +1 | javac | 956 | the patch passed |
   | +1 | checkstyle | 212 | the patch passed |
   | -1 | mvnsite | 45 | common in the patch failed. |
   | -1 | mvnsite | 39 | ozonefs in the patch failed. |
   | +1 | shellcheck | 0 | There were no new shellcheck issues. |
   | +1 | shelldocs | 31 | There were no new shelldocs issues. |
   | +1 | whitespace | 0 | The patch has no whitespace issues. |
   | +1 | xml | 1 | The patch has no ill-formed XML file. |
   | +1 | shadedclient | 749 | patch has no errors when building and testing our client artifacts. |
   | 0 | findbugs | 0 | Skipped patched modules with no Java source: hadoop-hdds/docs hadoop-ozone/dist |
   | -1 | findbugs | 38 | common in the patch failed. |
   | -1 | findbugs | 37 | ozonefs in the patch failed. |
   | +1 | javadoc | 144 | the patch passed |
   ||| _ Other Tests _ |
   | +1 | unit | 33 | docs in the patch passed. |
   | -1 | unit | 43 | common in the patch failed. |
   | +1 | unit | 36 | dist in the patch passed. |
   | -1 | unit | 37 | ozonefs in the patch failed. |
   | +1 | asflicense | 48 | The patch does not generate ASF License warnings. |
   | | | 6955 | |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | Client=17.05.0-ce Server=17.05.0-ce base: https://builds.apache.org/job/hadoop-multibranch/job/PR-653/3/artifact/out/Dockerfile |
   | GITHUB PR | https://github.com/apache/hadoop/pull/653 |
   | Optional Tests | dupname asflicense mvnsite compile javac javadoc mvninstall unit shadedclient findbugs checkstyle shellcheck shelldocs xml yamllint |
   | uname | Linux c40fcfbeb157 4.4.0-139-generic #165~14.04.1-Ubuntu SMP Wed Oct 31 10:55:11 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | ozone-0.4 / fb7844d |
   | maven | version: Apache Maven 3.3.9 |
   | Default Java | 1.8.0_191 |
   | mvnsite | https://builds.apache.org/job/hadoop-multibranch/job/PR-653/3/artifact/out/branch-mvnsite-hadoop-ozone_common.txt |
   | mvnsite | https://builds.apache.org/job/hadoop-multibranch/job/PR-653/3/artifact/out/branch-mvnsite-hadoop-ozone_ozonefs.txt |
   | shellcheck | v0.4.6 |
   | findbugs | https://builds.apache.org/job/hadoop-multibranch/job/PR-653/3/artifact/out/branch-findbugs-hadoop-ozone_common.txt |
   | findbugs | https://builds.apache.org/job/hadoop-multibranch/job/PR-653/3/artifact/out/branch-findbugs-hadoop-ozone_ozonefs.txt |
   | mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-653/3/artifact/out/patch-mvninstall-hadoop-ozone_common.txt |
   | mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-653/3/artifact/out/patch-mvninstall-hadoop-ozone_dist.txt |
   | mvninstall | https://builds.apache.org/job/hadoop-multibranch/job/PR-653/3/artifact/out/patch-mvninstall-hadoop-ozone_ozonefs.txt |
   | mvnsite | https://builds.apache.org/job/hadoop-multibranch/job/PR-653/3/artifact/out/patch-mvnsite-hadoop-ozone_common.txt |
   | mvnsite | https://builds.apache.org/job/hadoop-multibranch/job/PR-653/3/artifact/out/patch-mvnsite-hadoop-ozone_ozonefs.txt |
   | findbugs | https://builds.apache.org/job/hadoop-multibranch/job/PR-653/3/artifact/out/patch-findbugs-hadoop-ozone_common.txt |
   | findbugs | https://builds.apache.org/job/hadoop-multibranch/job/PR-653/3/artifact/out/patch-findbugs-hadoop-ozone_ozonefs.txt |
   | unit | https://builds.apache.org/job/hadoop-multibranch/job/PR-653/3/artifact/out/patch-unit-hadoop-ozone_common.txt |
   | unit | https://builds.apache.org/job/hadoop-multibranch/job/PR-653/3/artifact/out/patch-unit-hadoop-ozone_ozonefs.txt |
   | Test Results | https://builds.apache.org/job/hadoop-multibranch/job/PR-653/3/testReport/ |
   | Max. process+thread count | 340 (vs. ulimit of 5500) |
   | modules | C: hadoop-hdds/docs hadoop-ozone/common hadoop-ozone/dist hadoop-ozone/ozonefs U: . |
   | Console output | https://builds.apache.org/job/hadoop-multibranch/job/PR-653/3/console |
   | Powered by | Apache Yetus 0.9.0 http://yetus.apache.org |
   
   
   This message was automatically generated.
   
   
 
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
-------------------

    Worklog Id:     (was: 221405)
    Time Spent: 4h 10m  (was: 4h)

> OzoneFileSystem can't work with spark/hadoop2.7 because incompatible security 
> classes
> -------------------------------------------------------------------------------------
>
>                 Key: HDDS-1333
>                 URL: https://issues.apache.org/jira/browse/HDDS-1333
>             Project: Hadoop Distributed Data Store
>          Issue Type: Bug
>            Reporter: Elek, Marton
>            Assignee: Elek, Marton
>            Priority: Major
>              Labels: pull-request-available
>          Time Spent: 4h 10m
>  Remaining Estimate: 0h
>
> The current ozonefs compatibility layer is broken by HDDS-1299.
> Spark jobs (running against a hadoop 2.7 classpath) can no longer be executed:
> {code}
> 2019-03-25 09:50:08 INFO  StateStoreCoordinatorRef:54 - Registered StateStoreCoordinator endpoint
> Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/crypto/key/KeyProviderTokenIssuer
>         at java.lang.ClassLoader.defineClass1(Native Method)
>         at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
>         at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
>         at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>         at java.lang.Class.forName0(Native Method)
>         at java.lang.Class.forName(Class.java:348)
>         at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:2134)
>         at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2099)
>         at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2193)
>         at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2654)
>         at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2667)
>         at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:94)
>         at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2703)
>         at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2685)
>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:373)
>         at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
>         at org.apache.spark.sql.execution.streaming.FileStreamSink$.hasMetadata(FileStreamSink.scala:45)
>         at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:332)
>         at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:223)
>         at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:211)
>         at org.apache.spark.sql.DataFrameReader.text(DataFrameReader.scala:715)
>         at org.apache.spark.sql.DataFrameReader.textFile(DataFrameReader.scala:757)
>         at org.apache.spark.sql.DataFrameReader.textFile(DataFrameReader.scala:724)
>         at org.apache.spark.examples.JavaWordCount.main(JavaWordCount.java:45)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
>         at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
>         at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
>         at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
>         at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
>         at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.crypto.key.KeyProviderTokenIssuer
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>         ... 43 more
> {code}
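The stack trace shows the failure mode: the JVM tries to resolve org.apache.hadoop.crypto.key.KeyProviderTokenIssuer (an interface that exists only in Hadoop 3.x) while defining a class that references it directly, so on a Hadoop 2.7 classpath the whole class fails to load with NoClassDefFoundError. A minimal sketch of one common workaround, probing for the 3.x-only class reflectively instead of linking against it (this is an illustration, not the actual HDDS-1333 patch; CompatProbe and classPresent are invented names):

```java
// Hedged sketch: detect whether a Hadoop 3.x-only class is available without
// referencing it statically. Class.forName with a string name fails with a
// catchable ClassNotFoundException, whereas a direct reference would abort
// class definition with NoClassDefFoundError, as in the trace above.
public class CompatProbe {

    // Returns true when the named class is loadable from the current
    // classpath. The "initialize = false" flag avoids running static
    // initializers during the probe.
    static boolean classPresent(String className) {
        try {
            Class.forName(className, false, CompatProbe.class.getClassLoader());
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // On a Hadoop 2.7 classpath this prints "absent"; on 3.x, "present".
        String name = "org.apache.hadoop.crypto.key.KeyProviderTokenIssuer";
        System.out.println(classPresent(name) ? "present" : "absent");
    }
}
```

A compatibility layer built this way can keep all 3.x-dependent code in a separate class that is only loaded when the probe succeeds, so the base filesystem class stays loadable on both Hadoop lines.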



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: hdfs-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: hdfs-issues-h...@hadoop.apache.org
