saumyasuhagiya opened a new issue #1961:
URL: https://github.com/apache/hudi/issues/1961


   **_Tips before filing an issue_**
   
   - Have you gone through our [FAQs](https://cwiki.apache.org/confluence/display/HUDI/FAQ)?
   
   - Join the mailing list to engage in conversations and get faster support at dev-subscr...@hudi.apache.org.
   
   - If you have triaged this as a bug, then file an [issue](https://issues.apache.org/jira/projects/HUDI/issues) directly.
   
   **Describe the problem you faced**
   
   Getting `java.lang.NoSuchMethodError: org.eclipse.jetty.server.session.SessionHandler.setHttpOnly(Z)V` on Azure Databricks.
   
   **To Reproduce**
   
   Steps to reproduce the behavior:
   
   ```xml
   <dependency>
       <groupId>com.microsoft.pnp</groupId>
       <artifactId>spark-listeners_2.11_2.4.3</artifactId>
       <version>1.0.0</version>
       <exclusions>
           <exclusion>
               <groupId>org.eclipse.jetty.aggregate</groupId>
               <artifactId>jetty-all</artifactId>
           </exclusion>
       </exclusions>
   </dependency>

   <!-- https://mvnrepository.com/artifact/org.apache.hudi/hudi-spark -->
   <dependency>
       <groupId>org.apache.hudi</groupId>
       <artifactId>hudi-spark_2.11</artifactId>
       <version>0.5.3</version>
       <exclusions>
           <exclusion>
               <groupId>org.eclipse.jetty.aggregate</groupId>
               <artifactId>jetty-all</artifactId>
           </exclusion>
           <exclusion>
               <groupId>org.apache.hive</groupId>
               <artifactId>hive-shims</artifactId>
           </exclusion>
       </exclusions>
   </dependency>

   <dependency>
       <groupId>org.apache.hudi</groupId>
       <artifactId>hudi-hadoop-mr</artifactId>
       <version>0.5.3</version>
   </dependency>

   <!-- https://mvnrepository.com/artifact/org.apache.hudi/hudi-spark-bundle -->
   <dependency>
       <groupId>org.apache.hudi</groupId>
       <artifactId>hudi-spark-bundle_2.11</artifactId>
       <version>0.5.3</version>
       <exclusions>
           <exclusion>
               <groupId>org.eclipse.jetty.aggregate</groupId>
               <artifactId>jetty-all</artifactId>
           </exclusion>
       </exclusions>
   </dependency>

   <!-- https://mvnrepository.com/artifact/org.eclipse.jetty/jetty-server -->
   <dependency>
       <groupId>org.eclipse.jetty</groupId>
       <artifactId>jetty-server</artifactId>
       <version>9.4.31.v20200723</version>
   </dependency>
   ```
   
   I have the above dependencies in the Spark job, and I am also adding the hudi-spark-bundle_2.11-0.5.3 jar via the `--jars` option.
   
   Compiling and running the job on the Databricks cluster produces the stacktrace below.
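   
   A minimal diagnostic sketch (assuming the Databricks runtime ships an older Jetty that shadows the 9.4.x jar declared above): printing the code source of `SessionHandler` from the same job shows which Jetty jar actually wins, and the reflective `getMethod` lookup fails on a pre-9.4 Jetty the same way the write does.
   
   ```java
   // Diagnostic sketch: report which jar provides SessionHandler at runtime.
   // NoSuchMethodError for setHttpOnly(Z)V usually means a pre-9.4 Jetty
   // (where setHttpOnly lives on SessionManager, not SessionHandler) is
   // loaded ahead of the jetty-server 9.4.x declared in the POM.
   import org.eclipse.jetty.server.session.SessionHandler;
   
   public class JettyClasspathCheck {
       public static void main(String[] args) throws NoSuchMethodException {
           // Which jar did SessionHandler actually come from?
           System.out.println(SessionHandler.class
                   .getProtectionDomain().getCodeSource().getLocation());
           // Throws NoSuchMethodException on pre-9.4 Jetty, mirroring the failure.
           System.out.println(SessionHandler.class
                   .getMethod("setHttpOnly", boolean.class));
       }
   }
   ```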
   
   
   **Expected behavior**
   
   It should run successfully.
   
   **Environment Description**
   
   * Hudi version : 0.5.3
   
   * Spark version : 2.4.5
   
   * Hive version :
   
   * Hadoop version : as shown in the dependencies above
   
   * Storage (HDFS/S3/GCS..) : ADLS Gen2
   
   * Running on Docker? (yes/no) : no
   
   
   
   **Stacktrace**
   
   ```
   OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0
   20/08/13 08:10:18 ERROR Uncaught throwable from user code: java.lang.NoSuchMethodError: org.eclipse.jetty.server.session.SessionHandler.setHttpOnly(Z)V
        at io.javalin.core.util.JettyServerUtil.defaultSessionHandler(JettyServerUtil.kt:50)
        at io.javalin.Javalin.<init>(Javalin.java:94)
        at io.javalin.Javalin.create(Javalin.java:107)
        at org.apache.hudi.timeline.service.TimelineService.startService(TimelineService.java:102)
        at org.apache.hudi.client.embedded.EmbeddedTimelineService.startServer(EmbeddedTimelineService.java:74)
        at org.apache.hudi.client.AbstractHoodieClient.startEmbeddedServerView(AbstractHoodieClient.java:102)
        at org.apache.hudi.client.AbstractHoodieClient.<init>(AbstractHoodieClient.java:69)
        at org.apache.hudi.client.AbstractHoodieWriteClient.<init>(AbstractHoodieWriteClient.java:83)
        at org.apache.hudi.client.HoodieWriteClient.<init>(HoodieWriteClient.java:137)
        at org.apache.hudi.client.HoodieWriteClient.<init>(HoodieWriteClient.java:124)
        at org.apache.hudi.client.HoodieWriteClient.<init>(HoodieWriteClient.java:120)
        at org.apache.hudi.DataSourceUtils.createHoodieClient(DataSourceUtils.java:195)
        at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:135)
        at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:108)
        at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
        at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:147)
        at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:135)
        at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$5.apply(SparkPlan.scala:188)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:184)
        at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:135)
        at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:118)
        at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:116)
        at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:711)
        at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:711)
        at org.apache.spark.sql.execution.SQLExecution$$anonfun$withCustomExecutionEnv$1.apply(SQLExecution.scala:113)
        at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:242)
        at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:99)
        at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:172)
        at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:711)
        at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:307)
        at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:293)
        at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:235)
        at com.walmart.ods.datapipeline.write.writers.HudiWriter.write(HudiWriter.java:44)
        at com.walmart.ods.datapipeline.driver.jobs.job.FileTypeConversion.fileTypeConvert(FileTypeConversion.java:66)
        at com.walmart.ods.datapipeline.driver.jobs.job.FileTypeConversion.run(FileTypeConversion.java:35)
        at com.walmart.ods.datapipeline.driver.App.run(App.java:39)
        at com.walmart.ods.datapipeline.driver.App.main(App.java:23)
        at line1d62afb8a27a422cb3f1cf4cd7fad43825.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command--1:1)
        at line1d62afb8a27a422cb3f1cf4cd7fad43825.$read$$iw$$iw$$iw$$iw$$iw.<init>(command--1:44)
        at line1d62afb8a27a422cb3f1cf4cd7fad43825.$read$$iw$$iw$$iw$$iw.<init>(command--1:46)
        at line1d62afb8a27a422cb3f1cf4cd7fad43825.$read$$iw$$iw$$iw.<init>(command--1:48)
        at line1d62afb8a27a422cb3f1cf4cd7fad43825.$read$$iw$$iw.<init>(command--1:50)
        at line1d62afb8a27a422cb3f1cf4cd7fad43825.$read$$iw.<init>(command--1:52)
        at line1d62afb8a27a422cb3f1cf4cd7fad43825.$read.<init>(command--1:54)
        at line1d62afb8a27a422cb3f1cf4cd7fad43825.$read$.<init>(command--1:58)
        at line1d62afb8a27a422cb3f1cf4cd7fad43825.$read$.<clinit>(command--1)
        at line1d62afb8a27a422cb3f1cf4cd7fad43825.$eval$.$print$lzycompute(<notebook>:7)
        at line1d62afb8a27a422cb3f1cf4cd7fad43825.$eval$.$print(<notebook>:6)
        at line1d62afb8a27a422cb3f1cf4cd7fad43825.$eval.$print(<notebook>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:793)
        at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1054)
        at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:645)
        at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:644)
        at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
        at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
        at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:644)
        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:576)
        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:572)
        at com.databricks.backend.daemon.driver.DriverILoop.execute(DriverILoop.scala:215)
        at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply$mcV$sp(ScalaDriverLocal.scala:202)
        at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply(ScalaDriverLocal.scala:202)
        at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply(ScalaDriverLocal.scala:202)
        at com.databricks.backend.daemon.driver.DriverLocal$TrapExitInternal$.trapExit(DriverLocal.scala:714)
        at com.databricks.backend.daemon.driver.DriverLocal$TrapExit$.apply(DriverLocal.scala:667)
        at com.databricks.backend.daemon.driver.ScalaDriverLocal.repl(ScalaDriverLocal.scala:202)
        at com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$9.apply(DriverLocal.scala:396)
        at com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$9.apply(DriverLocal.scala:373)
        at com.databricks.logging.UsageLogging$$anonfun$withAttributionContext$1.apply(UsageLogging.scala:238)
        at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
        at com.databricks.logging.UsageLogging$class.withAttributionContext(UsageLogging.scala:233)
        at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:49)
        at com.databricks.logging.UsageLogging$class.withAttributionTags(UsageLogging.scala:275)
        at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:49)
        at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:373)
        at com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:644)
        at com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:644)
        at scala.util.Try$.apply(Try.scala:192)
        at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:639)
        at com.databricks.backend.daemon.driver.DriverWrapper.getCommandOutputAndError(DriverWrapper.scala:485)
        at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:597)
        at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:390)
        at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:337)
        at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:219)
        at java.lang.Thread.run(Thread.java:748)
   ```
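   
   Per the trace, the error surfaces while Hudi starts its embedded timeline server: `TimelineService.startService` creates a Javalin app, and Javalin builds a Jetty `SessionHandler` that needs the 9.4.x `setHttpOnly(boolean)` method. A hedged workaround sketch, assuming the cluster's Jetty cannot be swapped out: disable the embedded timeline server for the write so Javalin/Jetty is never started. The table name, record key, precombine field, and path below are illustrative placeholders, not values from this report.
   
   ```java
   import org.apache.spark.sql.Dataset;
   import org.apache.spark.sql.Row;
   import org.apache.spark.sql.SaveMode;
   
   public class HudiWriteWithoutTimelineServer {
       // Sketch of a Hudi write with the embedded timeline server disabled,
       // so TimelineService/Javalin/Jetty never starts and the
       // SessionHandler.setHttpOnly call is never reached.
       public static void write(Dataset<Row> df) {
           df.write()
             .format("org.apache.hudi")
             .option("hoodie.table.name", "my_table")                   // illustrative
             .option("hoodie.datasource.write.recordkey.field", "id")   // illustrative
             .option("hoodie.datasource.write.precombine.field", "ts")  // illustrative
             .option("hoodie.embed.timeline.server", "false")           // skip Javalin/Jetty
             .mode(SaveMode.Append)
             .save("abfss://container@account.dfs.core.windows.net/hudi/my_table"); // illustrative
       }
   }
   ```
   
   Disabling the server only bypasses the component that trips on Jetty; the writer falls back to a local file-system view and the write itself proceeds.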
   
   

