[ https://issues.apache.org/jira/browse/SPARK-26838?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-26838.
----------------------------------
    Resolution: Invalid

Sounds more like a question, since all the current Jenkins jobs are running 
fine; it looks specific to the application side. Let's discuss it on the 
mailing list - https://spark.apache.org/community.html.
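
One application-side check worth doing before posting to the list (a diagnostic 
sketch, not part of the original resolution): a NoClassDefFoundError reading 
"Could not initialize class" means the class's static initializer already failed 
once, which in practice often points to mismatched or duplicate Spark jars on the 
web application's classpath. Printing where the class is actually loaded from can 
confirm or rule that out:

{code:java}
public class WhichJar {
    public static void main(String[] args) throws ClassNotFoundException {
        // Locate the jar that actually provides ParquetOptions; a location
        // outside the intended Spark 2.4.0 distribution suggests a
        // conflicting dependency on the application's classpath.
        Class<?> c = Class.forName(
                "org.apache.spark.sql.execution.datasources.parquet.ParquetOptions");
        System.out.println(c.getProtectionDomain().getCodeSource().getLocation());
    }
}
{code}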

> DataFrame write to Parquet: java.lang.NoClassDefFoundError: Could not initialize
> class ParquetOptions
> ---------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-26838
>                 URL: https://issues.apache.org/jira/browse/SPARK-26838
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.4.0
>            Reporter: Sanket
>            Priority: Major
>
> Hi,
> I am using Spark to write a Java Dataset<Row> with all string columns to a 
> Parquet file, using the following command:
>  
> {{data.write().option("compression", "snappy").format("parquet").mode(SaveMode.Overwrite).save(filePath);}}
>  
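> A minimal, self-contained version of this write path, for reference (a sketch: 
> the SparkSession setup, the CSV input, and the filePath value here are 
> illustrative placeholders, not taken from the report):
> {code:java}
> import org.apache.spark.sql.Dataset;
> import org.apache.spark.sql.Row;
> import org.apache.spark.sql.SaveMode;
> import org.apache.spark.sql.SparkSession;
> 
> public class ParquetWriteExample {
>     public static void main(String[] args) {
>         // Local session for illustration; the reported failure occurs inside
>         // a Tomcat/Jersey web application, per the stack trace below.
>         SparkSession spark = SparkSession.builder()
>                 .appName("parquet-write-example")
>                 .master("local[*]")
>                 .getOrCreate();
> 
>         // The report imports the data from CSV; paths here are placeholders.
>         Dataset<Row> data = spark.read().option("header", "true").csv("/tmp/input.csv");
>         String filePath = "/tmp/output.parquet";
> 
>         // The write call from the report, with snappy compression.
>         data.write()
>                 .option("compression", "snappy")
>                 .format("parquet")
>                 .mode(SaveMode.Overwrite)
>                 .save(filePath);
> 
>         spark.stop();
>     }
> }
> {code}
>  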
> I am facing the following error:
>  
> {panel:title=Error}
>  
> java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.sql.execution.datasources.parquet.ParquetOptions$
>     at org.apache.spark.sql.execution.datasources.parquet.ParquetOptions.<init>(ParquetOptions.scala:55)
>     at org.apache.spark.sql.execution.datasources.parquet.ParquetOptions.<init>(ParquetOptions.scala:39)
>     at org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat.prepareWrite(ParquetFileFormat.scala:80)
>     at org.apache.spark.sql.execution.datasources.FileFormatWriter$.write(FileFormatWriter.scala:103)
>     at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand.run(InsertIntoHadoopFsRelationCommand.scala:159)
>     at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:104)
>     at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:102)
>     at org.apache.spark.sql.execution.command.DataWritingCommandExec.doExecute(commands.scala:122)
>     at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
>     at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
>     at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
>     at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>     at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
>     at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
>     at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
>     at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
>     at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:668)
>     at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:668)
>     at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
>     at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
>     at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
>     at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:668)
>     at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:276)
>     at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:270)
>     at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:228)
>     at package.common.dao.service.spark.utils.SparkDataBaseUtilImpl.writeParquetFile(SparkDataBaseUtilImpl.java:121)
>     at package.common.dao.service.spark.SparkImportServiceImpl.csvFileImport(SparkImportServiceImpl.java:29)
>     at package.common.dao.service.spark.SparkDaoServiceImpl.dataImport(SparkDaoServiceImpl.java:70)
>     at package.common.dao.service.spark.SparkDaoServiceImpl.executeOperations(SparkDaoServiceImpl.java:37)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
>     at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:185)
>     at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
>     at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
>     at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
>     at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
>     at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
>     at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
>     at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
>     at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
>     at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
>     at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
>     at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
>     at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:540)
>     at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:715)
>     at javax.servlet.http.HttpServlet.service(HttpServlet.java:742)
>     at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:231)
>     at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
>     at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
>     at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
>     at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
>     at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:198)
>     at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:96)
>     at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:493)
>     at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:140)
>     at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:81)
>     at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:650)
>     at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:87)
>     at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:342)
>     at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:800)
>     at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:66)
>     at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:806)
>     at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1498)
>     at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>     at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
>     at java.lang.Thread.run(Thread.java:748)
> {panel}
> I am using Spark 2.4.0, and all the jars are present in the distribution.
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
