[ https://issues.apache.org/jira/browse/SPARK-19372?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16126182#comment-16126182 ]

poplav commented on SPARK-19372:
--------------------------------

Hi [~kiszk], I am hitting this failure as well.
Is it possible to backport the fix to 2.1.1?
Appreciate it!

> Code generation for Filter predicate including many OR conditions exceeds JVM 
> method size limit 
> ------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-19372
>                 URL: https://issues.apache.org/jira/browse/SPARK-19372
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 2.1.0
>            Reporter: Jay Pranavamurthi
>            Assignee: Kazuaki Ishizaki
>             Fix For: 2.2.0, 2.3.0
>
>         Attachments: wide400cols.csv
>
>
> For the attached csv file, the code below causes the exception
> "org.codehaus.janino.JaninoRuntimeException: Code of method
> "(Lorg/apache/spark/sql/catalyst/InternalRow;)Z" of class
> "org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificPredicate"
> grows beyond 64 KB".
> Code:
> {code:borderStyle=solid}
>   val conf = new SparkConf().setMaster("local[1]")
>   val sqlContext = SparkSession.builder().config(conf).getOrCreate().sqlContext
>   val dataframe =
>     sqlContext
>       .read
>       .format("com.databricks.spark.csv")
>       .load("wide400cols.csv")
>   val filter = (0 to 399)
>     .foldLeft(lit(false))((e, index) => e.or(dataframe.col(dataframe.columns(index)) =!= s"column${index+1}"))
>   val filtered = dataframe.filter(filter)
>   filtered.show(100)
> {code}
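
As a stopgap until a release with the codegen fix is available, one option is to keep the predicate out of expression code generation entirely by filtering with a Scala closure instead of a Column expression. The sketch below is not from this ticket and has not been verified against 2.1.x; it assumes the same wide400cols.csv attachment and mirrors the OR chain from the reproduction above. Note that the closure treats null column values differently from the SQL =!= operator, which yields null rather than true for null inputs.

{code:borderStyle=solid}
import org.apache.spark.sql.{Row, SparkSession}

val spark = SparkSession.builder().master("local[1]").getOrCreate()

val dataframe = spark.read
  .format("com.databricks.spark.csv")
  .load("wide400cols.csv")

// Capture the column names as a plain Array[String]; capturing the
// DataFrame itself inside the closure would not be serializable.
val columns = dataframe.columns

// Keep a row if any of the 400 columns differs from "column<i+1>",
// mirroring the foldLeft OR chain in the original snippet. Because the
// condition runs as a closure, no single giant predicate method is generated.
val filtered = dataframe.filter { row: Row =>
  (0 to 399).exists { index =>
    row.getAs[String](columns(index)) != s"column${index + 1}"
  }
}

filtered.show(100)
{code}

The trade-off is that a closure-based filter cannot be pushed down or optimized by Catalyst the way a Column predicate can, so this is only a workaround for the 64 KB method limit, not a replacement for the fix that went into 2.2.0.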


