[jira] [Comment Edited] (SPARK-22510) Exceptions caused by 64KB JVM bytecode or 64K constant pool entry limit

2020-02-03 Thread Frederik Schreiber (Jira)


[ https://issues.apache.org/jira/browse/SPARK-22510?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17028902#comment-17028902 ]

Frederik Schreiber edited comment on SPARK-22510 at 2/3/20 12:35 PM:
-

Extracted my code into an example and pasted it into SPARK-30711.


was (Author: schreiber):
Extracted my code into an example and pasted it into: 
https://issues.apache.org/jira/browse/SPARK-30711

> Exceptions caused by 64KB JVM bytecode or 64K constant pool entry limit 
> 
>
> Key: SPARK-22510
> URL: https://issues.apache.org/jira/browse/SPARK-22510
> Project: Spark
>  Issue Type: Umbrella
>  Components: SQL
>Affects Versions: 2.2.0
>Reporter: Xiao Li
>Assignee: Kazuaki Ishizaki
>Priority: Major
>  Labels: bulk-closed, releasenotes
>
> Codegen can throw an exception due to the 64KB JVM bytecode or 64K constant 
> pool entry limit.
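
For illustration, a hedged sketch (local session, hypothetical expression) of the
query shape behind these reports: a single deeply nested expression can make
codegen emit one oversized Java method, and the JVM rejects methods whose bytecode
exceeds 64KB. Whether this actually fails depends on the Spark version and plan.

{code:scala}
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

val spark = SparkSession.builder()
  .master("local[*]")
  .appName("codegen-64kb-sketch")
  .getOrCreate()

// Build one deeply nested arithmetic expression: before the expression-splitting
// fixes tracked under this umbrella, such a plan could compile into a single
// generated method exceeding the JVM's 64KB bytecode limit.
val bigExpr = (1 to 1000).foldLeft(lit(0L)) { (e, i) => e + (col("id") * i) }

val df = spark.range(10).select(bigExpr.as("big"))
df.explain()  // on affected versions, compiling this plan could throw the 64KB error
df.show()
{code}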



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Comment Edited] (SPARK-22510) Exceptions caused by 64KB JVM bytecode or 64K constant pool entry limit

2020-02-03 Thread Frederik Schreiber (Jira)


[ https://issues.apache.org/jira/browse/SPARK-22510?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17028763#comment-17028763 ]

Frederik Schreiber edited comment on SPARK-22510 at 2/3/20 8:29 AM:


Hi [~smilegator], [~kiszk]

we are using Spark 2.4.0 and are currently running into the 64KB exception. Our 
DataFrame has only about 42 columns, so we are puzzled, since all of these bugs 
are closed. Are there still known bugs that lead to this exception? Can this 
exception appear on complex queries/DataFrames by design?

spark.sql.codegen.maxFields is set to 100.
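
Note that spark.sql.codegen.maxFields only bounds how many fields a plan may have
before whole-stage codegen is deactivated, and 100 is already the default in
Spark 2.x, so setting it to 100 is a no-op. A minimal sketch of the knob (the
lowered value is illustrative, not a recommendation):

{code:scala}
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").getOrCreate()

// Default in Spark 2.x: plans wider than 100 fields skip whole-stage codegen.
spark.conf.set("spark.sql.codegen.maxFields", "100")

// Lowering it pushes wide plans off the whole-stage path earlier, which can
// avoid oversized generated methods, at some performance cost.
spark.conf.set("spark.sql.codegen.maxFields", "20")
{code}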

 

Are there any suggestions for avoiding this error?

 

We also tried Spark 2.4.4, with the same results.
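
For reference, a sketch of the mitigations most often suggested on the issues
under this umbrella. The config and API names are real Spark 2.x SQL confs and
Dataset methods; wideDf and the checkpoint directory are stand-ins for the real
DataFrame and path.

{code:scala}
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").getOrCreate()

// 1. Let codegen fall back to interpreted execution when a generated method
//    fails to compile (true is already the default; verify it is not disabled).
spark.conf.set("spark.sql.codegen.fallback", "true")

// 2. Disable whole-stage codegen entirely for the offending job.
spark.conf.set("spark.sql.codegen.wholeStage", "false")

// 3. Break a long expression lineage by checkpointing, so each stage compiles
//    a smaller plan. (wideDf stands in for the real DataFrame.)
spark.sparkContext.setCheckpointDir("/tmp/spark-checkpoints")
val wideDf = spark.range(10).toDF("id")
val trimmed = wideDf.checkpoint()
{code}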


