[ https://issues.apache.org/jira/browse/SPARK-19875?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17238411#comment-17238411 ]

Asif edited comment on SPARK-19875 at 11/25/20, 6:02 PM:
---------------------------------------------------------

[~maropu], [~sameerag] [~jay.pranavamurthi] I have opened a PR for 
SPARK-33152, which fixes the OOM and unreasonable compile times in these queries.

The PR is [pr-for-spark-33152|https://github.com/apache/spark/pull/30185]

I have not been able to get anybody to review the code.

The explanation of the logic used is in the PR description.

If needed, we can go through the code together. This is going to be used by 
Workday in production.
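
For anyone who wants to reproduce the original problem locally, below is a rough sketch of the kind of wide select-then-filter plan that makes the constraint-inference rules blow up. This is not the attached TestFilter.scala; the CSV path, the aliasing, and the exact filter shape are only placeholders for illustration:

{code:scala}
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

object WideFilterRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("constraint-inference-repro")
      .master("local[*]")
      // Stopgap available in newer 2.x releases: turning constraint
      // propagation off avoids the blowup, at the cost of losing the
      // inferred filters the rule is meant to add.
      // .config("spark.sql.constraintPropagation.enabled", "false")
      .getOrCreate()

    // A wide CSV such as the attached test50cols.csv (path is a placeholder).
    val df = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("test50cols.csv")

    // Alias every column, then filter on all of them. With ~50 columns the
    // optimizer's constraint inference over the aliases and filter conjuncts
    // grows combinatorially, so explain()/execution hangs or OOMs.
    val projected = df.select(df.columns.map(c => col(c).as(c + "_out")): _*)
    val condition = projected.columns.map(c => col(c).isNotNull).reduce(_ && _)
    val filtered  = projected.filter(condition)

    filtered.explain(true)
    spark.stop()
  }
}
{code}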


was (Author: ashahid7):
[~maropu], [~sameerag]  [~jay.pranavamurthi] I have generated a PR for 
SPARK-3152 which fixes the OOM or unreasonable compile time in queries.

The PR is [pr-for-spark-33152|https://github.com/apache/spark/pull/30185]

I cannot get any body for code review.

The explanation of the logic used is in the PR.

If needed we can go through the code together. This is going to be used by 
workday in production.

> Map->filter on many columns gets stuck in constraint inference optimization 
> code
> --------------------------------------------------------------------------------
>
>                 Key: SPARK-19875
>                 URL: https://issues.apache.org/jira/browse/SPARK-19875
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.1.0
>            Reporter: Jay Pranavamurthi
>            Priority: Major
>              Labels: bulk-closed
>         Attachments: TestFilter.scala, test10cols.csv, test50cols.csv
>
>
> The attached code (TestFilter.scala) works with a 10-column csv dataset, but 
> gets stuck with a 50-column csv dataset. Both datasets are attached.


