[jira] [Comment Edited] (SPARK-1476) 2GB limit in spark for blocks

2019-12-17 Thread Shirish (Jira)
[ https://issues.apache.org/jira/browse/SPARK-1476?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16998752#comment-16998752 ] Shirish edited comment on SPARK-1476 at 12/18/19 2:38 AM: -- This …

[jira] [Commented] (SPARK-1476) 2GB limit in spark for blocks

2019-12-17 Thread Shirish (Jira)
[ https://issues.apache.org/jira/browse/SPARK-1476?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16998752#comment-16998752 ] Shirish commented on SPARK-1476: This is an old chain that I happened to land on. I am i…
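For context on the issue itself (background, not part of the quoted comments): the 2GB limit in SPARK-1476 comes from Spark blocks being backed by Java `ByteBuffer`s, which are indexed by a signed 32-bit `int`, so no single block can exceed `Integer.MAX_VALUE` bytes. The common workaround was to repartition so each block stays well under the cap. A minimal sketch of that sizing arithmetic (the safety factor and dataset size are illustrative, not from the thread):

```python
# A ByteBuffer index is a signed 32-bit int, so a block tops out
# just under 2 GiB.
MAX_BLOCK_BYTES = 2**31 - 1  # Integer.MAX_VALUE

def min_partitions(total_bytes: int, safety_factor: float = 0.5) -> int:
    """Rough lower bound on partition count so each block stays well
    under the 2 GiB cap. safety_factor leaves headroom for skew
    (0.5 is an illustrative choice, not a Spark default)."""
    target = int(MAX_BLOCK_BYTES * safety_factor)
    # Ceiling division: total_bytes / target, rounded up.
    return -(-total_bytes // target)

# E.g. a 100 GiB dataset would need at least this many partitions:
parts = min_partitions(100 * 2**30)
```

In Spark itself this would translate to something like `rdd.repartition(parts)`. Later Spark releases (2.4 onward) reworked block handling so the hard 2GB ceiling no longer applies in most paths.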

[jira] [Commented] (SPARK-12837) Spark driver requires large memory space for serialized results even there are no data collected to the driver

2017-02-08 Thread Shirish (JIRA)
[ https://issues.apache.org/jira/browse/SPARK-12837?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15859086#comment-15859086 ] Shirish commented on SPARK-12837: - Any workaround guys till we fix this? > Spark driver …

[jira] [Commented] (SPARK-12837) Spark driver requires large memory space for serialized results even there are no data collected to the driver

2017-01-17 Thread Shirish (JIRA)
[ https://issues.apache.org/jira/browse/SPARK-12837?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15826602#comment-15826602 ] Shirish commented on SPARK-12837: - I am using Spark 1.6 and I see this issue. I have an …
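As background on the workaround question above (not from the thread): even when nothing is explicitly collected, every task ships a serialized result (status, metrics, accumulator updates) back to the driver, and across many tasks these can add up. The usual mitigations are raising the driver heap and the aggregate result-size cap. A sketch with illustrative values (`my_job.py` and the sizes are hypothetical; this only gives the driver more headroom, it does not remove the overhead):

```shell
spark-submit \
  --conf spark.driver.memory=8g \
  --conf spark.driver.maxResultSize=4g \
  my_job.py
```

`spark.driver.maxResultSize` bounds the total serialized results of all partitions for a single action (default 1g); setting it to 0 disables the check entirely, at the risk of out-of-memory errors on the driver.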

[jira] [Commented] (SPARK-2620) case class cannot be used as key for reduce

2016-11-10 Thread Shirish (JIRA)
[ https://issues.apache.org/jira/browse/SPARK-2620?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15654830#comment-15654830 ] Shirish commented on SPARK-2620: Has this been resolved in 1.6? I am facing a similar issue …
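Since the snippet above asks about the status of SPARK-2620, a note on its mechanics (background, not from the thread): spark-shell compiles each REPL line into its own wrapper class, so a case class defined in the shell can end up with `equals`/`hashCode` behavior that does not match across closures; keys that print identically then never merge in `reduceByKey`. A plain-Python analogue of that failure mode (`BrokenKey`/`FixedKey` are hypothetical names for illustration, not Spark API):

```python
from collections import defaultdict
from dataclasses import dataclass

class BrokenKey:
    """No __eq__/__hash__: equality falls back to object identity,
    mimicking how REPL-defined case class keys can stop comparing
    equal across closures (the root cause behind SPARK-2620)."""
    def __init__(self, name):
        self.name = name

@dataclass(frozen=True)  # generates value-based __eq__ and __hash__
class FixedKey:
    name: str

def reduce_by_key(pairs):
    """Toy stand-in for RDD.reduceByKey(_ + _): group by key, sum values."""
    acc = defaultdict(int)
    for k, v in pairs:
        acc[k] += v
    return acc

broken = reduce_by_key([(BrokenKey("a"), 1), (BrokenKey("a"), 2)])
fixed = reduce_by_key([(FixedKey("a"), 1), (FixedKey("a"), 2)])
# broken ends up with two buckets (the keys never merge);
# fixed collapses to a single bucket whose value is the sum.
```

The Spark-side symptom is the same: `reduceByKey` silently produces one group per instance instead of merging them. Later Spark/REPL versions reworked the shell's class wrapping, which resolved the original report.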