[ https://issues.apache.org/jira/browse/SPARK-6235?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15412727#comment-15412727 ]

Brian commented on SPARK-6235:
------------------------------

How is it possible that Spark 2.0 is out and this bug still isn't solved?  A quick 
Google search for "Spark 2GB limit" or "Spark Integer.MAX_VALUE" shows that 
this is a very real problem that affects lots of users.  From the outside 
looking in, it seems like the Spark developers aren't interested in 
solving this bug, since it has been around for years at this point (including the 
JIRAs this consolidated ticket replaced).  Can you provide some sort of 
update?  If you don't plan on fixing this issue, you could close the ticket 
or mark it as won't-fix.  At least that way we'd have some insight into your 
plans. Thanks!

> Address various 2G limits
> -------------------------
>
>                 Key: SPARK-6235
>                 URL: https://issues.apache.org/jira/browse/SPARK-6235
>             Project: Spark
>          Issue Type: Umbrella
>          Components: Shuffle, Spark Core
>            Reporter: Reynold Xin
>
> An umbrella ticket to track the various 2G limits we have in Spark, due to the 
> use of byte arrays and ByteBuffers.
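
To make the limit concrete, here is a minimal Scala sketch. This is not Spark source; the object and value names are made up for illustration. The point it shows: JVM byte arrays and java.nio.ByteBuffer are sized and indexed by a signed 32-bit Int, so any single block backed by one of them cannot exceed Integer.MAX_VALUE (2147483647) bytes, roughly 2 GB.

{code:scala}
import java.nio.ByteBuffer

// Hypothetical demo object, not part of Spark.
object TwoGigLimitDemo {
  def main(args: Array[String]): Unit = {
    // ByteBuffer.allocate(capacity: Int) -- the capacity parameter is an Int,
    // so a buffer larger than Int.MaxValue bytes cannot even be requested
    // through this API; the same ceiling applies to Array[Byte].
    val small: ByteBuffer = ByteBuffer.allocate(1024)
    println(s"Allocated a ${small.capacity()}-byte buffer; the hard ceiling is ${Int.MaxValue} bytes")

    // A block larger than Int.MaxValue bytes (for example a 3 GB shuffle block
    // or cached partition) would have to be split across multiple buffers or a
    // chunked abstraction to fit in memory.
    val blockSize: Long = 3L * 1024 * 1024 * 1024
    if (blockSize > Int.MaxValue) {
      println(s"A $blockSize-byte block cannot be held in a single byte array or ByteBuffer")
    }
  }
}
{code}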


