[ https://issues.apache.org/jira/browse/SPARK-25904?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16679862#comment-16679862 ]
Apache Spark commented on SPARK-25904:
--------------------------------------

User 'squito' has created a pull request for this issue:
https://github.com/apache/spark/pull/22983

> Avoid allocating arrays too large for JVMs
> ------------------------------------------
>
>                 Key: SPARK-25904
>                 URL: https://issues.apache.org/jira/browse/SPARK-25904
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 2.4.0
>            Reporter: Imran Rashid
>            Assignee: Imran Rashid
>            Priority: Major
>             Fix For: 2.4.1, 3.0.0
>
>
> In a few places Spark can try to allocate arrays as big as {{Int.MaxValue}},
> but that's actually too big for the JVM. We should consistently use
> {{ByteArrayMethods.MAX_ROUNDED_ARRAY_LENGTH}} instead.
> In some cases this changes the default of a config, in some cases it adds a
> bound on a config, and in others it just improves the error message for
> things that still won't work.
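As a rough illustration of the idea (not the actual patch in the pull request above), the sketch below clamps a requested size to a bound just under {{Int.MaxValue}} before allocating. The object and method names here are hypothetical, and the exact margin (Int.MaxValue - 15, leaving room for JVM array headers) is an assumption for illustration; Spark defines the real bound in {{ByteArrayMethods.MAX_ROUNDED_ARRAY_LENGTH}}.

{code:scala}
// Minimal sketch of the clamping idea, not the actual Spark change.
// The margin of 15 below Int.MaxValue is an assumption here: most JVMs
// reject allocations at or very near Int.MaxValue with
// "Requested array size exceeds VM limit" due to array header overhead.
object ArraySizeCap {
  val MaxRoundedArrayLength: Int = Int.MaxValue - 15

  /** Clamp a requested size so the resulting array stays within what the JVM accepts. */
  def boundedArraySize(requested: Long): Int =
    math.min(requested, MaxRoundedArrayLength.toLong).toInt

  def main(args: Array[String]): Unit = {
    // `new Array[Byte](Int.MaxValue)` typically fails even with enough heap;
    // the clamped size is what buffer allocations should use instead.
    println(s"requested=${Int.MaxValue}, clamped=${boundedArraySize(Int.MaxValue.toLong)}")
  }
}
{code}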