You can check all the versions where the fix is available on the JIRA
ticket SPARK-23376. In any case, it will be in the upcoming 2.3.0 release.
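
In the meantime, a few generic memory-pressure mitigations sometimes help
with this class of error. A rough sketch follows; these settings are
workarounds rather than the actual fix, and the values are placeholders you
would tune for your cluster:

    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             # More memory per executor gives the in-memory sorter more room.
             .config("spark.executor.memory", "8g")
             # On YARN/EMR with Spark 2.2 the overhead setting is the
             # spark.yarn.* variant; the value is in megabytes.
             .config("spark.yarn.executor.memoryOverhead", "2048")
             # More shuffle partitions mean smaller partitions per task.
             .config("spark.sql.shuffle.partitions", "800")
             .getOrCreate())

Note that executor memory generally has to be set before the executors
launch, so on EMR you would normally pass these as --conf options to
spark-submit rather than set them on an already-running session.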

Thanks.

On 13 Feb 2018 9:09 a.m., "SNEHASISH DUTTA" <info.snehas...@gmail.com>
wrote:

> Hi,
>
> In which version of Spark will this fix be available?
> The deployment is on EMR.
>
> Regards,
> Snehasish
>
> On Fri, Feb 9, 2018 at 8:51 PM, Wenchen Fan <cloud0...@gmail.com> wrote:
>
>> It should be fixed by https://github.com/apache/spark/pull/20561 soon.
>>
>> On Fri, Feb 9, 2018 at 6:16 PM, Wenchen Fan <cloud0...@gmail.com> wrote:
>>
>>> This has been reported before: http://apache-spark-developers-list.1001551.n3.nabble.com/java-lang-IllegalStateException-There-is-no-space-for-new-record-tc20108.html
>>>
>>> I think we may have a real bug here, but we need a reproduction. Can you
>>> provide one? Thanks!
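>>>
>>> For example, a self-contained job in roughly this shape would help (the
>>> sizes and column names here are hypothetical, just to illustrate the kind
>>> of aggregation-heavy workload that exercises this code path):
>>>
>>>     from pyspark.sql import SparkSession, functions as F
>>>
>>>     spark = (SparkSession.builder
>>>              # Deliberately small executor memory so the hash aggregate
>>>              # runs out of pages and falls back to an external sorter.
>>>              .config("spark.executor.memory", "1g")
>>>              .getOrCreate())
>>>
>>>     # Many distinct keys so the aggregation map fills up.
>>>     df = spark.range(0, 50000000).withColumn("key", F.col("id") % 10000000)
>>>     df.groupBy("key").agg(F.count("*").alias("cnt")).show()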
>>>
>>> On Fri, Feb 9, 2018 at 5:59 PM, SNEHASISH DUTTA <
>>> info.snehas...@gmail.com> wrote:
>>>
>>>> Hi,
>>>>
>>>> I am facing the following error when running on EMR:
>>>>
>>>> Caused by: java.lang.IllegalStateException: There is no space for new record
>>>>         at org.apache.spark.util.collection.unsafe.sort.UnsafeInMemorySorter.insertRecord(UnsafeInMemorySorter.java:226)
>>>>         at org.apache.spark.sql.execution.UnsafeKVExternalSorter.<init>(UnsafeKVExternalSorter.java:132)
>>>>         at org.apache.spark.sql.execution.UnsafeFixedWidthAggregationMap.destructAndCreateExternalSorter(UnsafeFixedWidthAggregationMap.java:250)
>>>>
>>>> I am using PySpark 2.2. What Spark configuration should be changed to
>>>> resolve this?
>>>>
>>>>
>>>> Regards,
>>>> Snehasish
>>>>
>>>> On Fri, Feb 9, 2018 at 1:26 PM, SNEHASISH DUTTA <
>>>> info.snehas...@gmail.com> wrote:
>>>>
>>>>> Hi,
>>>>>
>>>>> I am facing the following error when running on EMR:
>>>>>
>>>>> Caused by: java.lang.IllegalStateException: There is no space for new record
>>>>>         at org.apache.spark.util.collection.unsafe.sort.UnsafeInMemorySorter.insertRecord(UnsafeInMemorySorter.java:226)
>>>>>         at org.apache.spark.sql.execution.UnsafeKVExternalSorter.<init>(UnsafeKVExternalSorter.java:132)
>>>>>         at org.apache.spark.sql.execution.UnsafeFixedWidthAggregationMap.destructAndCreateExternalSorter(UnsafeFixedWidthAggregationMap.java:250)
>>>>>
>>>>> I am using Spark 2.2. What Spark configuration should be changed to
>>>>> resolve this?
>>>>>
>>>>>
>>>>> Regards,
>>>>> Snehasish
>>>>>
>>>>
>>>>
>>>
>>
>
