See https://issues.apache.org/jira/browse/SPARK-18557
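
In the meantime, one way to make these leaks easier to triage is to turn the warning into a hard failure. As I understand it, the executor checks the internal (undocumented, test-oriented) flag spark.unsafe.exceptionOnMemoryLeak and throws instead of logging the "Managed memory leak detected" WARN when it is set. A sketch, with a hypothetical application class and jar name:

```shell
# Hedged sketch: spark.unsafe.exceptionOnMemoryLeak is an internal/testing
# flag; when true, the executor throws a SparkException on a detected
# managed-memory leak instead of only logging a WARN.
# com.example.MyApp and my-app.jar are placeholders for your application.
spark-submit \
  --conf spark.unsafe.exceptionOnMemoryLeak=true \
  --class com.example.MyApp \
  my-app.jar
```

That way the failing task gets a stack trace pointing at the consumer that did not free its memory, which should make for a much more useful bug report than the WARN lines alone.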

On Mon, Nov 21, 2016 at 1:16 PM, Nicholas Chammas <nicholas.cham...@gmail.com> wrote:

> I'm also curious about this. Is there something we can do to help
> troubleshoot these leaks and file useful bug reports?
>
> On Wed, Oct 12, 2016 at 4:33 PM vonnagy <i...@vadio.com> wrote:
>
>> I am getting excessive memory leak warnings when running multiple mappings
>> and aggregations on Datasets. Is there anything I should be looking for to
>> resolve this, or is this a known issue?
>>
>> WARN  [Executor task launch worker-0] org.apache.spark.memory.TaskMemoryManager - leak 16.3 MB memory from org.apache.spark.unsafe.map.BytesToBytesMap@33fb6a15
>> WARN  [Executor task launch worker-0] org.apache.spark.memory.TaskMemoryManager - leak a page: org.apache.spark.unsafe.memory.MemoryBlock@29e74a69 in task 88341
>> WARN  [Executor task launch worker-0] org.apache.spark.memory.TaskMemoryManager - leak a page: org.apache.spark.unsafe.memory.MemoryBlock@22316bec in task 88341
>> WARN  [Executor task launch worker-0] org.apache.spark.executor.Executor - Managed memory leak detected; size = 17039360 bytes, TID = 88341
>>
>> Thanks,
>>
>> Ivan
>>
>>
>>
>> --
>> View this message in context: http://apache-spark-
>> developers-list.1001551.n3.nabble.com/Memory-leak-
>> warnings-in-Spark-2-0-1-tp19424.html
>> Sent from the Apache Spark Developers List mailing list archive at
>> Nabble.com.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>
>>
