Is 1.6.1 going to be ready this week? I see that the last two unresolved
issues targeting 1.6.1 are now fixed:
<https://github.com/apache/spark/pull/11131>
<https://github.com/apache/spark/pull/10539>

On 3 February 2016 at 08:16, Daniel Darabos <
daniel.dara...@lynxanalytics.com> wrote:

>
> On Tue, Feb 2, 2016 at 7:10 PM, Michael Armbrust <mich...@databricks.com>
> wrote:
>
>> What about the memory leak bug?
>>> https://issues.apache.org/jira/browse/SPARK-11293
>>> Even after the memory rewrite in 1.6.0, it still happens in some cases.
>>> Will it be fixed for 1.6.1?
>>>
>>
>> I think we have enough issues queued up that I would not hold the release
>> for that, but if there is a patch we should try to review it. We can
>> always do 1.6.2 when more issues have been resolved. Is this an actual
>> issue that is affecting a production workload, or are we concerned about
>> an edge case?
>>
>
> The way we (Lynx Analytics) use RDDs, this affects almost everything we do
> in production. Thankfully it does not cause any failures; it just logs a
> lot of errors. The adverse effect may be that the memory manager does not
> have a fully accurate picture of memory use. But as long as the leak fits
> in the "other" (unmanaged) memory fraction, it will not cause problems. We
> don't see this as an urgent issue. Thanks!
>
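For anyone following along, here is a rough back-of-the-envelope sketch (my
own, not taken from the Spark source) of how the 1.6 unified memory manager
splits an executor heap, using what I believe are the 1.6 defaults (300 MB
reserved system memory, spark.memory.fraction = 0.75). The point is that
anything the manager loses track of effectively has to fit in the "other"
(unmanaged) region:

    object MemorySplitSketch {
      // Assumed Spark 1.6 defaults; check UnifiedMemoryManager in your build.
      val ReservedBytes  = 300L * 1024 * 1024  // reserved system memory
      val MemoryFraction = 0.75                // spark.memory.fraction

      def main(args: Array[String]): Unit = {
        val heapBytes = 8L * 1024 * 1024 * 1024          // example: 8 GB executor heap
        val usable    = heapBytes - ReservedBytes        // memory Spark considers usable
        val managed   = (usable * MemoryFraction).toLong // execution + storage regions
        val other     = usable - managed                 // user code, metadata, and any leak
        def mb(b: Long) = b / (1024 * 1024)
        println(s"managed: ${mb(managed)} MB, other (unmanaged): ${mb(other)} MB")
      }
    }

So with an 8 GB heap the unmanaged slice is roughly 2 GB under those
defaults, which is presumably why the leak stays harmless in practice.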
