Hi Michael,
What about the memory leak bug?
https://issues.apache.org/jira/browse/SPARK-11293
Even after the memory rewrite in 1.6.0, it still happens in some cases.
Will it be fixed for 1.6.1?
Thanks,

*Romi Kuntsman*, *Big Data Engineer*
http://www.totango.com

On Mon, Feb 1, 2016 at 9:59 PM, Michael Armbrust <mich...@databricks.com>
wrote:

> We typically do not allow changes to the classpath in maintenance releases.
>
> On Mon, Feb 1, 2016 at 8:16 AM, Hamel Kothari <hamelkoth...@gmail.com>
> wrote:
>
>> I noticed that the Jackson dependency was bumped to 2.5 in master for
>> something spark-streaming related. Is there any reason that this upgrade
>> can't be included with 1.6.1?
>>
>> According to later comments on this thread:
>> https://issues.apache.org/jira/browse/SPARK-8332 and my personal
>> experience, using Spark with Jackson 2.5 hasn't caused any issues, and it
>> does have some useful new features. It should be fully backwards
>> compatible according to the Jackson folks.
>>
>> On Mon, Feb 1, 2016 at 10:29 AM Ted Yu <yuzhih...@gmail.com> wrote:
>>
>>> SPARK-12624 has been resolved.
>>> According to Wenchen, SPARK-12783 is fixed in 1.6.0 release.
>>>
>>> Are there other blockers for Spark 1.6.1 ?
>>>
>>> Thanks
>>>
>>> On Wed, Jan 13, 2016 at 5:39 PM, Michael Armbrust <
>>> mich...@databricks.com> wrote:
>>>
>>>> Hey All,
>>>>
>>>> While I'm not aware of any critical issues with 1.6.0, there are
>>>> several corner cases that users are hitting with the Dataset API that are
>>>> fixed in branch-1.6.  As such, I'm considering a 1.6.1 release.
>>>>
>>>> At the moment there are only two critical issues targeted for 1.6.1:
>>>>  - SPARK-12624 - When schema is specified, we should treat undeclared
>>>> fields as null (in Python)
>>>>  - SPARK-12783 - Dataset map serialization error
>>>>
>>>> When these are resolved I'll likely begin the release process.  If
>>>> there are any other issues that we should wait for, please contact me.
>>>>
>>>> Michael
>>>>
>>>
>>>
>