With the current branch-2.1 after rc1 I am now also seeing this error in
our unit tests:

 java.lang.UnsupportedOperationException: Cannot create encoder for Option
of Product type, because Product type is represented as a row, and the
entire row can not be null in Spark SQL like normal databases. You can wrap
your type with Tuple1 if you do want top level null Product objects, e.g.
instead of creating `Dataset[Option[MyClass]]`, you can do something like
`val ds: Dataset[Tuple1[MyClass]] = Seq(Tuple1(MyClass(...)),
Tuple1(null)).toDS`

The issue is that we have Aggregator[String, Option[SomeCaseClass], String]
and it doesn't like creating the Encoder for that Option[SomeCaseClass]
anymore.
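
For reference, here is a minimal sketch of the kind of Aggregator that hits
this; SomeCaseClass and the first-value logic are made up for illustration,
not our actual code:

    import org.apache.spark.sql.{Encoder, Encoders}
    import org.apache.spark.sql.expressions.Aggregator

    case class SomeCaseClass(s: String)

    // keeps the first value seen per group; the buffer is Option[SomeCaseClass]
    object FirstValue extends Aggregator[String, Option[SomeCaseClass], String] {
      def zero: Option[SomeCaseClass] = None
      def reduce(b: Option[SomeCaseClass], a: String): Option[SomeCaseClass] =
        b.orElse(Some(SomeCaseClass(a)))
      def merge(b1: Option[SomeCaseClass], b2: Option[SomeCaseClass]): Option[SomeCaseClass] =
        b1.orElse(b2)
      def finish(r: Option[SomeCaseClass]): String = r.map(_.s).getOrElse("")
      // one way to derive the buffer encoder; this derivation is where the
      // UnsupportedOperationException above now gets thrown
      def bufferEncoder: Encoder[Option[SomeCaseClass]] =
        Encoders.product[Option[SomeCaseClass]]
      def outputEncoder: Encoder[String] = Encoders.STRING
    }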

This is related to SPARK-18251
<https://issues.apache.org/jira/browse/SPARK-18251>.
We have a workaround for this: we will wrap all buffer encoder types in
Tuple1. A little inefficient, but it's okay with me.
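
A sketch of what that Tuple1 workaround looks like, reusing the illustrative
SomeCaseClass and imports from the sketch above; the Option now sits one
level down, so the top-level buffer type is a Product that is never null:

    object FirstValueWrapped
        extends Aggregator[String, Tuple1[Option[SomeCaseClass]], String] {
      def zero: Tuple1[Option[SomeCaseClass]] = Tuple1(None)
      def reduce(b: Tuple1[Option[SomeCaseClass]], a: String): Tuple1[Option[SomeCaseClass]] =
        Tuple1(b._1.orElse(Some(SomeCaseClass(a))))
      def merge(b1: Tuple1[Option[SomeCaseClass]],
                b2: Tuple1[Option[SomeCaseClass]]): Tuple1[Option[SomeCaseClass]] =
        Tuple1(b1._1.orElse(b2._1))
      def finish(r: Tuple1[Option[SomeCaseClass]]): String = r._1.map(_.s).getOrElse("")
      // the top-level Tuple1 row is never null; the nested Option becomes a nullable field
      def bufferEncoder: Encoder[Tuple1[Option[SomeCaseClass]]] =
        Encoders.product[Tuple1[Option[SomeCaseClass]]]
      def outputEncoder: Encoder[String] = Encoders.STRING
    }

The only cost is the extra Tuple1 wrapper around every buffer value, which is
the inefficiency mentioned above.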

On Sun, Dec 4, 2016 at 11:16 PM, Koert Kuipers <ko...@tresata.com> wrote:

> somewhere between rc1 and the current head of branch-2.1 i started seeing
> an NPE in our in-house unit tests for Dataset + Aggregator. i created
> SPARK-18711 <https://issues.apache.org/jira/browse/SPARK-18711> for this.
>
> On Mon, Nov 28, 2016 at 8:25 PM, Reynold Xin <r...@databricks.com> wrote:
>
>> Please vote on releasing the following candidate as Apache Spark version
>> 2.1.0. The vote is open until Thursday, December 1, 2016 at 18:00 UTC and
>> passes if a majority of at least 3 +1 PMC votes are cast.
>>
>> [ ] +1 Release this package as Apache Spark 2.1.0
>> [ ] -1 Do not release this package because ...
>>
>>
>> To learn more about Apache Spark, please see http://spark.apache.org/
>>
>> The tag to be voted on is v2.1.0-rc1 (80aabc0bd33dc5661a90133156247e7a8c1bf7f5)
>>
>> The release files, including signatures, digests, etc. can be found at:
>> http://people.apache.org/~pwendell/spark-releases/spark-2.1.0-rc1-bin/
>>
>> Release artifacts are signed with the following key:
>> https://people.apache.org/keys/committer/pwendell.asc
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1216/
>>
>> The documentation corresponding to this release can be found at:
>> http://people.apache.org/~pwendell/spark-releases/spark-2.1.0-rc1-docs/
>>
>>
>> =======================================
>> How can I help test this release?
>> =======================================
>> If you are a Spark user, you can help us test this release by taking an
>> existing Spark workload and running on this release candidate, then
>> reporting any regressions.
>>
>> ===============================================================
>> What should happen to JIRA tickets still targeting 2.1.0?
>> ===============================================================
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should be
>> worked on immediately. Everything else please retarget to 2.1.1 or 2.2.0.
>>
>>
>>
>
