+1 (non-binding)

Ran license checks and various smoke tests for CREATE TABLE, UPDATE, MERGE
INTO, DELETE, etc., against Java 11 with Spark 3.2 and 3.1.

- Kyle Bendickson
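
For anyone following along with the verification steps mentioned in this
thread, here is a minimal sketch of the checksum check, assuming GNU
coreutils and a placeholder artifact name (not a real Iceberg release
file). It also illustrates the relative-path point Jack raises below:
`sha512sum -c` replays whatever path was recorded in the .sha512 file, so
a full build-machine path breaks verification for anyone who downloads
the artifact elsewhere.

```shell
# Sketch of checksum verification for a downloaded release artifact.
# The file name is a placeholder; requires GNU coreutils (sha512sum).
set -e
dir=$(mktemp -d)
cd "$dir"
printf 'release artifact bytes\n' > example-artifact.tar.gz

# Record the checksum with a RELATIVE path, so anyone who downloads both
# files into the same directory can verify them as-is.
sha512sum example-artifact.tar.gz > example-artifact.tar.gz.sha512

# Verification succeeds from any directory containing the two files;
# it would fail if the .sha512 file contained an absolute path.
sha512sum -c example-artifact.tar.gz.sha512
```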

On Mon, Feb 14, 2022 at 12:32 PM Ryan Blue <[email protected]> wrote:

> +1 (binding)
>
> * Ran license checks, verified checksum and signature
> * Built the project
>
> Thanks, Amogh and Jack for managing this release!
>
> On Sun, Feb 13, 2022 at 10:22 PM Jack Ye <[email protected]> wrote:
>
>> +1 (binding)
>>
>> Verified signature, checksum, and license. The checksum was generated
>> using the old buggy release script because it was executed in the 0.13.x
>> branch, so it still used the full file path. I have updated it to use the
>> relative file path. If anyone sees a checksum failure, please re-download
>> the checksum file and verify again.
>>
>> Ran unit tests for all engine versions and JDK versions, plus the AWS
>> integration tests. As for the flaky Spark test, given that #4033 fixes the
>> issue and it was not a bug in the source code, I think we can continue
>> without re-cutting a candidate.
>>
>> Tested basic operations, copy-on-write delete, update, and rewrite data
>> files on AWS EMR with Spark 3.1 and Flink 1.14, and verified fixes #3986
>> and #4024.
>>
>> I did some basic tests for #4023 (the predicate pushdown fix), but I don't
>> have a large Spark 3.2 installation to further verify the performance. It
>> would be great if anyone else could do some additional verification.
>>
>> Best,
>> Jack Ye
>>
>> On Fri, Feb 11, 2022 at 8:24 PM Manong Karl <[email protected]> wrote:
>>
>>> It's flaky. This exception only appears on one TeamCity agent; switching
>>> agents resolves the issue.
>>>
>>> On Sat, Feb 12, 2022 at 08:57, Ryan Blue <[email protected]> wrote:
>>>
>>>> Does that exception fail consistently, or is it a flaky test? We
>>>> recently fixed another Spark test that was flaky because of sampling and
>>>> sort order: https://github.com/apache/iceberg/pull/4033
>>>>
>>>> On Thu, Feb 10, 2022 at 7:12 PM Manong Karl <[email protected]>
>>>> wrote:
>>>>
>>>>> I hit a test failure on Spark 3.2:
>>>>> TestMergeOnReadDelete.testDeleteWithSerializableIsolation[catalogName =
>>>>> testhive, implementation = org.apache.iceberg.spark.SparkCatalog, config =
>>>>> {type=hive, default-namespace=default}, format = orc, vectorized = true,
>>>>> distributionMode = none] · Issue #4090 · apache/iceberg (github.com)
>>>>> <https://github.com/apache/iceberg/issues/4090>.
>>>>> Is anyone else seeing this exception?
>>>>>
>>>>
>>>>
>>>> --
>>>> Ryan Blue
>>>> Tabular
>>>>
>>>
>
> --
> Ryan Blue
> Tabular
>