+1 for 0.10.0 RC4.

Best,
Dongjoon.
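For anyone following along, the verification checklist quoted in the replies below can be sketched roughly as the shell session here. The artifact names are illustrative (substitute the real RC file names), the gpg/tar/gradle steps are left as comments since they need the actual downloads, and the checksum step is demonstrated on a throwaway file so it runs anywhere:

```shell
# Steps 1-3 (illustrative; run against the real dist.apache.org artifacts):
#   gpg --import KEYS
#   gpg --verify apache-iceberg-0.10.0.tar.gz.asc apache-iceberg-0.10.0.tar.gz
# A "This key is not certified with a trusted signature" warning is expected
# if you have not marked the signing key as trusted; the signature still verifies.

# Step 4, demonstrated on a throwaway file since the tarball is not fetched here:
echo "example release artifact" > artifact.tar.gz
sha512sum artifact.tar.gz > artifact.tar.gz.sha512
sha512sum -c artifact.tar.gz.sha512   # prints "artifact.tar.gz: OK"

# Steps 5-7 (illustrative): unpack, RAT license check, build with Java 8
#   tar xzf apache-iceberg-0.10.0.tar.gz && cd apache-iceberg-0.10.0
#   dev/check-license
#   ./gradlew build
```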

On Wed, Nov 4, 2020 at 7:17 PM Jingsong Li <jingsongl...@gmail.com> wrote:

> +1
>
> 1. Download the source tarball, signature (.asc), and checksum (.sha512):
>  OK
> 2. Import gpg keys: download KEYS and run gpg --import
> /path/to/downloaded/KEYS (optional if this hasn’t changed) :  OK
> 3. Verify the signature by running: gpg --verify
> apache-iceberg-xx.tar.gz.asc:  OK
> 4. Verify the checksum by running: sha512sum -c
> apache-iceberg-xx.tar.gz.sha512 :  OK
> 5. Untar the archive and go into the source directory: tar xzf
> apache-iceberg-xx.tar.gz && cd apache-iceberg-xx:  OK
> 6. Run RAT checks to validate license headers: dev/check-license: OK
> 7. Build and test the project: ./gradlew build (use Java 8) :   OK
>
> Best,
> Jingsong
>
> On Thu, Nov 5, 2020 at 7:38 AM Ryan Blue <rb...@netflix.com.invalid>
> wrote:
>
>> +1
>>
>>    - Validated checksum and signature
>>    - Ran license checks
>>    - Built and ran tests
>>    - Queried a Hadoop FS table created with 0.9.0 in Spark 3.0.1
>>    - Created a Hive table from Spark 3.0.1
>>    - Tested metadata tables from Spark
>>    - Tested Hive and Hadoop table reads in Hive 2.3.7
>>
>> I was able to read both Hadoop and Hive tables created in Spark from Hive
>> using:
>>
>> add jar /home/blue/Downloads/iceberg-hive-runtime-0.10.0.jar;
>> create external table hadoop_table
>>   stored by 'org.apache.iceberg.mr.hive.HiveIcebergStorageHandler'
>>   location 'file:/home/blue/tmp/hadoop-warehouse/default/test';
>> select * from hadoop_table;
>>
>> set iceberg.mr.catalog=hive;
>> select * from hive_table;
>>
>> The hive_table needed engine.hive.enabled=true to be set in its table
>> properties from Spark, using:
>>
>> alter table hive_table set tblproperties ('engine.hive.enabled'='true');
>>
>> Hive couldn’t read the #snapshots metadata table for Hadoop. It failed
>> with this error:
>>
>> Failed with exception 
>> java.io.IOException:org.apache.hadoop.hive.ql.metadata.HiveException: 
>> java.lang.ClassCastException: java.lang.Long cannot be cast to 
>> java.time.OffsetDateTime
>>
>> I also couldn’t read the Hadoop table once iceberg.mr.catalog was set in
>> my environment, so I think we have a bit more work to do to clean up Hive
>> table configuration.
>>
>> On Wed, Nov 4, 2020 at 12:54 AM Ryan Murray <rym...@dremio.com> wrote:
>>
>>> +1 (non-binding)
>>>
>>> 1. Download the source tarball, signature (.asc), and checksum
>>> (.sha512):   OK
>>> 2. Import gpg keys: download KEYS and run gpg --import
>>> /path/to/downloaded/KEYS (optional if this hasn’t changed) :  OK
>>> 3. Verify the signature by running: gpg --verify
>>> apache-iceberg-xx.tar.gz.asc:  I got a warning "gpg: WARNING: This key is
>>> not certified with a trusted signature! gpg:          There is no
>>> indication that the signature belongs to the owner." but it passed
>>> 4. Verify the checksum by running: sha512sum -c
>>> apache-iceberg-xx.tar.gz.sha512 :  OK
>>> 5. Untar the archive and go into the source directory: tar xzf
>>> apache-iceberg-xx.tar.gz && cd apache-iceberg-xx:  OK
>>> 6. Run RAT checks to validate license headers: dev/check-license: OK
>>> 7. Build and test the project: ./gradlew build (use Java 8 & Java 11) :
>>>  OK
>>>
>>>
>>> On Wed, Nov 4, 2020 at 2:56 AM OpenInx <open...@gmail.com> wrote:
>>>
>>>> +1 for 0.10.0 RC4
>>>>
>>>> 1. Download the source tarball, signature (.asc), and checksum
>>>> (.sha512):   OK
>>>> 2. Import gpg keys: download KEYS and run gpg --import
>>>> /path/to/downloaded/KEYS (optional if this hasn’t changed) :  OK
>>>> 3. Verify the signature by running: gpg --verify
>>>> apache-iceberg-xx.tar.gz.asc:  OK
>>>> 4. Verify the checksum by running: sha512sum -c
>>>> apache-iceberg-xx.tar.gz.sha512 :  OK
>>>> 5. Untar the archive and go into the source directory: tar xzf
>>>> apache-iceberg-xx.tar.gz && cd apache-iceberg-xx:  OK
>>>> 6. Run RAT checks to validate license headers: dev/check-license: OK
>>>> 7. Build and test the project: ./gradlew build (use Java 8) :   OK
>>>>
>>>> On Wed, Nov 4, 2020 at 8:25 AM Anton Okolnychyi
>>>> <aokolnyc...@apple.com.invalid> wrote:
>>>>
>>>>> Hi everyone,
>>>>>
>>>>> I propose the following RC to be released as official Apache Iceberg
>>>>> 0.10.0 release.
>>>>>
>>>>> The commit id is d39fad00b7dded98121368309f381473ec21e85f
>>>>> * This corresponds to the tag: apache-iceberg-0.10.0-rc4
>>>>> * https://github.com/apache/iceberg/commits/apache-iceberg-0.10.0-rc4
>>>>> *
>>>>> https://github.com/apache/iceberg/tree/d39fad00b7dded98121368309f381473ec21e85f
>>>>>
>>>>> The release tarball, signature, and checksums are here:
>>>>> *
>>>>> https://dist.apache.org/repos/dist/dev/iceberg/apache-iceberg-0.10.0-rc4/
>>>>>
>>>>> You can find the KEYS file here (make sure to import the new key that
>>>>> was used to sign the release):
>>>>> * https://dist.apache.org/repos/dist/dev/iceberg/KEYS
>>>>>
>>>>> Convenience binary artifacts are staged in Nexus. The Maven repository
>>>>> URL is:
>>>>> *
>>>>> https://repository.apache.org/content/repositories/orgapacheiceberg-1012
>>>>>
>>>>> This release includes important changes:
>>>>>
>>>>> * Flink support
>>>>> * Hive read support
>>>>> * ORC support fixes and improvements
>>>>> * Application of row-level delete files on read
>>>>> * Snapshot partition summary
>>>>> * Ability to load LocationProvider dynamically
>>>>> * Sort spec
>>>>>
>>>>> Please download, verify, and test.
>>>>>
>>>>> Please vote in the next 72 hours.
>>>>>
>>>>> [ ] +1 Release this as Apache Iceberg 0.10.0
>>>>> [ ] +0
>>>>> [ ] -1 Do not release this because…
>>>>>
>>>>> Thanks,
>>>>> Anton
>>>>>
>>>>
>>
>> --
>> Ryan Blue
>> Software Engineer
>> Netflix
>>
>
>
> --
> Best, Jingsong Lee
>
