I am pleased to announce that Hui Fei has accepted the invitation to become
a Hadoop committer.

He started contributing to the project in October 2016. Over the past four
years he has contributed extensively to HDFS, especially Erasure Coding,
the Hadoop 3 upgrade, Router-Based Federation (RBF), and Standby reads.

One of his biggest contributions is Hadoop 2 -> 3 rolling upgrade support.
The lack of a rolling upgrade path was a major blocker for existing Hadoop
users adopting Hadoop 3, and adoption has gone up since this work landed.
The community had long discussed rolling upgrade support as a must-have for
Hadoop 3, but no one took the initiative to make it happen. I am personally
very grateful for this.

His work on Erasure Coding (EC) is impressive as well. He onboarded EC in
production at scale, fixing many tricky problems along the way. I am equally
impressed by and grateful for his contributions in EC.

In addition to code contributions, he has invested a lot in the community:

   - Apache Hadoop Community 2019 Beijing Meetup
     https://blogs.apache.org/hadoop/entry/hadoop-community-meetup-beijing-aug
     where he discussed the operational experience of RBF in production

   - Apache Hadoop Storage Community Sync Online
     https://docs.google.com/document/d/1jXM5Ujvf-zhcyw_5kiQVx6g-HeKe-YGnFS_1-qFXomI/edit#heading=h.irqxw1iy16zo
     where he discussed the Hadoop 3 rolling upgrade support

Let's congratulate Hui for this new role!

Cheers,
Wei-Chiu Chuang (on behalf of the Apache Hadoop PMC)