On Tue, 2018-09-25 at 12:58 -0700, Zac Medico wrote:
> On 09/25/2018 12:55 PM, Zac Medico wrote:
> > On 09/25/2018 12:18 PM, Michał Górny wrote:
> > > Fix a crash due to a race condition in handling the same file being
> > > present in both compressed and uncompressed variants.  If that is
> > > the case, just queue the uncompressed variant for compression and
> > > ignore the compressed variants.
> > > 
> > > Bug: https://bugs.gentoo.org/667072
> > > Signed-off-by: Michał Górny <mgo...@gentoo.org>
> > > ---
> > >  bin/ecompress | 7 +++++++
> > >  1 file changed, 7 insertions(+)
> > > 
> > > diff --git a/bin/ecompress b/bin/ecompress
> > > index 36bdb585b..d5ff3796c 100755
> > > --- a/bin/ecompress
> > > +++ b/bin/ecompress
> > > @@ -49,6 +49,13 @@ while [[ $# -gt 0 ]] ; do
> > >                           find_args+=( -size "+${PORTAGE_DOCOMPRESS_SIZE_LIMIT}c" )
> > >  
> > >                   while IFS= read -d '' -r path; do
> > > +                         # if both compressed and uncompressed variants exist,
> > > +                         # skip the compressed variants (bug #667072)
> > > +                         case ${path} in
> > > +                                 *.Z|*.gz|*.bz2|*.lzma|*.xz)
> > > +                                         [[ -s ${path%.*} ]] && continue
> > > +                                         ;;
> > > +                         esac
> > 
> > In theory, we'd still have a problem if the file existed with multiple
> > compressions, right? Maybe a good solution is to strip the compression
> > extension here, and then have ecompress-file check for duplicates?
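
(For illustration only, stripping the suffix before queueing might look
roughly like the sketch below.  This is not part of the posted patch;
the helper name is made up for the example, and the duplicate check in
ecompress-file that Zac mentions is not shown here.)

  # strip_compression_suffix FILE
  # Print FILE with any known compression suffix removed, so that
  # foo, foo.gz and foo.bz2 all collapse to the same queued name
  # for later de-duplication.
  strip_compression_suffix() {
      local path=$1
      case ${path} in
          *.Z|*.gz|*.bz2|*.lzma|*.xz)
              path=${path%.*}
              ;;
      esac
      printf '%s\n' "${path}"
  }

The caller could then collect the stripped names, e.g. in a bash
associative array, and queue each name only once, leaving the actual
duplicate handling to ecompress-file as suggested.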
> 
> Alternatively, we could uncompress the pre-compressed files right here.

That would cause them to be decompressed even if the path is eventually
excluded via 'docompress -x'.
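
(For comparison, decompressing the pre-compressed variants in place, as
in the alternative quoted above, could look roughly like the sketch
below.  The helper name is invented for the example, and as noted it
would run before any later 'docompress -x' exclusions are known.)

  # decompress_variant FILE
  # Decompress FILE in place based on its suffix; -f overwrites an
  # existing uncompressed copy.  Illustrative only, without the error
  # handling the real ecompress would need.
  decompress_variant() {
      local path=$1
      case ${path} in
          *.Z)    uncompress -f "${path}" ;;
          *.gz)   gunzip -f "${path}" ;;
          *.bz2)  bunzip2 -f "${path}" ;;
          *.lzma) unlzma -f "${path}" ;;
          *.xz)   unxz -f "${path}" ;;
      esac
  }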

-- 
Best regards,
Michał Górny
