Re: [yocto] Safely cleaning 'downloads'

2015-10-02 Thread Laurentiu Palcu
Hi Garry,

It's probably a little late (I'm not reading the Yocto mailing lists
very often now), but some time ago I ran out of space myself and wrote
a script that I sent out on the oe-core mailing list. It did the job for
me. For some reason it didn't make it into master, but maybe you'll find
it useful.

I haven't tested it since then, though. Things may have changed, I don't
know. Give it a try if you wish. If you want to play it safe, I recommend
making a copy of your downloads directory first (if you have space left). :)

http://lists.openembedded.org/pipermail/openembedded-core/2015-June/106026.html

laurentiu

On Wed, Sep 30, 2015 at 06:13:39AM -0600, Gary Thomas wrote:
> Over time, I tend to build in the same build tree many, many
> times.  This leads to some really big trees as many things are
> duplicated, especially in the 'downloads' directory.
> 
> I use local mirrors and hence my 'downloads' directory is
> _mostly_ populated with symbolic links.  However, there are
> also expanded SCM packages, e.g. git2/xxx
> 
> How can I safely clean up the 'downloads' directory?  I already
> copy any created tarballs (I use BB_GENERATE_MIRROR_TARBALLS="1"
> to preclude unneeded downloads) to my mirror, but I'd like to
> periodically clean out the whole directory (without disturbing
> my builds of course).  I've found out the hard way that just
> emptying seems to be unsafe, at least for some recipes like
> the [RaspberryPi] Linux kernel recipe which once built seems
> to expect the expanded git2/xxx tree to remain.
> 
> Just trying to find ways to recover my lost GB...
> 
> Thanks for any ideas
> 
> -- 
> 
> Gary Thomas |  Consulting for the
> MLB Associates  |Embedded world
> 
> -- 
> ___
> yocto mailing list
> yocto@yoctoproject.org
> https://lists.yoctoproject.org/listinfo/yocto


Re: [yocto] Safely cleaning 'downloads'

2015-10-01 Thread Smith, Virgil
The following is roughly the procedure I follow, and it works for me.  Maybe
someone could chime in with how some of this could be trimmed based on
yocto/bitbake intent/design.
Even so, I'd probably stick with this level of extremism, because without a
known-good backup of your downloads (sources) you may be incapable of
(tweaking and) rebuilding your products if anything happens to your build
server.

The only reasons I've seen that simply deleting the downloads folder causes
problems are that external servers/content go away, rewrite their git
history, or replace files with non-identical contents.


Warning: the following does not preserve PR server information, so automatic
upgrading of your own packages could break.  If you rely on this, work out
how to extract that information (and back it up regularly).

1. Rename/move your current downloads folder and create a new one.
2. For all of your product build configurations, empty out the following folders:
2.1 cache
2.2 state-cache
2.3 tmp
3. Build (bitbake) all your product images with all appropriate configuration
variances.
4. Run the following command to extract the unexpanded sources from downloads
(sources-tmp must already exist):
find -H downloads -maxdepth 1 \
 -not -type d   -and   -not -name "*.done" \
 -exec cp -L {} sources-tmp \;

You now have everything you *currently* need for a sources mirror in the 
sources-tmp folder.

5. Move sources-tmp to wherever/whatever backs your SOURCE_MIRROR_URL.
6. Check those contents into some form of revision control (even if that is
just a manual set of backup folders/media).
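A self-contained sketch of the step 4 harvest (directory names as in the steps
above; treat it as illustrative, not canonical):

```shell
# Sketch of step 4: harvest the top-level download artifacts into
# sources-tmp, dereferencing mirror symlinks as we go.
mkdir -p sources-tmp
# -H: follow the 'downloads' argument itself if it is a symlink
# -not -type d: skips git2/, svn/, etc. (expanded SCM checkouts)
# -not -name "*.done": skips bitbake's fetch stamp files
find -H downloads -maxdepth 1 \
     -not -type d -and -not -name "*.done" \
     -exec cp -L {} sources-tmp/ \;
```

Note that .lock files are still copied by this filter; harmless, but you may
want to exclude them too.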


Yes, this costs time and space; you just have to decide how much your images,
and the ability to reproduce them (with or without 'small' changes), are
worth.


> -----Original Message-----
> From: yocto-boun...@yoctoproject.org [mailto:yocto-
> boun...@yoctoproject.org] On Behalf Of Gary Thomas
> Sent: Wednesday, September 30, 2015 7:14 AM
> To: Yocto Project
> Subject: [yocto] Safely cleaning 'downloads'
>
> [...]



Notice to recipient: This email is meant for only the intended recipient of the 
transmission, and may be a communication privileged by law, subject to export 
control restrictions or that otherwise contains proprietary information. If you 
receive this email by mistake, please notify us immediately by replying to this 
message and then destroy it and do not review, disclose, copy or distribute it. 
Thank you in advance for your cooperation.


Re: [yocto] Safely cleaning 'downloads'

2015-10-01 Thread Christopher Larson
On Thu, Oct 1, 2015 at 9:49 AM, Gary Thomas  wrote:

> On 2015-10-01 10:38, Smith, Virgil wrote:
>
>> [...]
>
> I'm already doing more or less this same sequence.  I use these commands to
> stage the downloaded files to my mirror (/work/misc/Poky/sources for
> historical reasons):
>   ( cd downloads;
>     find . -maxdepth 1 -type f | grep -v '\.lock$' | grep -v '\.done$' >/tmp/files.$$;
>     rsync -auv --files-from=/tmp/files.$$ . /work/misc/Poky/sources
>   )
>
> This works very well (I've been doing it for many years).  The issue I'm
> trying to work on now is that my script leaves 'downloads' possibly full of
> files, especially if there are new versions that have just been downloaded.
> This is especially noticeable for the tarballs of GIT trees - there are a
> number that I need/use that are measured in gigabytes (e.g. the RaspberryPi
> board firmware is 4194568645 bytes as of 2015-07-20!)  Once I've saved these
> to my mirror(s), I'd like to be able to purge them from the local download
> directory in my builds.  As mentioned, I've found that just wiping that in a
> build tree tends to break things quite badly.  Of course I can always start
> over with a new build tree, but that also defeats the purpose of incremental
> builds.


I'd think something like this would get the job done:

1. Do a build of all your supported machines and configurations with
BB_GENERATE_MIRROR_TARBALLS=1 to ensure you have current, not out-of-date,
SCM tarballs.

2. Set up builds of all your supported machines and configurations, using a
new DL_DIR, with PREMIRRORS pointing to the old DL_DIR.

3. Either clean up the old DL_DIR by access time (pruning files not accessed
since before you kicked off the builds), or resolve the symlinks in the new
DL_DIR and remove the old one.
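For step 2, a minimal local.conf sketch for the second-phase build might look
like this (the paths are placeholders, and the `_prepend` override style
matches the 2015-era syntax; newer releases use `:prepend`):

```conf
# Second-phase build: fetch into a fresh DL_DIR...
DL_DIR = "${TOPDIR}/downloads-new"
# ...but try the old downloads directory first, before any upstream fetch.
PREMIRRORS_prepend = "\
    git://.*/.*     file:///path/to/old-downloads/ \n \
    ftp://.*/.*     file:///path/to/old-downloads/ \n \
    http://.*/.*    file:///path/to/old-downloads/ \n \
    https://.*/.*   file:///path/to/old-downloads/ \n"
# Keep generating SCM tarballs so expanded git2/ trees need not be preserved.
BB_GENERATE_MIRROR_TARBALLS = "1"
```

With this in place, running the fetchall task (as Martin suggests elsewhere in
the thread) populates the new DL_DIR without needing a full rebuild.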
-- 
Christopher Larson
clarson at kergoth dot com
Founder - BitBake, OpenEmbedded, OpenZaurus
Maintainer - Tslib
Senior Software Engineer, Mentor Graphics


Re: [yocto] Safely cleaning 'downloads'

2015-10-01 Thread Gary Thomas

On 2015-10-01 10:38, Smith, Virgil wrote:

[...]

I'm already doing more or less this same sequence.  I use these commands to
stage the downloaded files to my mirror (/work/misc/Poky/sources for
historical reasons):
  ( cd downloads;
    find . -maxdepth 1 -type f | grep -v '\.lock$' | grep -v '\.done$' >/tmp/files.$$;
    rsync -auv --files-from=/tmp/files.$$ . /work/misc/Poky/sources
  )

This works very well (I've been doing it for many years).  The issue I'm
trying to work on now is that my script leaves 'downloads' possibly full of
files, especially if there are new versions that have just been downloaded.
This is especially noticeable for the tarballs of GIT trees - there are a
number that I need/use that are measured in gigabytes (e.g. the RaspberryPi
board firmware is 4194568645 bytes as of 2015-07-20!)  Once I've saved these
to my mirror(s), I'd like to be able to purge them from the local download
directory in my builds.  As mentioned, I've found that just wiping that in a
build tree tends to break things quite badly.  Of course I can always start
over with a new build tree, but that also defeats the purpose of incremental
builds.






Re: [yocto] Safely cleaning 'downloads'

2015-10-01 Thread Martin Jansa
On Thu, Oct 01, 2015 at 09:54:51AM -0700, Christopher Larson wrote:
> On Thu, Oct 1, 2015 at 9:49 AM, Gary Thomas  wrote:
> 
> > On 2015-10-01 10:38, Smith, Virgil wrote:
> >
> >> [...]
> >
> > [...]
> 
> 
> I'd think something like this would get the job done:
> 
> 1. Do a build of all your supported machines and configurations with
> BB_GENERATE_MIRROR_TARBALLS=1 to ensure you have current, not out of date
> scm tarballs.
> 
> 2. Set up builds of all your supported machines and configurations, using a
> new DL_DIR, with PREMIRRORS pointing to the old DL_DIR.
> 
> 3. Either clean up the old DL_DIR by access time before you kicked off the
> builds, or resolve the symlinks in the new DL_DIR and remove the old.

I'm doing the same, but make sure not to re-use sstate in the 2nd build;
otherwise many components can be re-used from sstate without the need to
download their sources.

It's easier to use the fetchall task instead of an actual build in the 2nd
step.

Similarly, when doing the same to clean sstate-cache (if you don't trust
sstate-cache-management.sh), you can end up with the 2nd build completely
built from sstate while the sstate for many intermediate dependencies was
never accessed - you can almost build a whole image just by reusing the
packagedata sstate archives for all included packages, but once you modify
one of them you'll need the do_populate_sysroot archives for all of its
dependencies.

Regards,

-- 
Martin 'JaMa' Jansa jabber: martin.ja...@gmail.com




Re: [yocto] Safely cleaning 'downloads'

2015-10-01 Thread Gary Thomas

On 2015-10-01 10:54, Christopher Larson wrote:


[...]


I'd think something like this would get the job done:

1. Do a build of all your supported machines and configurations with
BB_GENERATE_MIRROR_TARBALLS=1 to ensure you have current, not out-of-date,
SCM tarballs.

2. Set up builds of all your supported machines and configurations, using a
new DL_DIR, with PREMIRRORS pointing to the old DL_DIR.

3. Either clean up the old DL_DIR by access time before you kicked off the
builds, or resolve the symlinks in the new DL_DIR and remove the old.


Still not terribly different than what I'm doing - I already use
BB_GENERATE_MIRROR_TARBALLS, which is what leads to the giant tarballs in my
downloads (and later mirror(s)).  All of that works great.  I just want to be
able to clean up my downloads directory after a successful build and still be
able to do an incremental build in that tree.

Here's an example with more details:
  1. Set up a [virgin] build tree for some target.  This establishes
     use of premirrors, etc.  Also sets BB_GENERATE_MIRROR_TARBALLS so
     any new SCM packages will get saved.
  2. Build the desired images.
  3. At this point, my 'downloads' directory has the new SCM (and other)
     files that were needed that were not already in the mirrors.
  4. Save any downloaded file updates (using my script/commands as above)
     to my mirror(s).
  5. Purge anything that is in 'downloads' that is now in the mirror(s).
  6. Sometime later, presumably after some package updates, metadata changes,
     etc., rebuild the same target images in this same tree - an incremental
     rebuild.
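One hedged sketch of step 5, assuming the mirror layout from the staging
commands above (the byte-for-byte cmp check and the .lock/.done exclusions are
my own precautions, not an established recipe):

```shell
#!/bin/sh
# Remove top-level files from 'downloads' whose identical copy is already
# in the mirror; leave stamp/lock files and expanded SCM trees (git2/) alone.
MIRROR=/work/misc/Poky/sources   # example path from earlier in the thread
for f in downloads/*; do
    [ -f "$f" ] || continue               # skip git2/ and other directories
    case "$f" in
        *.lock|*.done) continue ;;        # keep bitbake's bookkeeping files
    esac
    m="$MIRROR/$(basename "$f")"
    # only delete when the mirror really holds the same bytes
    if [ -f "$m" ] && cmp -s "$f" "$m"; then
        rm "$f"
    fi
done
```

Since downloads is mostly symlinks into the mirror, most entries compare equal
and only the symlink is removed; genuinely new files stay until staged.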

I've been doing this sequence (except for step 5) for a long time and
everything works great; my only question is how to safely

[yocto] Safely cleaning 'downloads'

2015-09-30 Thread Gary Thomas

Over time, I tend to build in the same build tree many, many
times.  This leads to some really big trees as many things are
duplicated, especially in the 'downloads' directory.

I use local mirrors and hence my 'downloads' directory is
_mostly_ populated with symbolic links.  However, there are
also expanded SCM packages, e.g. git2/xxx

How can I safely clean up the 'downloads' directory?  I already
copy any created tarballs (I use BB_GENERATE_MIRROR_TARBALLS="1"
to preclude unneeded downloads) to my mirror, but I'd like to
periodically clean out the whole directory (without disturbing
my builds of course).  I've found out the hard way that just
emptying seems to be unsafe, at least for some recipes like
the [RaspberryPi] Linux kernel recipe which once built seems
to expect the expanded git2/xxx tree to remain.

Just trying to find ways to recover my lost GB...

Thanks for any ideas

--

Gary Thomas |  Consulting for the
MLB Associates  |Embedded world
