On 11/17/2012 09:10 AM, Dmitrijs Ledkovs wrote:
On 17 November 2012 18:33, Enrico Weigelt <enrico.weig...@vnc.biz> wrote:
Hi folks,

I'm regularly building quite large packages with heavy dependency chains,
e.g. libreoffice, using git-buildpackage. And it's really slow.

Is there any way for speeding up the builds ?

I'm already using cowbuilder, but it only seems to reuse an existing
base system tree, while it still needs to install all the
dependencies one by one.

Is it possible to apply similar logic to the dependencies?
(Something like a tweaked dpkg that fetches everything from
per-package directories instead of *.deb files and just hardlinks
instead of copying?)
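For what it's worth, the hardlink trick hinted at above is cheap to sketch on its own: GNU cp can clone an entire tree with hardlinks instead of copying file data. The paths below are throwaway examples, not anything dpkg actually does:

```shell
# Sketch of "hardlink instead of copy": cp -al clones a tree via
# hardlinks, so no file data is duplicated on disk.
src=$(mktemp -d); dst=$src-clone
echo "payload" > "$src/file"
cp -al "$src" "$dst"
# Both paths now point at the same inode; the clone copied no data.
[ "$(stat -c %i "$src/file")" = "$(stat -c %i "$dst/file")" ] && echo same-inode
```

This is essentially what copy-on-write helpers like cowbuilder do for the base chroot; the question is about extending it to the installed dependencies.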

* use eatmydata
* use a local caching proxy (apt-cacher-ng)

eatmydata - reduces IO by faking fsync, which speeds up dpkg installs a
lot (note: this may cause failures in test suites that rely on
fsync)
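A minimal sketch of the simplest way to use it, assuming the eatmydata package from the archive (the git-buildpackage flags are just the usual unsigned-build example):

```shell
# Install the fsync-disabling LD_PRELOAD wrapper.
sudo apt-get install eatmydata
# Wrap the whole build; fsync()/sync() calls inside become no-ops.
eatmydata git-buildpackage -us -uc
```

The wrapper only affects processes it spawns, so nothing outside the build loses durability guarantees.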

apt-cacher-ng starts a local proxy on your machine, which can be used
as an apt proxy or even as a "full" mirror; if it doesn't have a
package cached, it simply fetches it over the network. For a common set
of regular builds that greatly speeds things up.
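A rough setup sketch (apt-cacher-ng listens on localhost:3142 by default; the config filename below is an arbitrary choice):

```shell
# Install the caching proxy; it listens on localhost:3142 by default.
sudo apt-get install apt-cacher-ng
# Point apt (and thus chroot builds that inherit this config) at the cache.
echo 'Acquire::http::Proxy "http://localhost:3142";' | \
    sudo tee /etc/apt/apt.conf.d/01acng
```

After the first build, repeated dependency installs are served from the local cache instead of the mirror.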

Use sbuild, it's faster. There is a handy mk-sbuild utility in
ubuntu-dev-tools that can create schroots for you (it even has an
eatmydata option).

You either want a clean environment or you don't ;-) so you do have
to pay for the clean room.

Regards,

Dmitrijs.

If you're routinely rebuilding packages, you may see some benefit from using ccache as well. There are instructions on the sbuild wiki page on how to utilise it: http://wiki.debian.org/sbuild
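The core of the wiki recipe, very roughly sketched (the cache path is an example; the wiki also covers setting CCACHE_DIR and putting ccache first in PATH inside the chroot):

```shell
# Install ccache on the host and share its cache with the sbuild chroots
# via a bind mount in schroot's sbuild profile fstab.
sudo apt-get install ccache
echo '/var/cache/ccache /var/cache/ccache none rw,bind 0 0' | \
    sudo tee -a /etc/schroot/sbuild/fstab
```

That way repeated compiles of the same sources (common when iterating on packaging) hit the compiler cache instead of rebuilding every object file.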

Paul

--
Ubuntu-devel-discuss mailing list
Ubuntu-devel-discuss@lists.ubuntu.com
Modify settings or unsubscribe at: 
https://lists.ubuntu.com/mailman/listinfo/ubuntu-devel-discuss
