On 21 Jul 2014, at 19:14, kp kirchdoerfer <kap...@users.sourceforge.net> wrote:

> On Monday, 21 July 2014 at 13:36:51, Yves Blusseau wrote:
>> On 21 Jul 2014, at 11:17, kp kirchdoerfer <kap...@users.sourceforge.net> wrote:
>>> Hi Gents;
>>> 
>>> On Monday, 21 July 2014 at 10:22:55, Yves Blusseau wrote:
>>>> On 21 Jul 2014, at 09:51, Andrew <ni...@seti.kr.ua> wrote:
>>>>> On 21.07.2014 at 09:48, Yves Blusseau wrote:
>>>>>> On 20 Jul 2014, at 21:42, Andrew <ni...@seti.kr.ua> wrote:
>>>>>>> On 20.07.2014 at 21:12, Yves Blusseau wrote:
>>>>>>>> On 20 Jul 2014, at 19:39, kp kirchdoerfer <kap...@users.sourceforge.net> wrote:
>>>>>>>>> On Sunday, 20 July 2014 at 19:22:18, Yves Blusseau wrote:
>>>>>>>>>> On 20 Jul 2014, at 19:04, kp kirchdoerfer <kap...@users.sourceforge.net> wrote:
>>>>>>>>>>> On Sunday, 20 July 2014 at 18:40:15, Yves Blusseau wrote:
>>>>>>>>>>>> On 20 Jul 2014, at 18:22, kp kirchdoerfer <kap...@users.sourceforge.net> wrote:
>>>>>>>>>>>>> Hi Yves;
>>>>>>>>>>>>> 
>>>>>>>>>>>>> On Sunday, 20 July 2014 at 18:09:00, Yves Blusseau wrote:
>>>>>>>>>>>>>> Hi all,
>>>>>>>>>>>>>> 
>>>>>>>>>>>>>> currently the size of the git repository (I'm speaking of the
>>>>>>>>>>>>>> "database", not the checkout) is about 3.5GB. It's really too
>>>>>>>>>>>>>> big. The problem is that the "database" contains ALL the
>>>>>>>>>>>>>> versions of the "tar source files". So if someone needs to
>>>>>>>>>>>>>> clone our repository, they need to download at least 3.5GB of
>>>>>>>>>>>>>> data.
>>>>>>>>>>>>>> 
>>>>>>>>>>>>>> What I propose is to remove all the external source tar files
>>>>>>>>>>>>>> from the history and put them (at least the latest ones) in
>>>>>>>>>>>>>> another git repository on SF. The source tar files will be
>>>>>>>>>>>>>> downloaded (if needed) from this new git repository (using the
>>>>>>>>>>>>>> http protocol). With this, the bering-uclibc repository will
>>>>>>>>>>>>>> contain only text files and some patches, and I think it will
>>>>>>>>>>>>>> take only a few MB. We will continue to create a sources.tgz
>>>>>>>>>>>>>> file when we release a new version, to meet the requirements
>>>>>>>>>>>>>> of SF.
>>>>>>>>>>>>>> 
>>>>>>>>>>>>>> What do you think about this idea? If you agree, I can set up
>>>>>>>>>>>>>> the process to "clean" the repository and update the
>>>>>>>>>>>>>> buildtool.cfg files to change the repo from local to SF.
>>>>>>>>>>>>> 
>>>>>>>>>>>>> The way I work is to copy the current repo to a local
>>>>>>>>>>>>> directory and use it as the main server:
>>>>>>>>>>>>> 
>>>>>>>>>>>>> <Server localrepo>
>>>>>>>>>>>>> 
>>>>>>>>>>>>>    Type = filesymlnk
>>>>>>>>>>>>>    Serverpath = repo
>>>>>>>>>>>>> 
>>>>>>>>>>>>> </Server>
>>>>>>>>>>>>> 
>>>>>>>>>>>>> Whatever I do in the local directory won't clash with changes
>>>>>>>>>>>>> in git, and vice versa I won't break git :)
>>>>>>>>>>>>> 
>>>>>>>>>>>>> So if we do have two repos (one for the sources, one for the
>>>>>>>>>>>>> buildtool.* files and patches etc.), I'd prefer that buildtool
>>>>>>>>>>>>> still be able to use local directories (in the first place)
>>>>>>>>>>>>> and only download from SF if sources are missing.
>>>>>>>>>>>> 
>>>>>>>>>>>> The simplest is to declare where to get the tar files with
>>>>>>>>>>>> something like:
>>>>>>>>>>>> <Server sourceforge>
>>>>>>>>>>>> 
>>>>>>>>>>>>    Type = http
>>>>>>>>>>>>    Serverpath = xxxx
>>>>>>>>>>>> 
>>>>>>>>>>>> </Server>
>>>>>>>>>>>> And buildtool will download the file only if it has not already
>>>>>>>>>>>> been downloaded into the repo directory. So if the tar files are
>>>>>>>>>>>> already in the repo directory, you can build all the lrp
>>>>>>>>>>>> packages offline.
>>>>>>>>>>> 
>>>>>>>>>>> That sounds good.
>>>>>>>>>>> 
>>>>>>>>>>>> For the sources.tgz you only have to change the definition of
>>>>>>>>>>>> Server sourceforge to:
>>>>>>>>>>>> <Server sourceforge>
>>>>>>>>>>>> 
>>>>>>>>>>>>    Type = filesymlnk
>>>>>>>>>>>>    Serverpath = repo
>>>>>>>>>>>> 
>>>>>>>>>>>> </Server>
>>>>>>>>>>>> because all the files are already in the sources.tgz
>>>>>>>>>>> 
>>>>>>>>>>> Just to understand:
>>>>>>>>>>> 
>>>>>>>>>>> Is "sources.tgz" meant as an example like "linux-3.10.47.tar.gz",
>>>>>>>>>>> or do you refer to a tgz file that contains *all* sources in one
>>>>>>>>>>> file?
>>>>>>>>>> 
>>>>>>>>>> About "sources.tgz": I refer to a tgz file that contains "all"
>>>>>>>>>> the sources in one file.
>>>>>>>>> 
>>>>>>>>> And how do we maintain that file if we only upgrade one source,
>>>>>>>>> like the kernel? Repackage sources.tgz and upload a huge file?
>>>>>>>> 
>>>>>>>> Yes, if it is a requirement for SF.
>>>>>>> 
>>>>>>> Uploading 600+MB on every minor change is ugly IMHO... And it'll
>>>>>>> cause a headache to make changes for multiple branches.
>>>>>> 
>>>>>> That is already the case (we are speaking about the big tgz file
>>>>>> that must be provided in the files area to meet the SF requirements).
>>>>>> 
>>>>>> About the repo tar files, I think it would be better to download
>>>>>> them directly from the original sites, as nowadays all the big sites
>>>>>> are redundant and provide many mirrors. So we don't have to commit
>>>>>> them to the SF repository. Only the files that are difficult to find
>>>>>> or download would be put in the SF repository.
>>>>>> 
>>>>>> Regards,
>>>>>> Yves
>>>>> 
>>>>> The same scheme existed earlier. But some of the sources were updated
>>>>> and old versions were replaced by new ones, and some sources were
>>>>> simply lost when their hosting crashed (for ex., kernel 2.6.35.14)…
>>>> 
>>>> But that will be easier than committing to SF each source tar file
>>>> that changes. Also, we have the big sources.tgz file that contains the
>>>> tar files, so it can be used as a backup if a source tar file is not
>>>> available anymore.
>>> 
>>> I believe we do want to achieve three goals:
>>> 
>>> 1) keep the git repo for our build environment small
>>> 
>>> 2)  allow builds with sources from a local repo/directory to speed up the
>>> build process and allow working off-line
>>> 
>>> 3) provide all the sources to fulfil SF requirements
>>> 
>>> 2 and 3 work, but with all the upstream source files in the git repo,
>>> it is becoming really huge, as Yves pointed out.
>>> 
>>> Downloading from upstream servers is slow, means more maintenance work
>>> if the sources are moved around on the upstream servers, and sometimes
>>> they simply get lost.
>>> 
>>> Yves made the proposal to create a third git repo (alongside the repo
>>> containing the buildtool files and patches, and the repo for the
>>> Packages (lrp)).
>>> 
>>> What about committing the upstream sources as *single* files into this
>>> repo (see Debian sources for example)? One can fetch it into a local
>>> repository, and buildtool "downloads" the upstream sources from that
>>> local repository. We may even enhance buildtool to download from the
>>> repo directly if sources are not available locally, just in case...
>>> 
>>> That way upstream sources can be changed easily without dealing with a
>>> 600MB file each time a source is updated.
>>> 
>>> We will have two repos to maintain if a package is updated
>>> (buildtool.cfg in one repo, and the corresponding upstream source in
>>> the other), but we could achieve all the goals outlined above.
>>> 
>>> Anything wrong with that approach?
>> 
>> Seems OK to me.
>> 
>> What I was thinking is to modify buildtool a little to download source
>> tar files into a cache directory; a hardlink is then created from the
>> sources to the cache (just as a symbolic link is currently created from
>> the source to the repo dir for type = symlink (local repo)). When we run
>> a buildtool.pl source command, if the file is not in the cache directory
>> it is downloaded (and verified with a checksum), then a hardlink is
>> created if all is OK. With this solution we don't have to download the
>> same file for multiple architectures, and the verification of the
>> checksum is done once (after the download). Using hardlinks helps a lot
>> to manage the cache directory: if no link points to a file in the cache
>> directory, the file is certainly old and can be removed with a command
>> like buildtool cache cleanup.
> 

I just made a new version of buildtool that can use a cache directory and
uses checksums to verify that the files have been downloaded correctly.
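
Roughly, the cache logic looks like the sketch below. This is only a
minimal illustration under my assumptions (a hypothetical fetch_with_cache
helper, wget as the downloader); it is not the actual buildtool.pl code:

    use strict;
    use warnings;
    use Digest::SHA;
    use File::Basename qw(dirname);
    use File::Path qw(make_path);

    # Download a file into the cache (only if missing), verify its SHA-1
    # once, then hardlink the cached copy into the build tree.
    sub fetch_with_cache {
        my ($url, $cache_file, $target, $sha1) = @_;

        unless (-e $cache_file) {
            make_path(dirname($cache_file));
            system('wget', '-q', '-O', $cache_file, $url) == 0
                or die "download of $url failed\n";
            my $got = Digest::SHA->new(1)->addfile($cache_file)->hexdigest;
            if ($got ne $sha1) {
                unlink $cache_file;
                die "checksum mismatch for $cache_file\n";
            }
        }

        # All architectures share the one cached copy via hardlinks.
        link($cache_file, $target) or die "hardlink failed: $!\n";
    }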

> Where do I download new sources locally if I want to test an upgrade?
> 
> Today I have a local copy of the git repo within which I can play with
> updates and test if they build etc... If everything is OK, I'll move the
> changes back to the git repo and commit.
> 
The structure of the cache directory is this:
source/cache/servername/package/

So for example, for kernel.org and the linux package I have:
source/cache/kernel.org/linux/

If we now have a new repository on SF (let's name it source-files) and the
configuration is something like this:
<Server sourceforge-sourcefiles>
        Type = gitweb
        Name = sourceforge.net
        Serverpath = p/leaf/sources-files
</Server>
<File linux-__KBRANCH__.tar.xz>
        Server = sourceforge-sourcefiles
        Branch = master
        Directory = linux
        sha1sum = a649d5c15f68ccbae1c4863e357bdc48da4cc0b4
        envname = KERNEL_BASE_SOURCE
</File>
The linux-__KBRANCH__.tar.xz file will be downloaded to
source/cache/sourceforge-sourcefiles/linux/

And now, if you clone the SF source-files repository into
source/cache/sourceforge-sourcefiles/linux/ you will be in the same
position as before: you have a local copy of the SF sources-files repo
where you can put files for testing. Once everything is done, commit them
directly.
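
For example, using SourceForge's usual read-only git URL scheme (an
assumption on my side, since the repository does not exist yet; adjust the
target path so the clone lands where buildtool expects the cached files):

    git clone git://git.code.sf.net/p/leaf/sources-files \
        source/cache/sourceforge-sourcefiles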

> 
>> About where to
>> get the source tar files, we can use a new repo on SF, but we will need to
>> upload each new version of the tar file to the new SF repository.
> 
>> I think it is easier to download the files from the upstream servers,
>> and put only our binary files on the new SF repo. In case a file is
>> removed upstream, we have the big sources.tgz file (made for each
>> release) to recover the tar file and put it on the new SF repo.
> 
> Again, I know it's easier to maintain a single repo for upstream sources
> than to fetch all the resources from different upstream
> servers/locations. I know because we've done that in the past, and it
> was much less work to move the upstream sources to our repository and
> use this repo locally. Much faster builds, and no work to keep track of
> changes regarding downloads on upstream servers.
> I'm sometimes doing several rebuilds from scratch in a week, and it
> saved me a lot of time and pain.

I understand. Buildtool can already download via several transports:
http[s], ftp, viewcvs, etc.
I just need to create a new transport (git-archive) to retrieve the files
directly from SF.
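
Something along these lines should be enough for the new transport. This
is only a sketch: the ?format=raw URL scheme for SourceForge's code
browser is an assumption, and gitweb_url is a hypothetical helper name:

    use strict;
    use warnings;

    # Build a raw-file download URL from the <Server>/<File> settings
    # shown above (Name, Serverpath, Branch, Directory, file name).
    sub gitweb_url {
        my ($name, $serverpath, $branch, $directory, $file) = @_;
        return "http://$name/$serverpath/ci/$branch/tree/$directory/$file?format=raw";
    }

    print gitweb_url('sourceforge.net', 'p/leaf/sources-files',
                     'master', 'linux', 'linux-__KBRANCH__.tar.xz'), "\n";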

What name can we use for the repository?

Regards,
Yves
