On Sunday, 20 July 2014 at 19:22:18, Yves Blusseau wrote:
> On 20 Jul 2014, at 19:04, kp kirchdoerfer <kap...@users.sourceforge.net> wrote:
> > On Sunday, 20 July 2014 at 18:40:15, Yves Blusseau wrote:
> >> On 20 Jul 2014, at 18:22, kp kirchdoerfer <kap...@users.sourceforge.net> wrote:
> >>> Hi Yves;
> >>>
> >>> On Sunday, 20 July 2014 at 18:09:00, Yves Blusseau wrote:
> >>>> Hi all,
> >>>>
> >>>> Currently the size of the git repository (I'm speaking of the
> >>>> "database", not the checkout) is about 3.5GB. It's really too big.
> >>>> The problem is that the "database" contains ALL the versions of the
> >>>> "tar source files", so anyone who needs to clone our repository has
> >>>> to download at least 3.5GB of data.
> >>>>
> >>>> What I propose is to remove all the external source tar files from
> >>>> the history and put them (at least the latest ones) in another git
> >>>> repository on SF. The source tar files would then be downloaded (if
> >>>> needed) from this new git repository (using the http protocol). With
> >>>> this, the bering-uclibc repository will contain only text files and
> >>>> some patches, and I think it will take only a few MB. We will
> >>>> continue to create a sources.tgz file when we release a new version
> >>>> to meet the requirements of SF.
> >>>>
> >>>> What do you think about this idea? If you are OK with it, I can set
> >>>> up the process to "clean" the repository and update the
> >>>> buildtool.cfg files to change the repo from local to SF.
> >>>
> >>> The way I work is to copy the current repo to a local directory and
> >>> use that as the main server:
> >>>
> >>> <Server localrepo>
> >>>     Type = filesymlnk
> >>>     Serverpath = repo
> >>> </Server>
> >>>
> >>> Whatever I do in the local directory won't clash with changes in git,
> >>> and vice versa I will not break git :)
> >>>
> >>> So if we do have two repos (one for the sources, one for the
> >>> buildtool.* files, patches etc.), I'd prefer that buildtool still be
> >>> able to use local directories (in the first place) and only download
> >>> from SF if sources are missing.
> >>
> >> The simplest approach is to declare where to get the tar files with
> >> something like:
> >>
> >> <Server sourceforge>
> >>     Type = http
> >>     Serverpath = xxxx
> >> </Server>
> >>
> >> buildtool will then download a file only if it has not already been
> >> downloaded into the repo directory. So if the tar files are already in
> >> the repo directory, you can build all the lrp packages offline.
> >
> > That sounds good.
> >
> >> For sources.tgz you only have to change the definition of Server
> >> sourceforge to:
> >>
> >> <Server sourceforge>
> >>     Type = filesymlnk
> >>     Serverpath = repo
> >> </Server>
> >>
> >> because all the files are already in sources.tgz.
> >
> > Just to understand:
> >
> > Is "sources.tgz" meant as an example like "linux-3.10.47.tar.gz", or do
> > you refer to a tgz file that contains *all* sources in one file?

By "sources.tgz" I mean a tgz file that contains *all* the sources in one
file.
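The "download only if not already in the repo directory" behaviour described
above could be sketched roughly as follows. This is a minimal illustrative
shell sketch, not actual buildtool code; `fetch_source`, `REPO_DIR` and
`BASE_URL` are hypothetical names, and the URL is a placeholder:

```shell
#!/bin/sh
# Sketch of the fetch-if-missing logic: use the local copy in the repo
# directory when present, otherwise download it from the http server.
REPO_DIR="${REPO_DIR:-repo}"
BASE_URL="${BASE_URL:-http://example.org/sources}"   # placeholder URL

fetch_source() {
    file="$1"
    if [ -f "$REPO_DIR/$file" ]; then
        # Already present locally: no network access needed, offline
        # builds keep working.
        echo "using local copy of $file"
    else
        echo "downloading $file"
        wget -q -O "$REPO_DIR/$file" "$BASE_URL/$file"
    fi
}
```

With this shape, switching the Server definition between `http` and
`filesymlnk` only changes where a missing file comes from; files already in
the repo directory are never re-downloaded.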
And how do we maintain that file if we only upgrade one source, like the
kernel? Repackage sources.tgz and upload a huge file?

I understood your proposal as: we maintain the sources in a separate git
repo and use it to package everything into one file to meet the SF
requirements. From this repo we download sources if they are not in the
local directory; otherwise we can build from a local directory/repo.

I'm thinking of buildtool logic that checks whether a source is available
locally, and otherwise downloads it from the git repo.

kp

_______________________________________________
leaf-devel mailing list
leaf-devel@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/leaf-devel