Hi,

Sorry for the lag - though around Christmas, that should be expected :-).

On Mon, Dec 22, 2008 at 08:02, NightStrike <[email protected]> wrote:
> I am currently an admin of a sourceforge project, mingw-w64.  We have
> ported the gcc/binutils toolchain to Win64 to allow usage of a free
> compiler on Win64 platforms.  Most are already familiar with the mingw
> project -- this is similar, and supports both 32-bit and 64-bit
> platforms.
>
> Now, sourceforge is a great service (and hey, it's free), but I
> thought that Google Code might align more closely with our project goals.
> However, before doing anything, there are some issues that will
> definitely get in the way, and I wanted to do some feasibility
> analysis before doing anything.
>
> First and foremost is the project size.  We release complete
> toolchains, which range from 150 to 200 MB in size.  We make these
> snapshots daily for 5 different platforms.  That means about 1GB of
> files are added to the file release area every day.  Here is where I
> cannot complain in the slightest about sourceforge, as they do not
> impose any size restrictions on us.  Google will, however, as that is
> a lot of data.

Daily 1G snapshots would definitely be a problem, I think. I'm no
longer working with the project hosting team directly, but afaik the
download area is for permanent downloads: once it's up, you can't take
it down. The download area is designed for major releases that will
have a working URL "for ever", whereas you want transient snapshots
that get garbage collected. You would need to host the snapshots
elsewhere (maybe keep the sourceforge project for that purpose?).

For "normal" releases (eg. version numbers, not snapshots, expected to
be available indefinitely for archival purposes), the file size should
not be a problem. The default project settings would not work, but we
should be able to reconfigure yours to support 200-300M releases, with
a larger overall quota. However, for snapshots, I don't think that
project hosting at Google Code has a solution for you at this time.

- Dave

> To control the data amount, keeping things manageable (and to reduce
> the load on sourceforge), I cull the releases roughly at the two-month
> mark.  I move old releases to an "Old Releases" package, and every so
> often, I go in there and delete files that are just too old to be
> considered at all useful.  So at any given time, I think we have what
> is safe to describe as a "ginormous" amount of data that we are
> providing, and we are constantly deleting old toolchains that are
> fraught with errors.
>
> Those two methods of file releases (lots of data and deletion of old
> data) appear to be incompatible with the system that google has set
> up.  So to get this out of the way before even entertaining the
> possibility of using Google's Google Code service, are those two issues
> insurmountable?
>

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Hosting at Google Code" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to 
[email protected]
For more options, visit this group at 
http://groups.google.com/group/google-code-hosting?hl=en
-~----------~----~----~----~------~----~------~--~---