On 11/3/06, Steve Loughran <[EMAIL PROTECTED]> wrote:

Xavier Hanin wrote:

>
> For hosting the repository, if it can be hosted at apache it's great.
> There's also the solution to host it on svn as we do for the sandbox.

Apache infrastructure were explicit about making the Maven people use an
external host (ibiblio+mirrors), because it hosted non-ASF-licensed
JARs, and because of the load. Now, if it's only metadata then maybe it
could be hosted somewhere in Apache land.


The problem is that there are some jars on the current ivyrep, and some of them
are non-Apache jars. I don't really know if they are actually used; I should
try to have a look at our Apache logs to see. At least a commons vfs jar is
used by Ivy itself, because there is no release of commons vfs, hence we had
to depend on a nightly build we host on ivyrep.

We could always start with a
DNS entry, though of course you could presumably point
ivyrep.jayasoft.org at somewhere new, when the time comes. I'm just
worried about breaking working builds by moving stuff around, which is
why custom DNS hostnames for every SOAP or REST service are a must.


Yes, we will point to the new place when we know where we can host it.
But future versions of Ivy will have to use something else as the default
repository, to get rid of the jayasoft.org domain.

> Anyway, the current ivy repository is rather old, we tried to have the
> policy to add only validated stuff, but it takes too much time and we
> didn't manage to get a true community process to validate files.

I'd argue that Maven has a problem here too; they accept stuff without
enough validation, though work is underway to fix it, including some
Prolog-based dependency auditing I've been putting together.


Yes, tools could be used to validate at least that a file is well formed and
that the dependencies it declares actually exist in the repository. But a tool
able to identify that all required dependencies are declared is more difficult,
and a tool that knows whether most use cases are covered is almost impossible.
So it's very difficult to get really good metadata.
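
As a very rough sketch of the first kind of check, an Ant target could simply
try to resolve the candidate descriptor against the repository. The file name,
antlib namespace and attributes below are illustrative assumptions, not an
existing validation process:

<project name="validate-descriptor" default="check"
         xmlns:ivy="antlib:org.apache.ivy.ant">
  <target name="check">
    <!-- Illustrative check only: validate="true" checks the descriptor
         against the Ivy XSD, and the resolve fails if any declared
         dependency cannot be found in the configured repository. -->
    <ivy:resolve file="candidate/ivy.xml" validate="true"
                 haltonfailure="true"/>
  </target>
</project>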

> The problem if we put non validated files is that:
> - people can't be confident on the repository quality
> - you then have to modify files from time to time, and this is a very bad
> practice for people already using stuff in the repository.

That brings up one requirement I have, which is that by default <ivy>
should at least check for metadata updates once a week. A local cache is
just that: a cache that can be invalidated from time to time.


I personally don't like the fact that metadata can change over time, because
it breaks build reproducibility. But I understand that it's necessary in
some situations. What I think could be a good feature is to be able to give
a Time To Live (by analogy with DNS TTL) to metadata, and to be able to define
a default for this value (which could be one week if the community thinks
it's sensible). This would give people something in between the current
behaviour of always checking and never checking.
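
To make the idea concrete, an ivysettings fragment for such a feature might
look something like this. None of these elements or attributes exist in Ivy
today, and the organisation/module names are made up; it's purely a sketch of
the hypothetical TTL setting:

<ivysettings>
  <caches>
    <!-- hypothetical default: re-check cached metadata at most once a week -->
    <ttl duration="7d"/>
    <!-- hypothetical override for modules whose metadata is still moving -->
    <ttl organisation="apache" module="commons-vfs" duration="1d"/>
  </caches>
</ivysettings>
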
But I think we could go further with metadata updates, and allow only
changes which guarantee backward compatibility (and thus build
reproducibility). If, for example, when you update an ivy file you can only
deprecate configurations and introduce new ones, then almost all modules
depending on it wouldn't be impacted by the change without explicitly asking
for it (i.e. depending on the new configuration). This is not completely
true (the case where one depends on all configurations using the * wildcard
is a problem), but this could be improved with some good design. But once
again I'm going outside the scope of this thread, and this is a subject which
would deserve its own thread.
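
To illustrate, a backward compatible update of an ivy file could look like the
fragment below. The module and configuration names are made up, and the exact
form of the deprecated attribute is an assumption; the point is that the
existing configuration is only flagged as deprecated while a new one is
introduced, so modules already depending on it keep resolving as before:

<ivy-module version="1.0">
  <info organisation="example" module="somelib" revision="1.2"/>
  <configurations>
    <!-- kept as-is, only flagged as deprecated: existing dependents
         are not impacted unless they explicitly switch -->
    <conf name="default" deprecated="2006-11-03"
          description="superseded by 'runtime'"/>
    <!-- new configuration introduced by the update -->
    <conf name="runtime" description="runtime dependencies only"/>
  </configurations>
</ivy-module>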

> But I think this could be the object of a discussion on its own, and is not
> directly related to the repository hosting question.

Yes, and something to discuss with the repository@ people.


