Re: auto download of antlibs
On 5/10/07, Steve Loughran <[EMAIL PROTECTED]> wrote:
> Stephen McConnell wrote:
> > It seems that there may be two distinct subjects in this thread:
> >
> > a) introduction of a policy that restricts dynamic resolution of resources to those available via a local file protocol (refer Xavier's comments "By offline I mean with no network access")
> >
> > b) introduction of a policy that restricts dynamic resolution of resources to a selection of 'safe'(?) repositories
> >
> > The first scenario correctly reflects the offline notion while the second scenario does not have any relationship to the term. However, the second scenario does start to recognize that the physical topology of a machine is not equivalent to the definition of a policy.
>
> "offline" is maybe not the correct term. "partitioned" is more accurate. When the internet goes from our site, ibiblio is missing but a local repository is reachable. When the network goes from my laptop, only localhost and the VMware-hosted machines are available. My laptop may still use ssh and mounted filesystem protocols to see the system, but nothing else. Switching on file I/O vs. network I/O doesn't cut it, because NFS and network-mounted DAV filesystems may be on the wrong side of the partition.
>
> Like you say, it depends on network topologies, but I don't want to introduce the concept of a partitioned network, as it scares people. Unless we hide it under network "configurations", where different configurations can have different proxy and repository options. That is a more realistic world view of how my laptop acts.

Yes, and if we introduce conditional enabling of dependency resolvers in Ivy (disabled resolvers would still use the cache), this is something that could easily be done by users (at least if they manually switched from one network configuration to another). Note that you can already do that in Ivy by switching your settings, but it would be easier with conditional resolver enablement.
Xavier

> Then I can use DNS and WLAN ID analysis to determine the active configuration; this is something better done in C++ than in Java.
>
> -steve

- To unsubscribe, e-mail: [EMAIL PROTECTED] For additional commands, e-mail: [EMAIL PROTECTED]

--
Xavier Hanin - Independent Java Consultant
Manage your dependencies with Ivy!
http://incubator.apache.org/ivy/
Re: auto download of antlibs
Stephen McConnell wrote:
> It seems that there may be two distinct subjects in this thread:
>
> a) introduction of a policy that restricts dynamic resolution of resources to those available via a local file protocol (refer Xavier's comments "By offline I mean with no network access")
>
> b) introduction of a policy that restricts dynamic resolution of resources to a selection of 'safe'(?) repositories
>
> The first scenario correctly reflects the offline notion while the second scenario does not have any relationship to the term. However, the second scenario does start to recognize that the physical topology of a machine is not equivalent to the definition of a policy.

"offline" is maybe not the correct term. "partitioned" is more accurate. When the internet goes from our site, ibiblio is missing but a local repository is reachable. When the network goes from my laptop, only localhost and the VMware-hosted machines are available. My laptop may still use ssh and mounted filesystem protocols to see the system, but nothing else. Switching on file I/O vs. network I/O doesn't cut it, because NFS and network-mounted DAV filesystems may be on the wrong side of the partition.

Like you say, it depends on network topologies, but I don't want to introduce the concept of a partitioned network, as it scares people. Unless we hide it under network "configurations", where different configurations can have different proxy and repository options. That is a more realistic world view of how my laptop acts. Then I can use DNS and WLAN ID analysis to determine the active configuration; this is something better done in C++ than in Java.

-steve
Re: auto download of antlibs
On 5/10/07, Steve Loughran <[EMAIL PROTECTED]> wrote:
> Xavier Hanin wrote:
> > On 5/9/07, Steve Loughran <[EMAIL PROTECTED]> wrote:
> >> Maybe every repository in ivyconf.xml would be marked as offline, meaning they are available when there is no network. When you run ant (or ivy) with -offline, only offline repositories would be used.
> >
> > What do you mean? If all repositories are marked as offline, there is no added value. We are currently reviewing our cache management, but for the moment Ivy can already use the cache when you are offline. The problem is that if you ask for a latest version of something, Ivy will try to connect to the repository. For the moment this fails if the repository is not available, but we are planning to make it possible to use cache only in this case. But even with this improvement, trying to connect to a non-available repository may take time, so the idea of an offline mode would be to say: "do not use this repository when offline, use only cache" for repositories requiring a network connection, and for those which do not require this connection (like a local repo), continue to use the repo and not only the cache.
> >
> > - xavier
>
> I think I mean you could mark a repository (such as a filesystem, ssh or http repo) as available when a system is offline. When there's an offline build, the stuff in cache is always there, but you'd only hit those repositories marked as available offline.

OK, I think I had a similar idea, but thinking about it the other way around: repositories would have an online property; when set to true the repository requires a network connection, when false it is available offline. By default an http repository would be considered online and a filesystem one not, but you could change that. But now I think using an offline property would be better understood by users, especially because it matches the idea of the offline mode.
> the other trick is to hit every HTTP repo with a GET request on startup, but that can be misleading. The ibiblio root is dog slow, and proxy servers can lie, returning an old copy even when the repository is missing.

Yes, this is not easy to implement and can be very misleading. But maybe we could look at the problem of disabling a repository (as discussed for the offline mode) from a more generic point of view, and be able to enable/disable repositories based on a property value. The property could be ant.build.offline, but it could be anything else. Then the user could pretty easily define his own property when he knows that a particular repository is not available, to disable only one repo.

Xavier

-steve

--
Xavier Hanin - Independent Java Consultant
Manage your dependencies with Ivy!
http://incubator.apache.org/ivy/
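For illustration, a minimal sketch of the property-driven enable/disable idea being discussed might look like the following. The offline attribute is the proposal from this thread, not an existing Ivy setting, and the resolver names and patterns are made up:

```xml
<!-- Hypothetical ivyconf.xml sketch: an "offline" attribute marking which
     resolvers remain usable without network access. The attribute is a
     thread proposal, not a real Ivy setting of the time. -->
<ivyconf>
  <resolvers>
    <!-- requires the network: would be skipped when ant.build.offline is set -->
    <ibiblio name="public" offline="false"/>
    <!-- local filesystem repo: still consulted when offline -->
    <filesystem name="local" offline="true">
      <ivy pattern="${ivy.local.root}/[module]/ivy-[revision].xml"/>
      <artifact pattern="${ivy.local.root}/[module]/[artifact]-[revision].[ext]"/>
    </filesystem>
  </resolvers>
</ivyconf>
```

The generic form discussed above would replace the boolean with an arbitrary property test, so a user could disable a single unreachable repository without going fully offline.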
RE: auto download of antlibs
> -Original Message- > From: Steve Loughran [mailto:[EMAIL PROTECTED] > Sent: Thursday, 10 May 2007 8:20 PM > To: Ant Developers List > Subject: Re: auto download of antlibs > > Xavier Hanin wrote: > > On 5/9/07, Steve Loughran <[EMAIL PROTECTED]> wrote: > teresting. > >> > >> Maybe every repository in ivyconf.xml would be marked as offline, > >> meaning they are available when there is no network.When > you run ant > >> (or > >> ivy) with -offline, only offline repositories would be used. > > > > > > What do you mean? If all repositories are marked as > offline, there is > > no added value. We are currently reviewing our cache > management, but > > for the moment Ivy can already use the cache when you are > offline. The > > problem is that if you ask for a latest version of > something, Ivy will > > try to connect to the repository. For the moment this fails if the > > repository is not available, but we are planning to make it > possible > > to use cache only in this case. But even with this > improvement, trying > > to connect to a non available repository may take time, so > the idea of > > an offline mode would be to say: > > "do not use this repository when offline, use only cache" for > > repositories requiring a network connection, and for those which do > > not require this connection (like a local repo), continue > to use the > > repo and not only the cache. > > > > - xavier > > I think I mean you could mark a repository (such as an a > filesystem, ssh or http repo) as available when a system is > offline. When there's an offline build, the stuff in cache is > always there, but you'd only hit those repositories marked as > available offline. 
It seems that there may be two distinct subjects in this thread:

a) introduction of a policy that restricts dynamic resolution of resources to those available via a local file protocol (refer Xavier's comments "By offline I mean with no network access")

b) introduction of a policy that restricts dynamic resolution of resources to a selection of 'safe'(?) repositories

The first scenario correctly reflects the offline notion while the second scenario does not have any relationship to the term. However, the second scenario does start to recognize that the physical topology of a machine is not equivalent to the definition of a policy.

Cheers, Steve.

--
Stephen McConnell
mailto:[EMAIL PROTECTED]
http://www.dpml.net
Re: auto download of antlibs
Xavier Hanin wrote:
> On 5/9/07, Steve Loughran <[EMAIL PROTECTED]> wrote:
>> Maybe every repository in ivyconf.xml would be marked as offline, meaning they are available when there is no network. When you run ant (or ivy) with -offline, only offline repositories would be used.
>
> What do you mean? If all repositories are marked as offline, there is no added value. We are currently reviewing our cache management, but for the moment Ivy can already use the cache when you are offline. The problem is that if you ask for a latest version of something, Ivy will try to connect to the repository. For the moment this fails if the repository is not available, but we are planning to make it possible to use cache only in this case. But even with this improvement, trying to connect to a non-available repository may take time, so the idea of an offline mode would be to say: "do not use this repository when offline, use only cache" for repositories requiring a network connection, and for those which do not require this connection (like a local repo), continue to use the repo and not only the cache.
>
> - xavier

I think I mean you could mark a repository (such as a filesystem, ssh or http repo) as available when a system is offline. When there's an offline build, the stuff in cache is always there, but you'd only hit those repositories marked as available offline.

the other trick is to hit every HTTP repo with a GET request on startup, but that can be misleading. The ibiblio root is dog slow, and proxy servers can lie, returning an old copy even when the repository is missing.

-steve
Re: auto download of antlibs
On 5/9/07, Steve Loughran <[EMAIL PROTECTED]> wrote:
> Xavier Hanin wrote:
> > On 5/8/07, Steve Loughran <[EMAIL PROTECTED]> wrote:
> >> Xavier Hanin wrote:
> >>> On 5/7/07, Steve Loughran <[EMAIL PROTECTED]> wrote:
> >>>> hooking in to a named ivy conf:
> >>>
> >>> And where is the version information? And how do we map this package name to an organization/module name couple? What do you think of providing all information: org="org.example" module="example" rev="1.3" conf="example" />
> >>
> >> I'd expect all version info to be in ivy.xml; when I declare a configuration in the declaration, I say which ivy configuration I want, without any version info embedded in the build files
> >
> > And where is the ivy.xml?
>
> alongside the build.xml, of course. I'm assuming that has already kicked in.

So you have to call ivy:resolve manually if you use multiple antlibs (to name the ivy.xml files). I'm not sure having to use a separate ivy.xml for each antlib makes much sense, since you will usually have only one dependency in your ivy.xml. I think the inline mode (where you specify the dependency when calling the task, with no ivy file) should at least be an option. If we really want to put this information out of the antlib task, I think it isn't really that different from what is possible today. Am I missing something?

Xavier

--
Xavier Hanin - Independent Java Consultant
Manage your dependencies with Ivy!
http://incubator.apache.org/ivy/
Re: auto download of antlibs
On 5/9/07, Steve Loughran <[EMAIL PROTECTED]> wrote:
> Xavier Hanin wrote:
> > On 5/7/07, Stephen McConnell <[EMAIL PROTECTED]> wrote:
> >> > -Original Message-
> >> > From: Xavier Hanin [mailto:[EMAIL PROTECTED]
> >> > Sent: Friday, 4 May 2007 5:56 PM
> >> > To: Ant Developers List
> >> > Subject: Re: auto download of antlibs
> >> >
> >> > On 5/4/07, Steve Loughran <[EMAIL PROTECTED]> wrote:
> >> > > One thing I've been thinking of this week is how could we work with Ivy for automatic antlib download.
> >> > >
> >> > > No code right now, just some thoughts
> >> > >
> >> > > 1. add a -offline argument to say "we are offline". this will set a property, (ant.build.offline) and the test will work. It is meant to tell things like Ivy that we are offline. At some point we could add some way for Ant to guess whether the net is there or not, if java integrates with the OS properly (there is an API call for this in J2ME, just not Java SE)
> >> >
> >> > This makes me think that we could improve how Ivy deals with online/offline mode. Indeed for the moment Ivy doesn't really know which repository requires a network access and which doesn't. It would be nice if Ivy would know that, so that even in offline mode you could still use a local repository (for antlib testing for instance).
> >>
> >> Are you describing a policy at the level of:
> >> a) a multi-project build decision?
> >> b) a specific target project build decision?
> >> c) a repository access decision?
> >> d) some or any of the above?
> >
> > I'm describing how Ivy could be improved to better deal with a situation where the user does not have network access. You can already deal with that pretty efficiently in Ivy, but you have to define several settings, to avoid using remote access when it's not available. Having something easier for the user in this common situation would be interesting.
> Maybe every repository in ivyconf.xml would be marked as offline, meaning they are available when there is no network. When you run ant (or ivy) with -offline, only offline repositories would be used.

What do you mean? If all repositories are marked as offline, there is no added value. We are currently reviewing our cache management, but for the moment Ivy can already use the cache when you are offline. The problem is that if you ask for a latest version of something, Ivy will try to connect to the repository. For the moment this fails if the repository is not available, but we are planning to make it possible to use cache only in this case. But even with this improvement, trying to connect to a non-available repository may take time, so the idea of an offline mode would be to say: "do not use this repository when offline, use only cache" for repositories requiring a network connection, and for those which do not require this connection (like a local repo), continue to use the repo and not only the cache.

- xavier

--
Xavier Hanin - Independent Java Consultant
Manage your dependencies with Ivy!
http://incubator.apache.org/ivy/
Re: auto download of antlibs
Stefan Bodewig wrote:
> On Mon, 07 May 2007, Steve Loughran <[EMAIL PROTECTED]> wrote:
>> You'd have to include a version. One thing you could do is lib:xmlns="antlib://org/example/something#2.13" ...but that would place the version into the namespace, which is too early to read in/expand ant properties, and you'd have to update the xmlns declaration everywhere you used it...that's a no-no in a big project.
>
> True. Apart from the idea to use the not-exactly-automatic approach you describe (use a typedef instead of a namespace alone) another option would be a level of indirection. Something Xavier suggested last week.
>
> xmlns="antlib:ivy://org.apache.antlibs/antunit#integration"
>
> could trigger a lookup of an ivy.xml file and we'll use the integration configuration of the artifact antunit in the org.apache.antlibs organization (making up names here).

Oh, so the ivy.xml file (or ivyconf.xml for big projects) would list the mappings. That's a nice idea.
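To make the indirection concrete, a build file using the suggested scheme might look like this. The antlib:ivy URI syntax and every name in it are invented in the thread (Stefan's "making up names here" applies equally below); version information would live in the ivy.xml the namespace points to, not in the build file:

```xml
<!-- Sketch of the proposed namespace indirection; nothing here is a real
     Ant or Ivy feature. Resolving the xmlns would trigger an Ivy lookup of
     the "integration" conf of the antunit module in org.apache.antlibs,
     download its artifacts, and then load the antlib into the namespace. -->
<project name="demo"
         xmlns:au="antlib:ivy://org.apache.antlibs/antunit#integration">
  <target name="test">
    <!-- first use of an au: element would kick off the resolution -->
    <au:antunit>
      <!-- test declarations would go here -->
    </au:antunit>
  </target>
</project>
```

The attraction over putting the version in the namespace itself (the antlib://org/example/something#2.13 idea above) is that upgrading a dependency only touches the mapping file, not every xmlns declaration in a big project.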
Re: auto download of antlibs
Xavier Hanin wrote:
> On 5/8/07, Steve Loughran <[EMAIL PROTECTED]> wrote:
>> Xavier Hanin wrote:
>>> On 5/7/07, Steve Loughran <[EMAIL PROTECTED]> wrote:
>>>> hooking in to a named ivy conf:
>>>
>>> And where is the version information? And how do we map this package name to an organization/module name couple? What do you think of providing all information: org="org.example" module="example" rev="1.3" conf="example" />
>>
>> I'd expect all version info to be in ivy.xml; when I declare a configuration in the declaration, I say which ivy configuration I want, without any version info embedded in the build files
>
> And where is the ivy.xml?

alongside the build.xml, of course. I'm assuming that has already kicked in.
Re: auto download of antlibs
Xavier Hanin wrote:
> On 5/7/07, Stephen McConnell <[EMAIL PROTECTED]> wrote:
>> > -Original Message-
>> > From: Xavier Hanin [mailto:[EMAIL PROTECTED]
>> > Sent: Friday, 4 May 2007 5:56 PM
>> > To: Ant Developers List
>> > Subject: Re: auto download of antlibs
>> >
>> > On 5/4/07, Steve Loughran <[EMAIL PROTECTED]> wrote:
>> > > One thing I've been thinking of this week is how could we work with Ivy for automatic antlib download.
>> > >
>> > > No code right now, just some thoughts
>> > >
>> > > 1. add a -offline argument to say "we are offline". this will set a property, (ant.build.offline) and the test will work. It is meant to tell things like Ivy that we are offline. At some point we could add some way for Ant to guess whether the net is there or not, if java integrates with the OS properly (there is an API call for this in J2ME, just not Java SE)
>> >
>> > This makes me think that we could improve how Ivy deals with online/offline mode. Indeed for the moment Ivy doesn't really know which repository requires a network access and which doesn't. It would be nice if Ivy would know that, so that even in offline mode you could still use a local repository (for antlib testing for instance).
>>
>> Are you describing a policy at the level of:
>> a) a multi-project build decision?
>> b) a specific target project build decision?
>> c) a repository access decision?
>> d) some or any of the above?
>
> I'm describing how Ivy could be improved to better deal with a situation where the user does not have network access. You can already deal with that pretty efficiently in Ivy, but you have to define several settings, to avoid using remote access when it's not available. Having something easier for the user in this common situation would be interesting.

Maybe every repository in ivyconf.xml would be marked as offline, meaning they are available when there is no network. When you run ant (or ivy) with -offline, only offline repositories would be used.
Re: auto download of antlibs
On Mon, 07 May 2007, Steve Loughran <[EMAIL PROTECTED]> wrote:
> You'd have to include a version. One thing you could do is lib:xmlns="antlib://org/example/something#2.13" ...but that would place the version into the namespace, which is too early to read in/expand ant properties, and you'd have to update the xmlns declaration everywhere you used it...that's a no-no in a big project.

True. Apart from the idea to use the not-exactly-automatic approach you describe (use a typedef instead of a namespace alone) another option would be a level of indirection. Something Xavier suggested last week.

xmlns="antlib:ivy://org.apache.antlibs/antunit#integration"

could trigger a lookup of an ivy.xml file and we'll use the integration configuration of the artifact antunit in the org.apache.antlibs organization (making up names here).

Stefan
Re: auto download of antlibs
On 5/8/07, Steve Loughran <[EMAIL PROTECTED]> wrote:
> Xavier Hanin wrote:
> > On 5/7/07, Steve Loughran <[EMAIL PROTECTED]> wrote:
> >> hooking in to a named ivy conf:
> >
> > And where is the version information? And how do we map this package name to an organization/module name couple? What do you think of providing all information: org="org.example" module="example" rev="1.3" conf="example" />
>
> I'd expect all version info to be in ivy.xml; when I declare a configuration in the declaration, I say which ivy configuration I want, without any version info embedded in the build files

And where is the ivy.xml?

> >> The trick here would be to make it a no-op if there was already an antlib defined into the namespace.
> >
> > Yes, would be a nice trick.
>
> that's what we would have to add above what is there today.
>
> >> I'm also thinking of a resource that lets you declare a path inline
> >
> > Is it a resource or a resource collection? I'm not familiar with the Resource API yet... Moreover, where do the module information (org/module/rev) come from? Shouldn't we provide them? As a side note, it's very similar to the current ivy:cachepath task. The main difference is that ivy:cachepath is a task, not a resource. But to be a resource I think we'd need some kind of lifecycle management for resources.
>
> class Resource extends ResourceCollection :)
>
> I do think a resource version of cachepath is exactly what we want. We don't need a lifecycle for resources either, provided the resource can track whether it has resolved (or failed to resolve) yet. it just does a resolution the first time it's needed (this is how filepaths work)

This shouldn't be too difficult to handle. The most difficult part for us is that this is specific to ant 1.7, so we will have a part of our code base specific to 1.7, and the rest compatible with ant 1.6, so we will have to be very careful not to introduce an ant 1.7 dependency in the rest of the code base.
- Xavier

-steve

--
Xavier Hanin - Independent Java Consultant
Manage your dependencies with Ivy!
http://incubator.apache.org/ivy/
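As a rough sketch of the inline-resource idea discussed above: an Ivy-backed element usable directly inside a path, resolving lazily the first time the path is used. The ivy:dependencies element name and its attributes are hypothetical, echoing the org/module/rev/conf attributes from the thread; only path and javac below are standard Ant:

```xml
<!-- Hypothetical "resource version of cachepath": the nested element would
     resolve the named module through Ivy on first use of the path and
     contribute the cached artifacts as path entries. Not a real Ivy task. -->
<path id="example.classpath">
  <ivy:dependencies org="org.example" module="example"
                    rev="1.3" conf="example"/>
</path>

<!-- nothing is resolved until a task actually walks the path -->
<javac srcdir="src" destdir="build"
       classpathref="example.classpath"/>
```

This matches how Steve describes filepaths working: no lifecycle, just a one-shot resolution recorded on first access.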
Re: auto download of antlibs
On 5/7/07, Stephen McConnell <[EMAIL PROTECTED]> wrote:
> > -Original Message-
> > From: Xavier Hanin [mailto:[EMAIL PROTECTED]
> > Sent: Friday, 4 May 2007 5:56 PM
> > To: Ant Developers List
> > Subject: Re: auto download of antlibs
> >
> > On 5/4/07, Steve Loughran <[EMAIL PROTECTED]> wrote:
> > > One thing I've been thinking of this week is how could we work with Ivy for automatic antlib download.
> > >
> > > No code right now, just some thoughts
> > >
> > > 1. add a -offline argument to say "we are offline". this will set a property, (ant.build.offline) and the test will work. It is meant to tell things like Ivy that we are offline. At some point we could add some way for Ant to guess whether the net is there or not, if java integrates with the OS properly (there is an API call for this in J2ME, just not Java SE)
> >
> > This makes me think that we could improve how Ivy deals with online/offline mode. Indeed for the moment Ivy doesn't really know which repository requires a network access and which doesn't. It would be nice if Ivy would know that, so that even in offline mode you could still use a local repository (for antlib testing for instance).
>
> Are you describing a policy at the level of:
> a) a multi-project build decision?
> b) a specific target project build decision?
> c) a repository access decision?
> d) some or any of the above?

I'm describing how Ivy could be improved to better deal with a situation where the user does not have network access. You can already deal with that pretty efficiently in Ivy, but you have to define several settings, to avoid using remote access when it's not available. Having something easier for the user in this common situation would be interesting.

> > > 2. when we encounter an element (or even an attr) in an unknown antlib xmlns, and we want to map that to a projectcomponent, we hand off resolution to an antlib resolver.
> > > We would have one built in (the failing resolver), would default to the ivy one if it was present, and provide some way to let people switch to a different one.
> >
> > This sounds like a good idea.
>
> > > 3. an antlib resolver would do the mapping from antlib package to artifacts (problem one),
> >
> > Yes, and note that we have to consider the version too.
>
> If you assume you are keying off a url, then no .. In such a scenario you can bring things back to the url protocol handler and delegate the problem to the handler. For example I may want to assert any of the following:
>
> a) a specific version artifact
> b) the latest version of an artifact
> c) an artifact with a versioned constraint range
>
> Using Metro/Depot/Transit the following may be equivalent:
>
> artifact:jar:org/apache/ant/ant#1.7
> link:jar:org/apache/ant/ant
>
> The first url references an absolute version, the second is like a symlink (typically referencing the latest version).

Fine, so you consider the version.

> > > then download the metadata for that artifact, pull it down and all its artifacts
> > >
> > > 4. we would then the lib with the classpath that is set up by the resolver
> >
> > One question here: is it the responsibility of the resolver to keep artifacts in a cache, or put artifacts in an Ant-managed cache?
>
> Isn't that an implementation decision?

I don't think so. If Ant provides a pluggable mechanism for automatic download of antlibs, it will have to define if once the files are downloaded they are under Ant control, or stay under control of the plugin.

> > This is important to specify how things are going in offline mode. Ivy already has pretty good support for offline mode, and I think we could improve it (see above). But this is important to consider when specifying the role of the antlib resolver.
>
> What do you mean by offline?

By offline I mean with no network access.
> Typically this subject is about policy on resource resolution which is not simply a question of establishing a remote connection. Are we making assumptions about cache content? Is the cache a trusted repository?
>
> > > we'd need a metadata tree mapping antlibs to well known packages, but that is not too hard. JSON, perhaps.
> >
> > Not too hard, except maybe for version information. I'm not sure that it would be really nice to get the latest version by default, making the build system automatically updated is not necessarily a good idea (at least users have to keep very good control over that).
>
> Yep - basically you're describing the policy you want to apply with respect to artifact resolution. If it's absolute versioning are you assuming Dewey versioning? If it's latest, do you mean latest build, latest stable build or latest released build?
Re: auto download of antlibs
Xavier Hanin wrote:
> On 5/7/07, Steve Loughran <[EMAIL PROTECTED]> wrote:
>> hooking in to a named ivy conf:
>
> And where is the version information? And how do we map this package name to an organization/module name couple? What do you think of providing all information:

I'd expect all version info to be in ivy.xml; when I declare a configuration in the declaration, I say which ivy configuration I want, without any version info embedded in the build files

>> The trick here would be to make it a no-op if there was already an antlib defined into the namespace.
>
> Yes, would be a nice trick.

that's what we would have to add above what is there today.

>> I'm also thinking of a resource that lets you declare a path inline
>
> Is it a resource or a resource collection? I'm not familiar with the Resource API yet... Moreover, where do the module information (org/module/rev) come from? Shouldn't we provide them? As a side note, it's very similar to the current ivy:cachepath task. The main difference is that ivy:cachepath is a task, not a resource. But to be a resource I think we'd need some kind of lifecycle management for resources.

class Resource extends ResourceCollection :)

I do think a resource version of cachepath is exactly what we want. We don't need a lifecycle for resources either, provided the resource can track whether it has resolved (or failed to resolve) yet. it just does a resolution the first time it's needed (this is how filepaths work)

-steve
Re: auto download of antlibs
Stephen McConnell wrote:
> -Original Message-
> From: Stefan Bodewig [mailto:[EMAIL PROTECTED]
>
> >> 3. an antlib resolver would do the mapping from antlib package to artifacts (problem one),
> >
> > actually a pretty big problem.
>
> Which may explain why two JSRs are dealing with the subject:
>
> http://jcp.org/en/jsr/detail?id=294
> http://jcp.org/en/jsr/detail?id=277

Gosh, two! Aren't we lucky!

-steve
RE: auto download of antlibs
> -Original Message-
> From: Stefan Bodewig [mailto:[EMAIL PROTECTED]
>
> > 3. an antlib resolver would do the mapping from antlib package to artifacts (problem one),
>
> actually a pretty big problem.

Which may explain why two JSRs are dealing with the subject:

http://jcp.org/en/jsr/detail?id=294
http://jcp.org/en/jsr/detail?id=277

Cheers, Steve.

--
Stephen McConnell
mailto:[EMAIL PROTECTED]
http://www.dpml.net
RE: auto download of antlibs
> -Original Message-
> From: Xavier Hanin [mailto:[EMAIL PROTECTED]
> Sent: Friday, 4 May 2007 5:56 PM
> To: Ant Developers List
> Subject: Re: auto download of antlibs
>
> On 5/4/07, Steve Loughran <[EMAIL PROTECTED]> wrote:
> > One thing I've been thinking of this week is how could we work with Ivy for automatic antlib download.
> >
> > No code right now, just some thoughts
> >
> > 1. add a -offline argument to say "we are offline". this will set a property, (ant.build.offline) and the test will work. It is meant to tell things like Ivy that we are offline. At some point we could add some way for Ant to guess whether the net is there or not, if java integrates with the OS properly (there is an API call for this in J2ME, just not Java SE)
>
> This makes me think that we could improve how Ivy deals with online/offline mode. Indeed for the moment Ivy doesn't really know which repository requires a network access and which doesn't. It would be nice if Ivy would know that, so that even in offline mode you could still use a local repository (for antlib testing for instance).

Are you describing a policy at the level of:

a) a multi-project build decision?
b) a specific target project build decision?
c) a repository access decision?
d) some or any of the above?

> > 2. when we encounter an element (or even an attr) in an unknown antlib xmlns, and we want to map that to a projectcomponent, we hand off resolution to an antlib resolver. We would have one built in (the failing resolver), would default to the ivy one if it was present, and provide some way to let people switch to a different one.
>
> This sounds like a good idea.
>
> > 3. an antlib resolver would do the mapping from antlib package to artifacts (problem one),
>
> Yes, and note that we have to consider the version too.

If you assume you are keying off a url, then no ..
In such a scenario you can bring things back to the url protocol handler and delegate the problem to the handler. For example I may want to assert any of the following: a) a specific version of an artifact b) the latest version of an artifact c) an artifact within a version constraint range Using Metro/Depot/Transit the following may be equivalent: artifact:jar:org/apache/ant/ant#1.7 link:jar:org/apache/ant/ant The first url references an absolute version, the second is like a symlink (typically referencing the latest version). > > > then download the metadata for that artifact, pull it down > > and all its artifacts > > > > 4. we would then load the lib with the classpath that > > is set up by the resolver > > One question here: is it the responsibility of the resolver > to keep artifacts in a cache, or put artifacts in an Ant > managed cache? Isn't that an implementation decision? > This is important to specify how things are > going in offline mode. Ivy already has pretty good support > for offline mode, and I think we could improve it (see > above). But this is important to consider when specifying the > role of the antlib resolver. What do you mean by offline? Typically this subject is about policy on resource resolution, which is not simply a question of establishing a remote connection. Are we making assumptions about cache content? Is the cache a trusted repository? > > > > we'd need a metadata tree mapping antlibs to well known > packages, but > > that is not too hard. JSON, perhaps. > Not too hard, except maybe for version information. I'm not > sure that it would be really nice to get the latest version > by default; making the build system automatically update is > not necessarily a good idea (at least users have to keep very > good control over that). Yep - basically you're describing the policy you want to apply with respect to artifact resolution. If it's absolute versioning, are you assuming Dewey versioning?
If it's latest, do you mean latest build, latest stable build, or latest released build? Cheers, Steve. -- Stephen McConnell mailto:[EMAIL PROTECTED] http://www.dpml.net
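Stephen's two URI forms above can be decomposed mechanically. A minimal sketch in Java of how a handler might split such a URI into its parts; the class below is illustrative only, not the DPML/Transit implementation:

```java
// Decomposes URIs of the forms discussed above:
//   artifact:jar:org/apache/ant/ant#1.7  (absolute version)
//   link:jar:org/apache/ant/ant          (symlink-style, no version)
public class ArtifactUri {
    public final String scheme;  // "artifact" or "link"
    public final String type;    // e.g. "jar"
    public final String path;    // e.g. "org/apache/ant/ant"
    public final String version; // null for link-style URIs

    public ArtifactUri(String uri) {
        int c1 = uri.indexOf(':');
        int c2 = uri.indexOf(':', c1 + 1);
        scheme = uri.substring(0, c1);
        type = uri.substring(c1 + 1, c2);
        String rest = uri.substring(c2 + 1);
        int hash = rest.indexOf('#');
        path = hash < 0 ? rest : rest.substring(0, hash);
        version = hash < 0 ? null : rest.substring(hash + 1);
    }
}
```

With a link-style URI the version comes back null, which is where the "latest build vs latest released build" policy question would have to be answered by the handler.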
RE: auto download of antlibs
> -Original Message- > From: Steve Loughran [mailto:[EMAIL PROTECTED] > Sent: Friday, 4 May 2007 5:27 PM > To: Ant Developers List > Subject: auto download of antlibs > > > One thing I've been thinking of this week is how could we > work with Ivy for automatic antlib download. > > No code right now, just some thoughts > > > 1. add a -offline argument to say "we are offline". this will > set a property, (ant.build.offline) and the test > will work. It is meant to tell things like Ivy that we are > offline. At some point we could add some way for Ant to guess > whether the net is there or not, if java integrates with the > OS properly (there is an API call for this in J2ME, just not Java SE) Sounds like we are mixing a characteristic of a network connection with a policy decision. When we talk about being offline - we are normally describing a situation under which a TCP/IP connection is unavailable. When developers discuss '-offline' as a policy, what this often translates to is that they want to assert a rule preventing the automatic downloading of artifacts. Recognizing this difference enables recognition of a bunch of other possibilities: a) I have artifacts that I depend upon and I want to modify the logic used in the resolution of said artifacts - do I want to resolve artifacts over a remote network connection? - are internet resolvable connections ok? - am I really talking about shades of gray relative to a collection of repositories - in effect, am I designing an artifact retrieval policy - am I talking about trust? - am I talking about artifact integrity? b) What is the state of my cache?
- was the cached artifact established with the same policy as the policy I'm currently asserting - does my cache management system associate existence policy with the artifact - is the cached object verifiable - does my build policy imply anything about my caching policy - is my cache sharable, and if it is, what am I asserting in terms of policy c) What is the relationship between build process, cache, and shared repositories? - am I trusted? - how can clients validate me, my cache, my policy, my artifacts - does my build process trust my cache (given that interim dependent builds may be using policies that are not under my control - e.g. Eric uses Antlib X which has dependencies on jars X, Y, and Z) > 2. when we encounter an element (or even an attr) in an > unknown antlib xmlns, and we want to map that to a > projectcomponent, we hand off resolution to an antlib > resolver. We would have one built in (the failing resolver), > would default to the ivy one if it was present, and provide > some way to let people switch to a different one. You can do this without mentioning Ivy so long as you have the mechanisms to include URL protocol handlers. Example of a working build.xml file: [build file stripped by the list archiver] The above build works if I put roughly 5 specialized jar files into my ./ant/lib directory and invoke: $ ant Or more typically, the mechanism I use on more than one hundred Ant based projects (without anything in .ant/lib): $ build In both cases what I am doing is making Ant URL aware - as such "local:template:dpml/tools/standard" is recognized as a protocol handler and the handler recognizes content types and maps into place the appropriate content handler, which in this case simply drags in the template build file. The template file contains the following statements: ...
The task establishes a project helper that deals with uris for things like antlib plugins (and a bunch of other protocol handlers that let me deal with cached resources, resources on remote hosts, local preferences, services based on independent virtual machines, deployment scenarios for local or remote applications - basically most of the things you need in a fully functional build environment). In effect, it basically does the setup of the machinery needed to override Ant behaviour when resolving tasks and data types, using the URL machinery bundled in the JVM. > 3. an antlib resolver would do the mapping from antlib > package to artifacts (problem one), then download the > metadata for that artifact, pull it down and all its artifacts Sounds like a protocol handler that captures sufficient information to represent a classloader chain together with some information about the deployment target. One example that approaches this is the DPML part definition, which encapsulates (a) generic info, (b) a deployment strategy, and (c) a classloader chain definition. http://www.dpml.net/metro/parts/index.html In the DPML model the deployment strategy is dynamic, and in our environment we have several strategies we use on a regular basis. One of these is an antlib strategy which simply identifies the path to the antlib resource and the namespace.
Re: auto download of antlibs
On 5/7/07, Steve Loughran <[EMAIL PROTECTED]> wrote: Xavier Hanin wrote: > On 5/4/07, [EMAIL PROTECTED] <[EMAIL PROTECTED]> wrote: >> >> > we'd need a metadata tree mapping antlibs to well known packages, >> >> > but that is not too hard. JSON, perhaps. >> >> >> >> Not sure. Who'd maintain it? >> > >> >It should be some xml format. >> >I think that it should be on the ant site >> >and ant committers would be the updaters of it. >> >- this would be similar to the >> >related projects page - http://ant.apache.org/projects.html >> >but have a separate url for each antlib. >> >? something like: http://ant.apache.org/antlibdefintions/.xml >> >for example: >> >http://ant.apache.org/antlibdefintions/net.sf.antcontrib.xml >> > >> >of course this raises the issue of version. One may not want >> >the latest >> >version of a particular antlib. >> >> >> There is a solution for versioning issues ... or wouldn't a Maven-repo >> with versioning of multiple formats solve that? > mm, the problem is not to store multiple versions on the repo, but to > know which one to pick from the antlib URI. As far as I understand > Steve's proposal, the idea would be to introduce automatic download > based on the current format of antlib declaration, which only contains > a package, and no version information. > You'd have to include a version. One thing you could do is xmlns:lib="antlib://org/example/something#2.13" ...but that would place the version into the namespace, which is too early to read in/expand ant properties, and you'd have to update the xmlns declaration everywhere you used it...that's a no-no in a big project. Indeed, good point. there's also the issue of setting up your ivy conf before the build. Now unless we want to be maven-style and look for properties in undocumented properties like ant.antlib.org.example.something.version and secretly extract the version info from there, we need an explicit declaration of versions. Also there's the security issue. Good point too.
I've been thinking more about what we could do with tasks rather than fully automated download. As a first pass, you could combine an ivy download with a typedef, hooking into a named ivy conf: And where is the version information? And how do we map this package name to an organization/module name couple? What do you think of providing all information: The trick here would be to make it a no-op if there was already an antlib defined in the namespace. Yes, would be a nice trick. [speaking of which, is there a way of enumerating all currently declared antlibs?] I'm also thinking of a resource that lets you declare a path inline Is it a resource or a resource collection? I'm not familiar with the Resource API yet... Moreover, where does the module information (org/module/rev) come from? Shouldn't we provide it? As a side note, it's very similar to the current ivy:cachepath task. The main difference is that ivy:cachepath is a task, not a resource. But to be a resource I think we'd need some kind of lifecycle management for resources. Xavier -steve -- Xavier Hanin - Independent Java Consultant Manage your dependencies with Ivy! http://incubator.apache.org/ivy/
Re: auto download of antlibs
Xavier Hanin wrote: On 5/4/07, [EMAIL PROTECTED] <[EMAIL PROTECTED]> wrote: >> > we'd need a metadata tree mapping antlibs to well known packages, >> > but that is not too hard. JSON, perhaps. >> >> Not sure. Who'd maintain it? > >It should be some xml format. >I think that it should be on the ant site >and ant committers would be the updaters of it. >- this would be similar to the >related projects page - http://ant.apache.org/projects.html >but have a separate url for each antlib. >? something like: http://ant.apache.org/antlibdefintions/.xml >for example: >http://ant.apache.org/antlibdefintions/net.sf.antcontrib.xml > >of course this raises the issue of version. One may not want >the latest >version of a particular antlib. There is a solution for versioning issues ... or wouldn't a Maven-repo with versioning of multiple formats solve that? mm, the problem is not to store multiple versions on the repo, but to know which one to pick from the antlib URI. As far as I understand Steve's proposal, the idea would be to introduce automatic download based on the current format of antlib declaration, which only contains a package, and no version information. You'd have to include a version. One thing you could do is xmlns:lib="antlib://org/example/something#2.13" ...but that would place the version into the namespace, which is too early to read in/expand ant properties, and you'd have to update the xmlns declaration everywhere you used it...that's a no-no in a big project. there's also the issue of setting up your ivy conf before the build. Now unless we want to be maven-style and look for properties in undocumented properties like ant.antlib.org.example.something.version and secretly extract the version info from there, we need an explicit declaration of versions. Also there's the security issue. I've been thinking more about what we could do with tasks rather than fully automated download.
As a first pass, you could combine an ivy download with a typedef, hooking into a named ivy conf: The trick here would be to make it a no-op if there was already an antlib defined in the namespace. [speaking of which, is there a way of enumerating all currently declared antlibs?] I'm also thinking of a resource that lets you declare a path inline -steve
Re: auto download of antlibs
On Fri, 4 May 2007, Xavier Hanin <[EMAIL PROTECTED]> wrote: > On 5/4/07, Stefan Bodewig <[EMAIL PROTECTED]> wrote: >> On Fri, 04 May 2007, Steve Loughran <[EMAIL PROTECTED]> wrote: >> > we'd need a metadata tree mapping antlibs to well known packages, >> > but that is not too hard. JSON, perhaps. >> >> Not sure. Who'd maintain it? Maintaining it for our own Antlibs >> is easy, but we wouldn't want the mechanism to only apply for them. >> And I'd be scared of the security implications of a Wiki driven >> list or something even close to that. > > You make a good point. So maybe this would require all information > (module identifier and version) to be in the antlib URL, thus > requiring another antlib url format (maybe with a distinct > protocol), which is not really going in the same direction as you > suggested, steve. At least that would allow us to live without a central URI -> antlib artifact mapping; I'd prefer that. > Another option off the top of my head: build a module identifier > from the package name; even if it's not very accurate, the only > purpose is to get something unique. It could be something like: org = > package name; module = last part of the package name eg: > org.apache.ivy.ant => org = org.apache.ivy.ant; module = ant This > module would not be the antlib module, but only a module with its > only artifact being metadata about the module containing the actual > antlib. This metadata could be in a simple format: JSON, XML or a > properties file. Then we can use this metadata to actually download > the antlib. The remaining problem is version information. That would require an extra level of indirection. I think I'm leaning more towards a different URI scheme that encodes all the information that we need (including version). Stefan
Re: auto download of antlibs
On Fri, 4 May 2007, Peter Reilly <[EMAIL PROTECTED]> wrote: > On 5/4/07, Stefan Bodewig <[EMAIL PROTECTED]> wrote: >> On Fri, 04 May 2007, Steve Loughran <[EMAIL PROTECTED]> wrote: >> > we'd need a metadata tree mapping antlibs to well known packages, >> > but that is not too hard. JSON, perhaps. >> >> Not sure. Who'd maintain it? > > It should be some xml format. I think that it should be on the ant > site and ant committers would be the updaters of it. - this would > be similar to the related projects page - I know the projects page, and I know that sometimes we are a bit sloppy in updating it, even if people send in patches. At one point we started to recommend that people use the Wiki instead. Using the Wiki is fine as long as the content is processed by humans, but it is unusable for automated processing. > of course this raises the issue of version. One may not want the > latest version of a particular antlib. Could be a fragment identifier or an appended XPath or something like that: antlib:org.apache.ant.antunit#1.1 Stefan
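Stefan's fragment-identifier idea is easy to implement. A minimal sketch in Java, assuming the URI form he gives above; the class and method names are hypothetical, not part of Ant:

```java
// Carry the version as a fragment identifier on the antlib URI,
// e.g. antlib:org.apache.ant.antunit#1.1
public class AntlibUri {

    /** Package part of the URI: everything between "antlib:" and '#'. */
    public static String pkg(String uri) {
        String body = uri.startsWith("antlib:")
                ? uri.substring("antlib:".length()) : uri;
        int hash = body.indexOf('#');
        return hash < 0 ? body : body.substring(0, hash);
    }

    /** Version after the '#', or null when the URI carries no version. */
    public static String version(String uri) {
        int hash = uri.indexOf('#');
        return hash < 0 ? null : uri.substring(hash + 1);
    }
}
```

A null version would leave the "which version do I pick" question open, which is exactly the policy problem discussed elsewhere in the thread.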
Re: auto download of antlibs
On 5/4/07, [EMAIL PROTECTED] <[EMAIL PROTECTED]> wrote: >> > we'd need a metadata tree mapping antlibs to well known packages, >> > but that is not too hard. JSON, perhaps. >> >> Not sure. Who'd maintain it? > >It should be some xml format. >I think that it should be on the ant site >and ant committers would be the updaters of it. >- this would be similar to the >related projects page - http://ant.apache.org/projects.html >but have a separate url for each antlib. >? something like: http://ant.apache.org/antlibdefintions/.xml >for example: >http://ant.apache.org/antlibdefintions/net.sf.antcontrib.xml > >of course this raises the issue of version. One may not want >the latest >version of a particular antlib. There is a solution for versioning issues ... or wouldn't a Maven-repo with versioning of multiple formats solve that? mm, the problem is not to store multiple versions on the repo, but to know which one to pick from the antlib URI. As far as I understand Steve's proposal, the idea would be to introduce automatic download based on the current format of antlib declaration, which only contains a package, and no version information. Am I wrong? Xavier Or we provide a webapp ... http://ant.apache.org/antlibs?uri=org.apache.ant.antunit&version=2 Jan -- Learn Ivy at ApacheCon: http://www.eu.apachecon.com/ Manage your dependencies with Ivy! http://incubator.apache.org/ivy/
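Jan's webapp variant reduces to building a query URL from the antlib package and a version. A sketch, assuming the endpoint shown above (which is hypothetical, it does not exist on ant.apache.org):

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class AntlibLookup {

    /** Builds a lookup URL like the one Jan proposes:
     *  http://ant.apache.org/antlibs?uri=org.apache.ant.antunit&version=2 */
    public static String lookupUrl(String uri, String version) {
        return "http://ant.apache.org/antlibs"
                + "?uri=" + URLEncoder.encode(uri, StandardCharsets.UTF_8)
                + "&version=" + URLEncoder.encode(version, StandardCharsets.UTF_8);
    }
}
```

The URL-encoding matters only for unusual package names; plain dotted packages pass through unchanged.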
Re: auto download of antlibs
On 5/4/07, Stefan Bodewig <[EMAIL PROTECTED]> wrote: On Fri, 04 May 2007, Steve Loughran <[EMAIL PROTECTED]> wrote: > 1. add a -offline argument to say "we are offline". this will set a > property, (ant.build.offline) and the test will work. Do I sense oata.utils.NetworkUtils? Might contain some Proxy configuration (and if possible detection) code as well. > 2. when we encounter an element (or even an attr) in an unknown > antlib xmlns, and we want to map that to a projectcomponent, we hand > off resolution to an antlib resolver. We would have one built in > (the failing resolver), would default to the ivy one if it was > present, and provide some way to let people switch to a different > one. OK. > 3. an antlib resolver would do the mapping from antlib package to > artifacts (problem one), actually a pretty big problem. > then download the metadata for that artifact, pull it down and all > its artifacts > > 4. we would then load the lib with the classpath that is set up > by the resolver sounds right. > we'd need a metadata tree mapping antlibs to well known packages, > but that is not too hard. JSON, perhaps. Not sure. Who'd maintain it? It should be some xml format. I think that it should be on the ant site and ant committers would be the updaters of it. - this would be similar to the related projects page - http://ant.apache.org/projects.html but have a separate url for each antlib. ? something like: http://ant.apache.org/antlibdefintions/.xml for example: http://ant.apache.org/antlibdefintions/net.sf.antcontrib.xml of course this raises the issue of version. One may not want the latest version of a particular antlib. Peter Maintaining it for our own Antlibs is easy, but we wouldn't want the mechanism to only apply for them. And I'd be scared of the security implications of a Wiki driven list or something even close to that.
Stefan
Re: auto download of antlibs
On 5/4/07, Stefan Bodewig <[EMAIL PROTECTED]> wrote: On Fri, 04 May 2007, Steve Loughran <[EMAIL PROTECTED]> wrote: > 1. add a -offline argument to say "we are offline". this will set a > property, (ant.build.offline) and the test will work. Do I sense oata.utils.NetworkUtils? Might contain some Proxy configuration (and if possible detection) code as well. > 2. when we encounter an element (or even an attr) in an unknown > antlib xmlns, and we want to map that to a projectcomponent, we hand > off resolution to an antlib resolver. We would have one built in > (the failing resolver), would default to the ivy one if it was > present, and provide some way to let people switch to a different > one. OK. > 3. an antlib resolver would do the mapping from antlib package to > artifacts (problem one), actually a pretty big problem. > then download the metadata for that artifact, pull it down and all > its artifacts > > 4. we would then load the lib with the classpath that is set up > by the resolver sounds right. > we'd need a metadata tree mapping antlibs to well known packages, > but that is not too hard. JSON, perhaps. Not sure. Who'd maintain it? Maintaining it for our own Antlibs is easy, but we wouldn't want the mechanism to only apply for them. And I'd be scared of the security implications of a Wiki driven list or something even close to that. You make a good point. So maybe this would require all information (module identifier and version) to be in the antlib URL, thus requiring another antlib url format (maybe with a distinct protocol), which is not really going in the same direction as you suggested, steve. Another option off the top of my head: build a module identifier from the package name; even if it's not very accurate, the only purpose is to get something unique.
It could be something like: org = package name; module = last part of the package name eg: org.apache.ivy.ant => org = org.apache.ivy.ant; module = ant This module would not be the antlib module, but only a module with its only artifact being metadata about the module containing the actual antlib. This metadata could be in a simple format: JSON, XML or a properties file. Then we can use this metadata to actually download the antlib. The remaining problem is version information. Xavier Stefan -- Learn Ivy at ApacheCon: http://www.eu.apachecon.com/ Manage your dependencies with Ivy! http://incubator.apache.org/ivy/
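Xavier's naming convention above can be sketched in a few lines of Java; the class is illustrative only, not an Ivy API:

```java
// Derive a module identifier from the antlib package name alone:
// org = full package name, module = last segment of the package name.
public class ModuleId {

    public static String org(String pkg) {
        return pkg;
    }

    public static String module(String pkg) {
        int dot = pkg.lastIndexOf('.');
        return dot < 0 ? pkg : pkg.substring(dot + 1);
    }
}
```

As Xavier notes, this only buys a unique key; the indirection to the real antlib module, and the version, still have to come from the metadata artifact.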
Re: auto download of antlibs
On Fri, 04 May 2007, Steve Loughran <[EMAIL PROTECTED]> wrote: > 1. add a -offline argument to say "we are offline". this will set a > property, (ant.build.offline) and the test will work. Do I sense oata.utils.NetworkUtils? Might contain some Proxy configuration (and if possible detection) code as well. > 2. when we encounter an element (or even an attr) in an unknown > antlib xmlns, and we want to map that to a projectcomponent, we hand > off resolution to an antlib resolver. We would have one built in > (the failing resolver), would default to the ivy one if it was > present, and provide some way to let people switch to a different > one. OK. > 3. an antlib resolver would do the mapping from antlib package to > artifacts (problem one), actually a pretty big problem. > then download the metadata for that artifact, pull it down and all > its artifacts > > 4. we would then load the lib with the classpath that is set up > by the resolver sounds right. > we'd need a metadata tree mapping antlibs to well known packages, > but that is not too hard. JSON, perhaps. Not sure. Who'd maintain it? Maintaining it for our own Antlibs is easy, but we wouldn't want the mechanism to only apply for them. And I'd be scared of the security implications of a Wiki driven list or something even close to that. Stefan
Re: auto download of antlibs
On 5/4/07, Steve Loughran <[EMAIL PROTECTED]> wrote: One thing I've been thinking of this week is how could we work with Ivy for automatic antlib download. No code right now, just some thoughts 1. add a -offline argument to say "we are offline". this will set a property, (ant.build.offline) and the test will work. It is meant to tell things like Ivy that we are offline. At some point we could add some way for Ant to guess whether the net is there or not, if java integrates with the OS properly (there is an API call for this in J2ME, just not Java SE) This makes me think that we could improve how Ivy deals with online/offline mode. Indeed for the moment Ivy doesn't really know which repository requires network access and which doesn't. It would be nice if Ivy knew that, so that even in offline mode you could still use a local repository (for antlib testing for instance). 2. when we encounter an element (or even an attr) in an unknown antlib xmlns, and we want to map that to a projectcomponent, we hand off resolution to an antlib resolver. We would have one built in (the failing resolver), would default to the ivy one if it was present, and provide some way to let people switch to a different one. This sounds like a good idea. 3. an antlib resolver would do the mapping from antlib package to artifacts (problem one), Yes, and note that we have to consider the version too. then download the metadata for that artifact, pull it down and all its artifacts 4. we would then load the lib with the classpath that is set up by the resolver One question here: is it the responsibility of the resolver to keep artifacts in a cache, or to put artifacts in an Ant managed cache? This is important to specify how things work in offline mode. Ivy already has pretty good support for offline mode, and I think we could improve it (see above). But this is important to consider when specifying the role of the antlib resolver.
we'd need a metadata tree mapping antlibs to well known packages, but that is not too hard. JSON, perhaps. Not too hard, except maybe for version information. I'm not sure that it would be really nice to get the latest version by default; making the build system automatically update is not necessarily a good idea (at least users have to keep very good control over that). Xavier -steve -- Learn Ivy at ApacheCon: http://www.eu.apachecon.com/ Manage your dependencies with Ivy! http://incubator.apache.org/ivy/
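The resolver hand-off Steve describes in step 2, with the failing resolver as the built-in default, could look roughly like this. The interface is purely hypothetical; no such API exists in Ant:

```java
// Pluggable hand-off point: when a component in an unknown antlib
// namespace is encountered, ask the configured resolver for a classpath.
public interface AntlibResolver {

    /** Returns the classpath for the antlib URI, or throws if unresolvable. */
    java.util.List<java.io.File> resolve(String antlibUri);

    /** The built-in default: always fails, i.e. no automatic download. */
    class Failing implements AntlibResolver {
        public java.util.List<java.io.File> resolve(String antlibUri) {
            throw new IllegalStateException(
                    "No antlib resolver configured for " + antlibUri);
        }
    }
}
```

An Ivy-backed implementation would replace `Failing` when Ivy is on the path; the caching question Xavier raises (resolver cache vs. Ant-managed cache) is deliberately left outside the interface here.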
re: auto download of antlibs
In an earlier message Steve Loughran said > OK, now that Ant1.6 has antlibs, it is time to think of the next step: auto download of antlibs and (perhaps) dependencies. Importer from Krysalis does exactly this. It downloads, caches and imports build.xml files that I call antlets. Some initial documentation is available at http://krysalis.org/cgi-bin/krywiki.pl?Importer This allows ant files to be as small as this: [short build.xml stripped by the list archiver; its comment reads "Depot Ruper is a Repository Updater" and it imports two antlets from http://metamorphosis.krysalis.org/antlet/] This will give you all the following targets: compile, dist, clean, etc. The source for the antlets is at http://cvs.sourceforge.net/viewcvs.py/metamorphosis/antlets/ I have a krysalis-importer-0.5-alpha.jar ready to go but Source Forge was down last night, so I won't be able to post until tonight. Importer is a complete rewrite from the experience gained in the centipede project. It is a small, zero-dependency ant task. Importer is my take at the simplest thing that would work. More needs to be added. I hope to get the depot ruper stuff integrated to handle mirrors and versioning. R, Nick
RE: auto download of antlibs
--- [EMAIL PROTECTED] wrote: > > 1. Possible requirements > > > > -allow users to specify the URLs of dependent > antlibs > > Why only AntLibs? Maybe resources in general (jars, > AntLets, ...) > There was talk at one point about Ant 2.0 (sorry) possibly incorporating commons-vfs in some way. I have kept this in the back of my mind for some time... I have only looked at vfs briefly, and it still lives in the sandbox, but might it be of use here as well to get a pretty flexible means of access to resources? One pet focus of mine is the desire to have properties usable as files in ant... if vfs would let us register custom protocols, property:myproperty could return a (vfs) FileObject storing our property's contents, assuming the property was set... does anyone else see possibilities here? -Matt __ Do you Yahoo!? Yahoo! Finance: Get your refund fast by filing online. http://taxes.yahoo.com/filing.html - To unsubscribe, e-mail: [EMAIL PROTECTED] For additional commands, e-mail: [EMAIL PROTECTED]
Re: auto download of antlibs
Costin Manolache wrote: Steve Loughran wrote: OK, now that Ant1.6 has antlibs, it is time to think of the next step: auto download of antlibs and (perhaps) dependencies. 1. Possible requirements -allow users to specify the URLs of dependent antlibs -allow teams to provide an override point that specifies their location -secure download -only files from trusted sources are fetched. Signed jars ? that was roughly my thought. But then you need a signature trust model with certificate handling and the like, security panics, etc etc. Having a simpler 'no security at all' option is more brutally honest and a lot easier :) But security is a big issue for behind the firewall stuff. I am setting up cruisecontrol to run against the work project we are doing (smartfrog.org), whose CVS repository is sourceforge. So now I have to worry about how to download and run arbitrary source from sforge, without giving that code arbitrary access to behind the firewall systems (you know, the ones with all the SysV source, in case some malicious build file secretly starts copying lines from sysv into the linux-64 repository). I am going to have to resort to hardware (dedicated box outside the wall) or software -a vmware configuration, maybe with something emulating a router that only routes outside both our class A subnets. (yes, *both* class A subnets :) -caching of downloads, global or per-user -go through proxies -allow antlib providers to move their files (handle redirects) Is this really needed ? Maybe not at first. But 302 redirs are very useful over time. -allow antlib providers to mirror, by having a mirror file that lists possible sources I would add: support for sourceforge-like mirrors and "click" repositories. -support private repositories (intranet/internet, https, authenticated) as well as public sources -make it easy to publish an antlib, and register it in the ant central list And if possible, a single central list :-) no, too much maintenance :) Anything else? 
- support for multiple repository types ? It would be really nice if the tool would be able to fetch RPM/APT dependencies ( from jpackage or a similar repo ), as well as maven and other descriptors. aah, too many features! 2. What things implement this? What do Maven and Ruper do? 3. do we want to integrate this with ant, or have some more standalone tool that can be used to keep a component repository up to date, a tool with an ant task for use in a build file. A sort of apt-get for apache stuff... I think having this bundled/integrated with ant would be an excellent idea ! I am looking at ruper. I like the GUI too - and I like the ability to say you want to subscribe to, say, junit and xalan & have bits of your system kept up to date. (of course, unlike the rpm tools it is not the JRE we are maintaining, just individual projects or users)
RE: auto download of antlibs
> OK, now that Ant1.6 has antlibs, it is time to think of the next step:
> auto download of antlibs and (perhaps) dependencies.
>
> 1. Possible requirements
>
> -allow users to specify the URLs of dependent antlibs

Why only AntLibs? Maybe resources in general (jars, AntLets, ...)

> -allow teams to provide an override point that specifies their location

of course :-)

I wrote a snippet for the download :-)
    http://gump.covalent.net/jars/latest/ant/${ant.test.lib}"/>

> -secure download -only files from trusted sources are fetched.

- download the MD5 file and check for consistency
Have that :-) But it's longer, so see at the bottom.

> -caching of downloads, global or per-user or per-project
> -go through proxies
> -allow antlib providers to move their files (handle redirects)
> -allow antlib providers to mirror, by having a mirror file that lists
> possible sources
> -support private repositories (intranet/internet, https, authenticated)
> as well as public sources
> -make it easy to publish an antlib, and register it in the ant central list

I had only half an ear on the thread, but Forrest has a feature like "autodownload skin" which sounds like the whole stuff here ...

> Anything else?

Be open to plug in another kind of repo. Not only file based. Maybe a scm tool (cvs, ...).

> 2. What things implement this? What do Maven and Ruper do?

Yep. Talk with them. Maybe we can get their code ... or decide not to do anything ...

> 3. do we want to integrate this with ant, or have some more standalone
> tool that can be used to keep a component repository up to date, a tool
> with an ant task for use in a build file. A sort of apt-get for apache
> stuff...

More something in the middle: an AntLib.

Jan

check-downloads.xml
--
Download ${md5.file}
Download ${zip.file}
${zip.file}: just processed
${zip.file}: ok
${zip.file}: Wrong MD5 checksum !!!
 - expected: ${md5.valid}
 - actual  : ${md5.actual}

check-downloads.properties
--
download.zip.dir=ftp://sunsite.informatik.rwth-aachen.de/pub/mirror/eclipse/S-3.0M5-200311211210/
download.md5.dir=ftp://sunsite.informatik.rwth-aachen.de/pub/mirror/eclipse/S-3.0M5-200311211210/checksum/
dest.dir=.
file.list=eclipse-Automated-Tests-3.0M5.zip,eclipse-FTP-WebDAV-3.0M5.zip,eclipse-JDT-3.0M5.zip,eclipse-SDK-3.0M5-win32.zip,eclipse-examples-3.0M5-win32.zip,eclipse-examples-3.0M5.zip,eclipse-platform-3.0M5-win32.zip
proxy.host=A011-34
proxy.port=8080
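The mail archive stripped the XML markup from check-downloads.xml; only the messages and property names survived. A hedged reconstruction of what such a buildfile might have looked like, using stock Ant tasks (the target layout and task choices are guesses, and it assumes the .md5 file holds only the hex digest):

```xml
<!-- Reconstruction sketch: the messages and property names come from the
     original mail; the tasks and target structure are guesses. -->
<project name="check-downloads" default="check">
  <property file="check-downloads.properties"/>

  <target name="download">
    <setproxy proxyhost="${proxy.host}" proxyport="${proxy.port}"/>
    <echo message="Download ${md5.file}"/>
    <get src="${download.md5.dir}${md5.file}" dest="${dest.dir}/${md5.file}"/>
    <echo message="Download ${zip.file}"/>
    <get src="${download.zip.dir}${zip.file}" dest="${dest.dir}/${zip.file}"/>
  </target>

  <target name="check" depends="download">
    <!-- compute the MD5 digest of the downloaded archive -->
    <checksum file="${dest.dir}/${zip.file}" property="md5.actual"/>
    <!-- assumes the .md5 file contains only the hex digest, no filename -->
    <loadfile srcfile="${dest.dir}/${md5.file}" property="md5.valid">
      <filterchain>
        <striplinebreaks/>
      </filterchain>
    </loadfile>
    <condition property="md5.ok">
      <equals arg1="${md5.actual}" arg2="${md5.valid}"/>
    </condition>
    <echo message="${zip.file}: just processed"/>
    <fail unless="md5.ok">${zip.file}: Wrong MD5 checksum !!!
 - expected: ${md5.valid}
 - actual  : ${md5.actual}</fail>
    <echo message="${zip.file}: ok"/>
  </target>
</project>
```

Looping over the files in ${file.list} would additionally need something like ant-contrib's &lt;foreach&gt;, which the original presumably used.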
Re: auto download of antlibs
Steve Loughran wrote:

> OK, now that Ant1.6 has antlibs, it is time to think of the next step:
> auto download of antlibs and (perhaps) dependencies.
>
> 1. Possible requirements
>
> -allow users to specify the URLs of dependent antlibs
> -allow teams to provide an override point that specifies their location
> -secure download -only files from trusted sources are fetched.

Signed jars?

> -caching of downloads, global or per-user
> -go through proxies
> -allow antlib providers to move their files (handle redirects)

Is this really needed?

> -allow antlib providers to mirror, by having a mirror file that lists
> possible sources

I would add: support for sourceforge-like mirrors and "click" repositories.

> -support private repositories (intranet/internet, https, authenticated)
> as well as public sources
> -make it easy to publish an antlib, and register it in the ant central list

And if possible, a single central list :-)

> Anything else?

- support for multiple repository types? It would be really nice if the tool would be able to fetch RPM/APT dependencies (from jpackage or a similar repo), as well as maven and other descriptors.

> 2. What things implement this? What do Maven and Ruper do?
>
> 3. do we want to integrate this with ant, or have some more standalone
> tool that can be used to keep a component repository up to date, a tool
> with an ant task for use in a build file. A sort of apt-get for apache
> stuff...

I think having this bundled/integrated with ant would be an excellent idea!

Costin
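The "sourceforge-like mirrors" idea can be sketched with stock Ant tasks; a minimal fallback scheme, where ${primary.repo}, ${mirror.repo}, ${lib.jar}, and ${cache.dir} are illustrative names rather than part of any real setup (&lt;get ignoreerrors="true"&gt; logs a failed fetch instead of aborting the build):

```xml
<!-- Sketch of mirror fallback; all property names here are made up. -->
<target name="fetch">
  <get src="${primary.repo}/${lib.jar}" dest="${cache.dir}/${lib.jar}"
       usetimestamp="true" ignoreerrors="true"/>
  <available file="${cache.dir}/${lib.jar}" property="lib.fetched"/>
</target>

<target name="fetch-fallback" depends="fetch" unless="lib.fetched">
  <get src="${mirror.repo}/${lib.jar}" dest="${cache.dir}/${lib.jar}"/>
</target>
```

A real implementation would read the candidate hosts from a mirror file and, per the "secure download" requirement, verify a checksum or jar signature after the fetch.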
Re: auto download of antlibs
Steve Loughran wrote:

> OK, now that Ant1.6 has antlibs, it is time to think of the next step:
> auto download of antlibs and (perhaps) dependencies.
>
> 1. Possible requirements
>
> -allow users to specify the URLs of dependent antlibs
> -allow teams to provide an override point that specifies their location
> -secure download -only files from trusted sources are fetched.
> -caching of downloads, global or per-user
> -go through proxies
> -allow antlib providers to move their files (handle redirects)
> -allow antlib providers to mirror, by having a mirror file that lists
> possible sources
> -support private repositories (intranet/internet, https, authenticated)
> as well as public sources
> -make it easy to publish an antlib, and register it in the ant central list
>
> Anything else?
>
> 2. What things implement this? What do Maven and Ruper do?

Ruper does some of the above and has the goal to also do the remaining items. Just some extra insight on Ruper2 from the old site:

Multiple Repos:  http://www.krysalis.org/ruper/config/index.html
Eclipse Plugin:  http://www.krysalis.org/ruper/eclipse/index.html
CLI:             http://www.krysalis.org/ruper/tool/quickstart.html

> 3. do we want to integrate this with ant, or have some more standalone
> tool that can be used to keep a component repository up to date, a tool
> with an ant task for use in a build file. A sort of apt-get for apache
> stuff...

You have just described Ruper, which is now in the Apache Incubator Depot.

In addition, I'll add that there is also a need to import more than antlibs, that is, also buildfiles that can be imported with related resources. It was proposed some time ago to bring to Ant things that Centipede provided. Most has already been integrated or is in the Incubator (Apache Depot), while what remains is in the Centipede2 proposal, AKA antlets. Antlets already work like this:

    http://ant.apache.org/antlets" />

The above automatically downloads it if necessary.
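(The archive stripped the markup from the antlet example; only the URL survived. What the declaration might have looked like, with element and attribute names that are pure guesses:)

```xml
<!-- hypothetical antlet import; element and attribute names are guesses -->
<antlet href="http://ant.apache.org/antlets"/>
```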
http://nagoya.apache.org/wiki/apachewiki.cgi?AntProjectPages/Antlet

Take the above simply as a possible proposal. If instead Ant decided to just include the code in the codebase, that would be fine too.

--
Nicola Ken Barozzi [EMAIL PROTECTED]
- verba volant, scripta manent -
(discussions get forgotten, just code remains)