Re: Do we want to go for gitpubsub?

2015-05-03 Thread Bernd Eckenfels
Hello,

when searching for special constructs, or when I need to understand some
third-party error messages, I like to visit Java code on the web, as it
saves me the need to check it out and (usually) offers good search and
navigation.

So this is by definition not for projects I work on, and typically it is
via aggregator sites like grepcode.

It is good to have the code online for a search engine to index. But it
does not have to be in a site build. (Unfortunately, the usual web repo
browsers are not so good at creating type links.)

Javadoc->xref links are most valuable for samples.

Regards,
Bernd


On Sun, 3 May 2015 20:22:04 +0200, Andreas Gudian wrote:

> 5 gig? wow...
> 
> I think a whole lot of the pages are garbage that no one ever really
> reads. No, not the docs themselves ;-) - I'm talking about the many, many
> source xref and javadoc pages (both main and test code). And every
> single one of those pages is updated on every release, because they all
> contain the version number or a timestamp somewhere.
> 
> If we could stop adding source xrefs and javadocs to the site
> reports, that would already help a lot. I haven't read any javadoc
> pages in the browser for years... and when I look for source code, I
> use the quick search on github or browse through the web-cvs. When I
> actually use a library, then the sources are downloaded automatically
> anyway in the IDE - together with the Javadocs if necessary.
> 
> Or does anyone see that there's still a need for those pages? Do we
> maybe have some statistics on visits to those pages?
> 
> 2015-05-03 10:52 GMT+02:00 Hervé BOUTEMY :
> 
> > On Friday, 1 May 2015 at 15:14:05, Andreas Gudian wrote:
> > > Anyone who has ever done a release of a large plugin like
> > > surefire from a European location knows what a pain it is to
> > > svn-commit a couple of thousand files. It takes hours, but only
> > > if you're lucky and the process
> > is
> > > not interrupted by the server.
> > > I had to dance around that by zipping up the locally generated
> > > site, scp
> > it
> > > to people.a.o and commit it from there. Every time I did that, I
> > > thought about how cool it would be to just push one git-commit -
> > > it would be done in seconds.
> > +1
> >
> > >
> > > So I'd be all in favour of that. Depends on how that would be
> > > structured, though. Everything in one big repository? Or could it
> > > be sliced into
> > parts
> > > of the different areas of the sites, like main maven pages, the
> > > single-module plugins, and dedicated repositories for the bigger
> > > plugins?
> > you're pointing to the key question
> >
> > There was a discussion last month on infrastructure mailing list
> > about eventually deprecating CMS: that is the starting point of my
> > work on .htaccess
> > and /content vs /components for merging main site content with
> > components documentations (both latest and versioned in *-archives/)
> >
> > And from the discussion I had with infra, having multiple SCM
> > entries (one for the main site and one for each and every component)
> > would cause issues for infra, since it means infra would have to
> > configure a lot of *pubsub listeners: it's not part of infra's plan
> > to have so many listeners, or to be in the loop every time we add a
> > component.
> > Note that I don't see how versioned content (in *-archives) could be
> > handled: but that's another issue; perhaps we could find a strategy.
> >
> >
> > Then if we switched to git, this would be one big git repo with
> > absolutely everything in it: I counted, and there are 475,000 files,
> > for 5 GB. I'm not sure we'd get the performance improvement we
> > expected from git...
> >
> >
> > Notice that we currently have 2 html repos: /content and /components.
> > We could migrate /content, i.e. the main site content, to gitpubsub.
> > But I think we would lose the CMS, since the CMS doesn't support git,
> > AFAIK. Not that I'm a big CMS fan, but what I like is the automatic
> > main site build (done not from a personal desktop but from buildbot
> > [1]), which is part of the CMS workflow.
> >
> >
> > So I fear the dream we all have of improving component documentation
> > publication performance won't come from this.
> >
> >
> > But during the discussion with infra, this huge performance issue
> > was clearly shared, and Joe Schaefer expressed an idea:
> > "Supporting tar/zip files with svnpubsub is straightforward with the
> > built-in
> > hook support.  Someone would just need to write a custom hook that
> > expands zip
> > files for websites (a list of where to find these files needs to be
> > supplied or
> > conventionalized).
> >
> > The way I'd envision this working is that,
> >
> > 1) projects add their compressed archives listings to a specific
> > file similar to
> > extpaths for the cms.  Each entry would be a target path and a
> > source tarball.
> >
> > 2) when a project commits a new entry with a new tarball, the hook
> > will

Re: Do we want to go for gitpubsub?

2015-05-03 Thread Andreas Gudian
5 gig? wow...

I think a whole lot of the pages are garbage that no one ever really
reads. No, not the docs themselves ;-) - I'm talking about the many, many
source xref and javadoc pages (both main and test code). And every single
one of those pages is updated on every release, because they all contain
the version number or a timestamp somewhere.

If we could stop adding source xrefs and javadocs to the site reports, that
would already help a lot. I haven't read any javadoc pages in the browser
for years... and when I look for source code, I use the quick search on
github or browse through the web-cvs. When I actually use a library, then
the sources are downloaded automatically anyway in the IDE - together with
the Javadocs if necessary.

Or does anyone see that there's still a need for those pages? Do we maybe
have some statistics on visits to those pages?

2015-05-03 10:52 GMT+02:00 Hervé BOUTEMY :

> On Friday, 1 May 2015 at 15:14:05, Andreas Gudian wrote:
> > Anyone who has ever done a release of a large plugin like surefire from a
> > European location knows what a pain it is to svn-commit a couple of
> > thousand files. It takes hours, but only if you're lucky and the process
> is
> > not interrupted by the server.
> > I had to dance around that by zipping up the locally generated site, scp
> it
> > to people.a.o and commit it from there. Every time I did that, I thought
> > about how cool it would be to just push one git-commit - it would be done
> > in seconds.
> +1
>
> >
> > So I'd be all in favour of that. Depends on how that would be structured,
> > though. Everything in one big repository? Or could it be sliced into
> parts
> > of the different areas of the sites, like main maven pages, the
> > single-module plugins, and dedicated repositories for the bigger plugins?
> you're pointing to the key question
>
> There was a discussion last month on infrastructure mailing list about
> eventually deprecating CMS: that is the starting point of my work on
> .htaccess
> and /content vs /components for merging main site content with components
> documentations (both latest and versioned in *-archives/)
>
> And from the discussion I had with infra, having multiple SCM entries
> (one for the main site and one for each and every component) would cause
> issues for infra, since it means infra would have to configure a lot of
> *pubsub listeners: it's not part of infra's plan to have so many
> listeners, or to be in the loop every time we add a component.
> Note that I don't see how versioned content (in *-archives) could be
> handled: but that's another issue; perhaps we could find a strategy.
>
>
> Then if we switched to git, this would be one big git repo with
> absolutely everything in it: I counted, and there are 475,000 files, for
> 5 GB. I'm not sure we'd get the performance improvement we expected from
> git...
>
>
> Notice that we currently have 2 html repos: /content and /components.
> We could migrate /content, i.e. the main site content, to gitpubsub.
> But I think we would lose the CMS, since the CMS doesn't support git,
> AFAIK. Not that I'm a big CMS fan, but what I like is the automatic main
> site build (done not from a personal desktop but from buildbot [1]),
> which is part of the CMS workflow.
>
>
> So I fear the dream we all have of improving component documentation
> publication performance won't come from this.
>
>
> But during the discussion with infra, this huge performance issue was
> clearly shared, and Joe Schaefer expressed an idea:
> "Supporting tar/zip files with svnpubsub is straightforward with the
> built-in
> hook support.  Someone would just need to write a custom hook that expands
> zip
> files for websites (a list of where to find these files needs to be
> supplied or
> conventionalized).
>
> The way I'd envision this working is that,
>
> 1) projects add their compressed archives listings to a specific file
> similar to
> extpaths for the cms.  Each entry would be a target path and a source
> tarball.
>
> 2) when a project commits a new entry with a new tarball, the hook will
> look
> at each entry in the tarball file and compare timestamps against what's on
> disk.  If the directory is older than the tarball it nukes the directory
> and
> reexpands and resets the timestamp on the base of the tree.
>
> This of course has nothing to do with the cms.  Everything needed is
> already
> possible with svnpubsub and perhaps gitpubsub as well."
>
>
> If somebody is interested in working on this, I can give pointers and
> work with them: at the moment, I felt too alone on this topic to have
> the energy to do anything...
>
> Regards,
>
> Hervé
>
>
> [1] http://ci.apache.org/builders/maven-site-staging
>
> >
> > 2015-04-30 21:16 GMT+02:00 Jason van Zyl :
> > My read is that this works much like GitHub Pages. You have a
> repository
> > > with your source and the rendered pages go into a branch. But nothing
> > > stops
> > > you from having the source in one repo and just pushing the gen

Re: Do we want to go for gitpubsub?

2015-05-03 Thread Hervé BOUTEMY
On Friday, 1 May 2015 at 15:14:05, Andreas Gudian wrote:
> Anyone who has ever done a release of a large plugin like surefire from a
> European location knows what a pain it is to svn-commit a couple of
> thousand files. It takes hours, but only if you're lucky and the process is
> not interrupted by the server.
> I had to dance around that by zipping up the locally generated site, scp it
> to people.a.o and commit it from there. Every time I did that, I thought
> about how cool it would be to just push one git-commit - it would be done
> in seconds.
+1

> 
> So I'd be all in favour of that. Depends on how that would be structured,
> though. Everything in one big repository? Or could it be sliced into parts
> of the different areas of the sites, like main maven pages, the
> single-module plugins, and dedicated repositories for the bigger plugins?
you're pointing to the key question

There was a discussion last month on infrastructure mailing list about 
eventually deprecating CMS: that is the starting point of my work on .htaccess 
and /content vs /components for merging main site content with components 
documentations (both latest and versioned in *-archives/)

And from the discussion I had with infra, having multiple SCM entries (one for
the main site and one for each and every component) would cause issues for
infra, since it means infra would have to configure a lot of *pubsub listeners:
it's not part of infra's plan to have so many listeners, or to be in the loop
every time we add a component.
Note that I don't see how versioned content (in *-archives) could be handled:
but that's another issue; perhaps we could find a strategy.


Then if we switched to git, this would be one big git repo with absolutely
everything in it: I counted, and there are 475,000 files, for 5 GB. I'm not
sure we'd get the performance improvement we expected from git...


Notice that we currently have 2 html repos: /content and /components.
We could migrate /content, i.e. the main site content, to gitpubsub.
But I think we would lose the CMS, since the CMS doesn't support git, AFAIK.
Not that I'm a big CMS fan, but what I like is the automatic main site build
(done not from a personal desktop but from buildbot [1]), which is part of the
CMS workflow.


So I fear the dream we all have of improving component documentation
publication performance won't come from this.


But during the discussion with infra, this huge performance issue was clearly
shared, and Joe Schaefer expressed an idea:
"Supporting tar/zip files with svnpubsub is straightforward with the built-in 
hook support.  Someone would just need to write a custom hook that expands zip 
files for websites (a list of where to find these files needs to be supplied or 
conventionalized).

The way I'd envision this working is that, 

1) projects add their compressed archives listings to a specific file similar 
to 
extpaths for the cms.  Each entry would be a target path and a source tarball.

2) when a project commits a new entry with a new tarball, the hook will look 
at each entry in the tarball file and compare timestamps against what's on 
disk.  If the directory is older than the tarball it nukes the directory and 
reexpands and resets the timestamp on the base of the tree.

This of course has nothing to do with the cms.  Everything needed is already 
possible with svnpubsub and perhaps gitpubsub as well."
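
Joe's step 2 above - compare the tarball's timestamp against the expanded
directory, nuke and re-expand when the tarball is newer, and reset the
timestamp on the base of the tree - could be sketched roughly like this.
This is only an illustrative sketch, not infra's actual hook code: the
function names and the one-entry-per-line manifest format are assumptions.

```python
import os
import shutil
import tarfile

def expand_if_newer(tarball, target_dir):
    """Re-expand a site tarball only when it is newer than the
    already-expanded directory (Joe's step 2, sketched)."""
    if os.path.isdir(target_dir) and \
            os.path.getmtime(target_dir) >= os.path.getmtime(tarball):
        return False  # directory is up to date, nothing to do
    if os.path.isdir(target_dir):
        shutil.rmtree(target_dir)      # "nukes the directory"
    with tarfile.open(tarball) as tf:
        tf.extractall(target_dir)      # "reexpands"
    # reset the timestamp on the base of the tree so the next run
    # sees the directory as up to date
    mtime = os.path.getmtime(tarball)
    os.utime(target_dir, (mtime, mtime))
    return True

def run_hook(manifest):
    """Step 1's listing file: one 'target-path source-tarball' pair per
    line (format assumed, similar in spirit to the CMS extpaths file)."""
    with open(manifest) as f:
        for line in f:
            if line.strip() and not line.startswith("#"):
                target, tarball = line.split()
                expand_if_newer(tarball, target)
```

The timestamp reset at the end is what makes the comparison idempotent: a
second run against the same tarball is a no-op, so the hook can safely
process every manifest entry on each commit.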


If somebody is interested in working on this, I can give pointers and work
with them: at the moment, I felt too alone on this topic to have the energy
to do anything...

Regards,

Hervé


[1] http://ci.apache.org/builders/maven-site-staging

> 
> 2015-04-30 21:16 GMT+02:00 Jason van Zyl :
> > My read is that this works much like GitHub Pages. You have a repository
> > with your source and the rendered pages go into a branch. But nothing
> > stops
> > you from having the source in one repo and just pushing the generated
> > content to another repo. This is currently how we do the M2Eclipse site.
> > We
> > have a Jekyll site in one repository, and we have a process that runs
> > Jekyll and produces the website which gets pushed. I think if we did the
> > same it would be cool because we could have the site on Github where we
> > can
> > use Jekyll, let people make pull requests and push it back here for
> > official publishing.
> > 
> > On Apr 30, 2015, at 2:55 PM, Michael Osipov  wrote:
> > > On 2015-04-30 at 00:31, Stephen Connolly wrote:
> > >> http://bit.ly/1QLwWGS
> > >> 
> > >> (Source: https://twitter.com/planetapache/status/593535338074611712)
> > > 
> > > Wouldn't that imply always cloning a private copy of the entire repo
> > > with autogenerated stuff just to push changes back to the canonical repo?
> > 
> > > Sounds like an absolute waste of time and resources to me.
> > > 
> > > Michael
> > > 
> > > 
> > > -
> > > To unsubscribe, e-mail: dev-unsubscr...@maven.apache.org
> > > For addit

Re: Do we want to go for gitpubsub?

2015-05-01 Thread Karl Heinz Marbaise

Hi,

On 5/1/15 3:14 PM, Andreas Gudian wrote:

Anyone who has ever done a release of a large plugin like surefire from a
European location knows what a pain it is to svn-commit a couple of
thousand files. It takes hours, but only if you're lucky and the process is
not interrupted by the server.


Yes, I've had the same experience here... ;-(



I had to dance around that by zipping up the locally generated site, scp it
to people.a.o and commit it from there.


I've done that several times and it took only a few minutes or less... but it
is not the solution.


>  Every time I did that, I thought

about how cool it would be to just push one git-commit - it would be done
in seconds.


Yeah...



So I'd be all in favour of that. Depends on how that would be structured,
though. Everything in one big repository? Or could it be sliced into parts
of the different areas of the sites, like main maven pages, the
single-module plugins, and dedicated repositories for the bigger plugins?



2015-04-30 21:16 GMT+02:00 Jason van Zyl :


My read is that this works much like GitHub Pages. You have a repository
with your source and the rendered pages go into a branch. But nothing stops
you from having the source in one repo and just pushing the generated
content to another repo. This is currently how we do the M2Eclipse site. We
have a Jekyll site in one repository, and we have a process that runs
Jekyll and produces the website which gets pushed. I think if we did the
same it would be cool because we could have the site on Github where we can
use Jekyll, let people make pull requests and push it back here for
official publishing.

On Apr 30, 2015, at 2:55 PM, Michael Osipov  wrote:


On 2015-04-30 at 00:31, Stephen Connolly wrote:

http://bit.ly/1QLwWGS

(Source: https://twitter.com/planetapache/status/593535338074611712)


Wouldn't that imply always cloning a private copy of the entire repo
with autogenerated stuff just to push changes back to the canonical repo?


Sounds like an absolute waste of time and resources to me.

Michael


-
To unsubscribe, e-mail: dev-unsubscr...@maven.apache.org
For additional commands, e-mail: dev-h...@maven.apache.org



Thanks,

Jason

--
Jason van Zyl
Founder, Takari and Apache Maven
http://twitter.com/jvanzyl
http://twitter.com/takari_io
-

A language that doesn’t affect the way you think about programming is not
worth knowing.

  -- Alan Perlis



-
To unsubscribe, e-mail: dev-unsubscr...@maven.apache.org
For additional commands, e-mail: dev-h...@maven.apache.org



Re: Do we want to go for gitpubsub?

2015-05-01 Thread Andreas Gudian
Anyone who has ever done a release of a large plugin like surefire from a
> European location knows what a pain it is to svn-commit a couple of
thousand files. It takes hours, but only if you're lucky and the process is
not interrupted by the server.
I had to dance around that by zipping up the locally generated site, scp it
to people.a.o and commit it from there. Every time I did that, I thought
about how cool it would be to just push one git-commit - it would be done
in seconds.

So I'd be all in favour of that. Depends on how that would be structured,
though. Everything in one big repository? Or could it be sliced into parts
of the different areas of the sites, like main maven pages, the
single-module plugins, and dedicated repositories for the bigger plugins?



2015-04-30 21:16 GMT+02:00 Jason van Zyl :

> My read is that this works much like GitHub Pages. You have a repository
> with your source and the rendered pages go into a branch. But nothing stops
> you from having the source in one repo and just pushing the generated
> content to another repo. This is currently how we do the M2Eclipse site. We
> have a Jekyll site in one repository, and we have a process that runs
> Jekyll and produces the website which gets pushed. I think if we did the
> same it would be cool because we could have the site on Github where we can
> use Jekyll, let people make pull requests and push it back here for
> official publishing.
>
> On Apr 30, 2015, at 2:55 PM, Michael Osipov  wrote:
>
> > On 2015-04-30 at 00:31, Stephen Connolly wrote:
> >> http://bit.ly/1QLwWGS
> >>
> >> (Source: https://twitter.com/planetapache/status/593535338074611712)
> >
> > Wouldn't that imply always cloning a private copy of the entire repo
> > with autogenerated stuff just to push changes back to the canonical repo?
> >
> > Sounds like an absolute waste of time and resources to me.
> >
> > Michael
> >
> >
> > -
> > To unsubscribe, e-mail: dev-unsubscr...@maven.apache.org
> > For additional commands, e-mail: dev-h...@maven.apache.org
> >
>
> Thanks,
>
> Jason
>
> --
> Jason van Zyl
> Founder, Takari and Apache Maven
> http://twitter.com/jvanzyl
> http://twitter.com/takari_io
> -
>
> A language that doesn’t affect the way you think about programming is not
> worth knowing.
>
>  -- Alan Perlis
>
> -
> To unsubscribe, e-mail: dev-unsubscr...@maven.apache.org
> For additional commands, e-mail: dev-h...@maven.apache.org
>
>


Re: Do we want to go for gitpubsub?

2015-04-30 Thread Jason van Zyl
My read is that this works much like GitHub Pages. You have a repository with
your source and the rendered pages go into a branch. But nothing stops you from 
having the source in one repo and just pushing the generated content to another 
repo. This is currently how we do the M2Eclipse site. We have a Jekyll site in 
one repository, and we have a process that runs Jekyll and produces the website 
which gets pushed. I think if we did the same it would be cool because we could 
have the site on Github where we can use Jekyll, let people make pull requests 
and push it back here for official publishing.

On Apr 30, 2015, at 2:55 PM, Michael Osipov  wrote:

> On 2015-04-30 at 00:31, Stephen Connolly wrote:
>> http://bit.ly/1QLwWGS
>> 
>> (Source: https://twitter.com/planetapache/status/593535338074611712)
> 
> Wouldn't that imply always cloning a private copy of the entire repo with
> autogenerated stuff just to push changes back to the canonical repo?
> 
> Sounds like an absolute waste of time and resources to me.
> 
> Michael
> 
> 
> -
> To unsubscribe, e-mail: dev-unsubscr...@maven.apache.org
> For additional commands, e-mail: dev-h...@maven.apache.org
> 

Thanks,

Jason

--
Jason van Zyl
Founder, Takari and Apache Maven
http://twitter.com/jvanzyl
http://twitter.com/takari_io
-

A language that doesn’t affect the way you think about programming is not worth 
knowing. 
 
 -- Alan Perlis


-
To unsubscribe, e-mail: dev-unsubscr...@maven.apache.org
For additional commands, e-mail: dev-h...@maven.apache.org



Re: Do we want to go for gitpubsub?

2015-04-30 Thread Michael Osipov

On 2015-04-30 at 00:31, Stephen Connolly wrote:

http://bit.ly/1QLwWGS

(Source: https://twitter.com/planetapache/status/593535338074611712)


Wouldn't that imply always cloning a private copy of the entire repo
with autogenerated stuff just to push changes back to the canonical repo?


Sounds like an absolute waste of time and resources to me.

Michael


-
To unsubscribe, e-mail: dev-unsubscr...@maven.apache.org
For additional commands, e-mail: dev-h...@maven.apache.org



Do we want to go for gitpubsub?

2015-04-29 Thread Stephen Connolly
http://bit.ly/1QLwWGS

(Source: https://twitter.com/planetapache/status/593535338074611712)


Sent from my iPad

-
To unsubscribe, e-mail: dev-unsubscr...@maven.apache.org
For additional commands, e-mail: dev-h...@maven.apache.org