On 30/08/2011, at 7:12 AM, Dan Armbrust wrote:

> As time goes on, the files I'm trying to upload to archiva only get bigger.
> 
> I seem to have reached a point (1.1 GB) where nothing works.  The
> HTTP upload has quit working for me (though, I suppose it is possible
> I hit that hard-coded limit again; I'm not sure what the sysadmin set
> it to when I had them change it)... and WebDAV doesn't really work at
> all for anything beyond small files.
> 
> It would appear I'm going to have to try to trick maven into hosting
> the file.... I was thinking something along the lines of:
> 
> 1) Upload a tiny file using the correct name / path / version etc.
> 
> 2) find the file on the host OS and manually replace the file on disk
> with the correct file.
> 
> 3) delete all of the generated checksum files
> 
> 4) ??
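> 
> If I do go the manual route, step 3 can presumably be replaced by regenerating
> the checksums rather than deleting them. A minimal sketch of that (the
> repository root and artifact coordinates below are made-up placeholders, not
> real paths on my server):
> 
> ```shell
> #!/bin/sh
> # Sketch: place an artifact by hand and regenerate its Maven-style
> # sidecar checksum files. REPO and the artifact path are hypothetical.
> set -e
> REPO="$(mktemp -d)"   # stand-in for the managed repository root
> ARTIFACT="$REPO/com/example/big/1.0/big-1.0.jar"
> mkdir -p "$(dirname "$ARTIFACT")"
> echo "placeholder payload" > "$ARTIFACT"
> 
> # Maven expects .sha1 and .md5 files containing just the hex digest.
> sha1sum "$ARTIFACT" | awk '{print $1}' > "$ARTIFACT.sha1"
> md5sum  "$ARTIFACT" | awk '{print $1}' > "$ARTIFACT.md5"
> ls "$(dirname "$ARTIFACT")"
> ```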

There's no need to do all that - if you can get the file into the right 
location on the host OS then Archiva will find it on its next scan, and it can 
be configured to repair the metadata. But the best approach is probably to find 
a way that Maven can still do the deployment itself:
1) run the build on the repository server
2) deploy to a filesystem mounted from the server
3) deploy over scp/sftp
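
For (3), a single pre-built file can be pushed with the deploy plugin's 
deploy-file goal - a sketch only, with placeholder coordinates and server name, 
and assuming an ssh-capable wagon is available to the build:

```shell
mvn deploy:deploy-file \
  -Dfile=big-artifact-1.1.0.jar \
  -DgroupId=com.example \
  -DartifactId=big-artifact \
  -Dversion=1.1.0 \
  -Dpackaging=jar \
  -DrepositoryId=internal \
  -Durl=scp://repo.example.org/path/to/repositories/internal
```

The repositoryId must match a server entry in settings.xml that carries the 
ssh credentials.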

I still believe the HTTP upload should work, but as you've said, it's hard to 
diagnose this way - and at that size you definitely want to be deploying from a 
network location very close to the server anyway.

- Brett

--
Brett Porter
br...@apache.org
http://brettporter.wordpress.com/
http://au.linkedin.com/in/brettporter