I was hoping that someone could either show me best practice or just comment on 
the correctness of my approach.

We have multiple projects which run on our app server (Tomcat). Therefore, we 
want our projects to share as much as possible when it comes to provided 
artifacts in order to reduce our war file sizes - so every project relies on 
the same version of Guava, for example, which is defined in the dependency 
management section of our common-parent pom.

We have configured our app servers to include a second lib folder (call it 
"common-lib") - this is where we want to put all of our projects' provided 
artifacts (after wiping out the files in the folder - hence the separate 
"common-lib" folder). Tomcat is set up to include this folder in its classpath.

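(For reference, the usual way to do this in Tomcat is to point the shared 
class loader at the folder in conf/catalina.properties, e.g. 
shared.loader=${catalina.base}/common-lib/*.jar, though the exact property 
and path depend on the Tomcat version and layout.)
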
For example, our nightly build trigger would rm -f all of the jar files in the 
common-lib folder, then Project1 would build and push all of its provided jars 
up to common-lib, followed by Project2, and so on. At the end of the build you 
have a completely refreshed "common-lib" folder with all of the provided jars 
for each project. (Obviously, if this were the "live" common-lib this might be 
an issue, so we actually deploy to a "staging" area - then, in our Tomcat 
startup script, we replace the "live" common-lib files with those from the 
"staging" area.)

Obviously, if we do not keep a tight rein on dependency versions in project 
poms this could turn into a nightmare -- but let's assume we can do that.

So here is the approach I am taking:
1. I use the maven-dependency-plugin to copy provided jars into 
"${project.build.directory}/provided" - I have tied this to 
<phase>package</phase> (see the first sketch below).

2. I use the maven-antrun-plugin to scp all files in the 
"${project.build.directory}/provided" folder up to our "common-lib" folder on 
our app server - I have tied this to <phase>deploy</phase> (see the second 
sketch below).
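
For step 1, the relevant configuration looks roughly like this (a trimmed-down 
sketch - the plugin version and execution id are just illustrative):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <version>2.8</version>
  <executions>
    <execution>
      <id>copy-provided-deps</id>
      <phase>package</phase>
      <goals>
        <goal>copy-dependencies</goal>
      </goals>
      <configuration>
        <!-- restrict the copy to provided-scope dependencies only -->
        <includeScope>provided</includeScope>
        <outputDirectory>${project.build.directory}/provided</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>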
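
And step 2 looks roughly like this (again just a sketch - the ${deploy.user} 
property, host, remote path, keyfile, and the ant-jsch/jsch versions are 
placeholders; the Ant scp task is an optional task, so it needs those two 
dependencies added to the plugin):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <version>1.7</version>
  <executions>
    <execution>
      <id>push-provided-jars</id>
      <phase>deploy</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <!-- copy everything the dependency plugin staged in step 1 -->
          <scp todir="${deploy.user}@appserver:/opt/tomcat/common-lib-staging"
               keyfile="${user.home}/.ssh/id_rsa"
               trust="true">
            <fileset dir="${project.build.directory}/provided">
              <include name="*.jar"/>
            </fileset>
          </scp>
        </target>
      </configuration>
    </execution>
  </executions>
  <dependencies>
    <!-- the Ant scp task lives in the optional ant-jsch module -->
    <dependency>
      <groupId>org.apache.ant</groupId>
      <artifactId>ant-jsch</artifactId>
      <version>1.9.3</version>
    </dependency>
    <dependency>
      <groupId>com.jcraft</groupId>
      <artifactId>jsch</artifactId>
      <version>0.1.51</version>
    </dependency>
  </dependencies>
</plugin>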


I am afraid that maybe our horrible past practices have blinded me to the 
Maven best practice, so I am looking for a reality check. To me, this seems 
like something that people would need to do all the time - but I can't seem to 
find anything that specifically relates to this approach.


Side Note: As I was writing this I kept thinking "why not just deploy our wars 
with everything that they need? That would be much cleaner and more 
reproducible, and we shouldn't care about artifact (war) file size if this 
happens in the middle of the night." Here is what I came up with -- we have one 
jar file that we build which contains all of our Hibernate code; it also 
creates all of our shared connections for all of our contexts (these are 
shared session factories). This single artifact *must* be shared - we simply 
cannot allow each war file to include its own copy, due to the singleton-style 
bootstrap of the connections.

Is it best practice for a war to deploy with all of its dependencies?


Thanks in advance,
scott
