Re: AW: Is it possible to search maven site?

2017-07-14 Thread Jim Klo
If you're planning on using Google to index and search the maven generated 
site, you should be using a sitemap.xml for the best performance and most 
control. See https://support.google.com/webmasters/answer/156184?hl=en for 
details.

The robots.txt is honored, but really doesn't provide any control over 
indexing. Also if you want to be indexed on a regular schedule you'll need to 
register your sitemap.xml with Google using the Webmaster Tools.
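As a sketch of how the two interact (host and paths are placeholders, not from this thread), a robots.txt can both restrict crawling and advertise the sitemap:

```text
# robots.txt at the site root. Allow/Disallow control crawling only,
# not whether already-discovered URLs end up in the index.
User-agent: *
Disallow: /apidocs/

# Advertise the sitemap so crawlers can discover it even without
# registering it in Webmaster Tools.
Sitemap: https://example.org/sitemap.xml
```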

Jim Klo
Senior Software Engineer
SRI International
t: @nsomnac

On Jul 14, 2017, at 12:43 AM, "g.h...@aurenz.de" <g.h...@aurenz.de> wrote:

Hello Guang,

as far as I know you need a robots.txt only if you want to control the behavior
of web search engines.
Otherwise your website will just be indexed by the search engine's algorithm.
Another point is the <head> tag and the information there.
I know that there are some tags which help you control the behavior of web
search engines, too.
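For illustration (not from the original mail), the kind of head tags meant here are the standard robots meta tags; placement and values below are examples:

```html
<head>
  <!-- ask engines not to index this page and not to follow its links -->
  <meta name="robots" content="noindex, nofollow">
  <!-- or: allow indexing but suppress cached copies -->
  <meta name="robots" content="index, noarchive">
</head>
```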

One of my projects has the Google search included:
http://javadpkg.sourceforge.net/
So you can try it out for yourself.
Maybe I should add that the project has neither a robots.txt nor any search
engine specific tags in the <head> of its pages.

Regards,
Gerrit
-Ursprüngliche Nachricht-
Von: Guang Chao [mailto:guang.chao.1...@gmail.com]
Gesendet: Freitag, 14. Juli 2017 07:02
An: Maven Users List
Betreff: Re: Is it possible to search maven site?

On Thu, Jul 13, 2017 at 8:29 PM, <g.h...@aurenz.de> wrote:

Hello Bruce,

depends on what you mean. You can use the Google search:
https://maven.apache.org/skins/maven-fluido-skin/#GoogleSearch
Of course this only works if your Maven site is accessible from the
Internet.


I think some parts of the website are not searchable because of robots.txt



So far I haven't seen any plugin or whatsoever which enables some kind
of offline search.

Regards,
Gerrit
-Ursprüngliche Nachricht-
Von: Bruce Wen [mailto:crest@gmail.com]
Gesendet: Donnerstag, 13. Juli 2017 13:53
An: users@maven.apache.org
Betreff: Is it possible to search maven site?

Hi All,

Is there any existing solution to search maven site? Any maven plugin
to do that?

Bruce Wen (GuangMing)

-
To unsubscribe, e-mail: users-unsubscr...@maven.apache.org
For additional commands, e-mail: users-h...@maven.apache.org




--
Guang <http://javadevnotes.com/java-left-pad-string-with-spaces-examples/>


Re: Maven password encryption by project

2017-03-17 Thread Jim Klo
As others have mentioned, you shouldn’t be storing passwords in a POM.

I don't have a great corporate solution for secrets management in Maven either.

My solution has been to use environment variables - which basically follows the
same pattern that AWS, Docker, Vagrant and others utilize.
The pattern requires you to define a convention so that your secret is set into
a predictable environment variable, which you can then use within your POM,
script, etc. while the job is running.

As for local users - generally this can be a one-time setup; from there you
don't need to monkey with credentials again until you have to cycle them. You
shouldn't need to share any credentials.

This works well because unless you echo the environment, secret info isn’t 
echoed into the logs. And it can be utilized for non-secret information as 
well. 
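A minimal sketch of that convention, assuming the CI job or local shell has exported a variable named DOCKER_PASSWORD (the name is illustrative, not from the thread):

```xml
<!-- pom.xml fragment: resolve the secret from the environment at build time.
     Maven's env.* prefix reads process environment variables, so nothing
     secret is stored in the POM itself. -->
<properties>
  <docker.password>${env.DOCKER_PASSWORD}</docker.password>
</properties>
```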


> On Mar 17, 2017, at 6:38 AM, Alix Lourme  wrote:
> 
> I'm searching for the best practice for password encryption in a Maven POM file
> *by project*, which could be used via properties (like in ANT or WAGON). Sample:
> ---
> 
>    <plugin>
>        <artifactId>maven-antrun-plugin</artifactId>
>        <version>1.8</version>
>        <configuration>
>            <target>
>                <scp file="..." password="${docker.password}"
>                     todir="cert" trust="yes" />
>            </target>
>        </configuration>
>    </plugin>
> ---
> 
> In this case, my *docker.password* could be a properties (pom or
> settings.xml) but must not be in clear text.
> 

Is there a reason you're not using an identity file instead for this situation?
It would likely work better, and you could pass the identity file as a secret 
file, from a separate system, repository, or local configuration for running 
the build.


> The problem with Maven encryption
> :
> - I have a master password defined in *settings-security.xml* (locally) for
> my user need (like proxy password encryption in MY *settings.xml*)
> - The CI tools contains the same mechanism (own *settings-security.xml*)
> for global needs, like server encryption used in *settings.xml* for jar
> publication in repository ; and I can't retrieve this file
> 

AFAIK Maven encryption only applies to <server> elements. Others can chime in,
but I'm not sure that this would solve your specific problem anyway. Does your CI
solution have some kind of mechanism for retrieving or providing secret
information/files?  This seems to be the root of your problem.


> => I can't use this mechanism for password encryption that works both locally
> and on the CI server.
> 
> *Is there a way to have an encryption mechanism for the project's perimeter?*
> (and not for the user's perimeter; current Maven encryption works perfectly
> for that).
> 

Environment variables can solve that, but I'm not sure why you would want
project-level vs user-level credentials.

> 
> Using -s and -gs Maven options (=> user/global settings override) could be
> a workaround but :
> - Server item definition or properties defining password must be in clear
> text
> - Using these Maven settings for each build depending on the project workspace
> is a little boring

Why do you want something not boring? "Not boring" usually means something that
should always work doesn't….

CI systems usually invoke with -s and -gs anyways, so I’m not sure what the big 
deal is.

The way I’ve handled this is defaults use the ~/.m2/settings.xml and the CI 
utilizes a -s flag with a provided file.
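A hedged sketch of such a CI-provided settings file (the server id, filename, and variable names are assumptions, not from this thread):

```xml
<!-- ci-settings.xml, passed to the build as: mvn -s ci-settings.xml deploy
     Credentials come from the environment, so the file itself holds no
     clear-text secrets. -->
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
  <servers>
    <server>
      <!-- must match the <repository> id used for deployment in the POM -->
      <id>artifactory</id>
      <username>${env.CI_DEPLOY_USER}</username>
      <password>${env.CI_DEPLOY_PASSWORD}</password>
    </server>
  </servers>
</settings>
```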

> 
> Perhaps is there a best way like a "private key by project" ... but I
> didn't found entry point about that.

I’m still not entirely sure what you mean by “per project”.  Do you mean “per
module”?
If you’re having to have multiple credentials for a single project/reactor
build, it’s possible that your problem is that your CI job is not granular enough.


> Thanks in advance. Best regards
> *NB*: This question was first posted on stackoverflow
> ,
> but got no real interest ^^.

SO question doesn’t exist - that might be why there’s no interest?





smime.p7s
Description: S/MIME cryptographic signature


Re: Overriding a profile in a child POM

2017-02-22 Thread Jim Klo
I’m not sure if it’s the correct way to do it, but I’ve done this using negated
activation with properties.

parent pom:


<profile>
    <id>build-all-my-docs</id>
    <activation>
        <property>
            <name>build-all-docs</name>
        </property>
    </activation>
</profile>

child pom:

<profile>
    <id>build-dev-docs</id>
    <activation>
        <property>
            <name>!build-all-docs</name>
        </property>
    </activation>
</profile>

then:
mvn -Dbuild-all-docs=true help:active-profiles


will show you that the parent profile is active and the child profile is
inactive,

and:
mvn  help:active-profiles

should show the inverse.



Jim Klo
Senior Software Engineer
Center for Software Engineering
SRI International
t.  @nsomnac

> On Feb 22, 2017, at 11:14 AM, Shahim Essaid <sha...@essaid.com> wrote:
> 
> Hi,
> 
> I tested the possibility of overriding a profile in a child POM by using
> the same profile id but it doesn't work. The profiles get merged.
> 
> Is there a simple way to fully override a profile in a child POM?
> 
> Merging (i.e. composing) profiles is very useful but I can get this
> behavior by activating different profiles (with different ids) on the same
> property. It looks like creating a child profile with the same id has
> exactly the same behavior.
> 
> I'm sure this behavior is something that can't be change at this point but
> it might be useful to be able to override a profile by id to give some
> additional options for how to design a build through inheritance. Having to
> override individual plugins and executions inherited from a parent profile
> can get a little tedious.
> 
> I don't have a specific use-case in mind. I was just trying to understand
> all my options before thinking about how to compose and override my build
> behaviors.
> 
> Best,
> Shahim





Re: building platform specific eclipse RCP plugins

2016-10-13 Thread Jim Klo
I’m not sure I fully follow what’s going on; I’m also not sure that what you want
to do can be easily done in a single reactor build.  It can possibly be done,
but you’d need to break things into a lot of modules.

My first question is: are you building features that are OS specific or
agnostic?  For your plugin modules that are OS specific, are they essentially
the same plugin with just native bits? i.e.
Feature ABC (win32 & linux)
- Plugin 123 (win32)
- Plugin 123 (linux)

or is it:
Feature ABC
- Plugin 123 (win32)
- Plugin 456 (linux)


I’m also not quite sure what you mean by OS flag… need more info, as it’s
unclear what property you expect maven to respond to.

FWIW: I’ve got an RCP project that I have targets of Windows, OS X, and Linux; 
which contains a mix of native and java code.

Our project is rather large, though… it takes 3 hours to build the native
artifacts alone. The process I follow is:

1. Reactor build for each of the native pieces, and then convert these to an 
artifact that can be installed.
2. Reactor build for the RCP app which depends upon the native artifacts of 
which I use the maven-dependency-plugin:unpack during the prepare-package phase 
to unpack the dependencies into the right location within the plugin.
- each module handles all os platforms as needed - usually this is only 
needed in a packaging step; unless you have tests in platform independent code 
that depend upon the native bits.
- ultimately what this means is that the director plugin is going to
create multiple output folders for each architecture; hence you need to know
which native dependency needs to be placed where.
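A rough sketch of the unpack wiring described in step 2 (all coordinates, versions, types, and paths below are placeholders, not the actual project's):

```xml
<!-- pom.xml fragment: unpack a previously-deployed native artifact into the
     plugin before packaging; bound to prepare-package as described above -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>unpack-native-bits</id>
      <phase>prepare-package</phase>
      <goals>
        <goal>unpack</goal>
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <groupId>com.example</groupId>
            <artifactId>native-win32</artifactId>
            <version>1.0.0</version>
            <type>tar.bz2</type>
            <!-- place the native bits where the plugin packaging expects them -->
            <outputDirectory>${project.build.directory}/native/win32</outputDirectory>
          </artifactItem>
        </artifactItems>
      </configuration>
    </execution>
  </executions>
</plugin>
```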

FWIW, I’ve never had any luck building a single OS target at a time, mostly
because the dependencies of the product include all OSes (and there’s no easy
way to filter the .product file).

The pattern that mine follows would be more like:

Feature ABC
- Plugin 123 (osx, win32, linux)
- Plugin 456 (osx)
- Plugin 789 (platform independent)


Also you might try the Tycho list (tycho-u...@eclipse.org), as Tycho does not
exactly behave like normal Maven in that its dependencies are Manifest First vs
POM First.  If you don’t have a good understanding of how that all works,
you’ll have a really hard time getting a multi-platform build with native
elements working right.

- JK


> On Oct 13, 2016, at 4:18 AM, Rene Tassy  wrote:
> 
> Hi
>  
> I am currently developing an eclipse RCP application and one of my features 
> includes a few platform specific plugins.
> In other words some of those plugins have o/s flag equal to “win32” and some 
> others “linux”.
>  
> Having this flag breaks my maven build because when building the feature on 
> windows it cannot see the linux plugins and vice-versa.
> Is there something I can specify in my pom.xml so that the RCP o/s flag is 
> picked by maven ?
>  
> Here is what my pom.xml looks like:
>  
> 
> <project xmlns="http://maven.apache.org/POM/4.0.0"
>          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
>          xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
>     <modelVersion>4.0.0</modelVersion>
>     <groupId>group_id</groupId>
>     <artifactId>artifact_id</artifactId>
>     <version>0.0.1-SNAPSHOT</version>
>     <packaging>pom</packaging>
>
>     <properties>
>         <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
>     </properties>
>
>     <repositories>
>         <repository>
>             <id>mars</id>
>             <layout>p2</layout>
>             <url>http://download.eclipse.org/releases/mars</url>
>         </repository>
>         <repository>
>             <id>eclipse-cdt</id>
>             <url>http://download.eclipse.org/tools/cdt/builds/luna/milestones/</url>
>             <layout>p2</layout>
>         </repository>
>         <repository>
>             <id>eclipse-pydev</id>
>             <url>https://dl.bintray.com/fabioz/pydev/4.5.0</url>
>             <layout>p2</layout>
>         </repository>
>     </repositories>
>
>     <modules>
>         <module>feature1</module>
>         <!-- plugin1 of feature1 with windows o/s flag -->
>         <module>plugin1</module>
>     </modules>

Re: If a repository has no metadata

2016-07-27 Thread Jim Klo
Depends upon how you are using the repo.

AFAIK, maven won't be able to determine the latest Snapshot or Release version 
of an artifact as that's stored in the metadata.
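For context, the metadata being discussed is the maven-metadata.xml that sits at the groupId/artifactId level of the repository path; a minimal sketch of what resolvers expect (coordinates and versions are illustrative):

```xml
<!-- maven-metadata.xml alongside the version directories. Without it,
     LATEST/RELEASE and snapshot-version resolution have nothing to read. -->
<metadata>
  <groupId>com.example</groupId>
  <artifactId>widget</artifactId>
  <versioning>
    <latest>1.2.0-SNAPSHOT</latest>
    <release>1.1.0</release>
    <versions>
      <version>1.1.0</version>
      <version>1.2.0-SNAPSHOT</version>
    </versions>
    <lastUpdated>20160727000000</lastUpdated>
  </versioning>
</metadata>
```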

Jim Klo
Senior Software Engineer
SRI International
t: @nsomnac

On Jul 27, 2016, at 6:42 AM, Benson Margulies <bimargul...@gmail.com> wrote:

We're considering a situation where we want to push artifacts to S3
following the repository pathname convention, but not bother to write
the metadata. In the current state of the universe, what will happen
when something goes to read?




maven-assembly-plugin mysterious error

2016-05-06 Thread Jim Klo
Hi,

I’m doing some refactoring of some existing projects - mostly we are moving 
from SVN to GIT and making a larger project more modular.  One piece I’m 
working on packages our native code into an assembly to be posted into our 
Artifactory so other projects can depend upon those native bits without having 
to build them.

So at this point… all I’ve done is taken the existing code and shuffled 
directories around, making them shallower, and then fixed up various scripts 
and paths to reference the new paths.  This all seems to work; however, when I
go to build these assemblies, I receive a NullPointerException referencing
TarArchiver.cleanup() line 494, with no other hint as to what might be
wrong.

Can anyone provide any insight as to what might be going wrong, how I might 
better debug, etc?

The full execution log + error is below my sig.  

Thanks,

- JK


Jim Klo
Senior Software Engineer
Center for Software Engineering
SRI International
t.  @nsomnac

pom.xml
===


<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.sri</groupId>
  <artifactId>xsb</artifactId>
  <version>0.0.8349-SNAPSHOT</version>
  <packaging>pom</packaging>

  <properties>
    <artifact.basedir>${project.basedir}/../../build</artifact.basedir>
  </properties>

  <build>
    <plugins>
      <plugin>
        <artifactId>maven-assembly-plugin</artifactId>
        <version>2.5.3</version>
        <configuration>
          <descriptors>
            <descriptor>cocoa-x64.xml</descriptor>
          </descriptors>
          <attach>true</attach>
        </configuration>
        <executions>
          <execution>
            <id>make-assembly</id>
            <phase>package</phase>
            <goals>
              <goal>single</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>



Descriptor for cocoa-x64.xml
==
<assembly xmlns="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.2"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.2 http://maven.apache.org/xsd/assembly-1.1.2.xsd">
  <id>cocoa-x64</id>
  <formats>
    <format>tar.bz2</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <fileSets>
    <fileSet>
      <directory>${project.properties.artifact.basedir}/trunk.osx/flserver-build/ext/XSB</directory>
      <outputDirectory>${file.separator}XSB</outputDirectory>
    </fileSet>
  </fileSets>
</assembly>


Log
===
xsb jklo$ mvn -e -X package
Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 
2015-11-10T08:41:47-08:00)
Maven home: /usr/local/Cellar/maven/3.3.9/libexec
Java version: 1.8.0_45, vendor: Oracle Corporation
Java home: /Library/Java/JavaVirtualMachines/jdk1.8.0_45.jdk/Contents/Home/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "mac os x", version: "10.10.5", arch: "x86_64", family: "mac"
[DEBUG] Created new class realm maven.api
[DEBUG] Importing foreign packages into class realm maven.api
[DEBUG]   Imported: javax.enterprise.inject.* < plexus.core
[DEBUG]   Imported: javax.enterprise.util.* < plexus.core
[DEBUG]   Imported: javax.inject.* < plexus.core
[DEBUG]   Imported: org.apache.maven.* < plexus.core
[DEBUG]   Imported: org.apache.maven.artifact < plexus.core
[DEBUG]   Imported: org.apache.maven.classrealm < plexus.core
[DEBUG]   Imported: org.apache.maven.cli < plexus.core
[DEBUG]   Imported: org.apache.maven.configuration < plexus.core
[DEBUG]   Imported: org.apache.maven.exception < plexus.core
[DEBUG]   Imported: org.apache.maven.execution < plexus.core
[DEBUG]   Imported: org.apache.maven.execution.scope < plexus.core
[DEBUG]   Imported: org.apache.maven.lifecycle < plexus.core
[DEBUG]   Imported: org.apache.maven.model < plexus.core
[DEBUG]   Imported: org.apache.maven.monitor < plexus.core
[DEBUG]   Imported: org.apache.maven.plugin < plexus.core
[DEBUG]   Imported: org.apache.maven.profiles < plexus.core
[DEBUG]   Imported: org.apache.maven.project < plexus.core
[DEBUG]   Imported: org.apache.maven.reporting < plexus.core
[DEBUG]   Imported: org.apache.maven.repository < plexus.core
[DEBUG]   Imported: org.apache.maven.rtinfo < plexus.core
[DEBUG]   Imported: org.apache.maven.settings < plexus.core
[DEBUG]   Imported: org.apache.maven.toolchain < plexus.core
[DEBUG]   Imported: org.apache.maven.usability < plexus.core
[DEBUG]   Imported: org.apache.maven.wagon.* < plexus.core
[DEBUG]   Imported: org.apache.maven.wagon.authentication < plexus.core
[DEBUG]   Imported: org.apache.maven.wagon.authorization < plexus.core
[DEBUG]   Imported: org.apache.maven.wagon.events < plexus.core
[DEBUG]   Imported: org.apache.maven.wagon.observers < plexus.core
[DEBUG]   Imported: org.apache.maven.wagon.proxy < plexus.core
[DEBUG]   Imported: org.apache.maven.wagon.repository < plexus.core
[DEBUG]   Imported: org.apache.maven.wagon.resource < plexus.core
[DEBUG]   Imported: org.codehaus.classworlds < plexus.core
[DEBUG]   Imported: org.codehaus.plexus.* < plexus.core
[DEBUG]   Imported: org.codehaus.plexus.classworlds

Re: License Auditing

2015-10-06 Thread Jim Klo
Thanks,

We’re an Artifactory shop - so no Nexus - however Artifactory Pro has a
comparable feature.  The issue, though, is that the license management I
need is more granular than either solution offers. From what I can tell from
their documentation, Nexus and Artifactory both manage licensing at the 
artifact level (source and binaries) not really the individual file level. Both 
seem to rely upon a license declaration to be present for each module and not 
necessarily look at the individual license headers within source files.

Jim Klo
Senior Software Engineer
Center for Software Engineering
SRI International
t.  @nsomnac

> On Oct 6, 2015, at 7:09 AM, kemet.ctr.uh...@faa.gov wrote:
> 
> Hello Mark,
> 
> Nexus Pro Plus has that feature:  
> http://www.sonatype.com/nexus/product-overview/nexus-pro-plus
> 
> Best Regards,
> 
> G Kemet Uhuru, PMP(r), SSGBP
> SWIM COTSWG/SFDPS SW CM Lead
> Communications, Information & Network Programs, Enterprise Product Support, 
> AJM-3122
> Engility Corporation-Engineering & Program Support Services
> William J. Hughes Technical Center
> Building 316 Second Floor Cubicle 2N131 (E-13)
> Atlantic City International Airport, NJ 08405
> Office Phone: 609-485-6154
> Cell Phone: 609-254-6876
> 
> 
> 
> How many times per day do you say "like or you know" ?
> 
> http://sixminutes.dlugan.com/stop-um-uh-filler-words/
> 
> Don't make excusesMake Time!






Re: License Auditing

2015-10-05 Thread Jim Klo
Thanks Curtis,

I believe you’re correct - there really is no perfect solution for doing
continuous license management using Maven, beyond some really basic stuff.
Almost all of what exists in Maven land seems to deal only with homogeneous
licensing of a module and management of module dependencies, which is likely a
good 80% of what most need to manage. However there is a growing pattern of
pulling small pieces of code from disparate sources (especially so at a
research institution such as where I am) - hence there is a need to dig down
into each file and manage each file separately - but at the same time, you need
to have a global view across the enterprise.

So far I’ve found FOSSology [1], which centralizes about 80% of the work I
need; however there’s really no direct Maven integration other than via exec.
I could see possibly using a combination of approaches:
1) RAT or maven-license-plugin to identify files that are missing licenses
2) FOSSology to generate reports and manage exceptions

However there’s no real middle ground between the two, in that I really need the
DB from FOSSology to influence RAT or the maven-license-plugin. Maybe the ideal
thing is to figure out a way to build a Maven plugin for FOSSology…

- JK

[1] http://www.fossology.org/projects/fossology

> On Oct 5, 2015, at 2:26 PM, Curtis Rueden <ctrue...@wisc.edu> wrote:
> 
> Hi Jim,
> 
> I struggled with licensing-related tooling too when I researched it awhile
> back—and my needs were simpler than yours. We ended up using
> license-maven-plugin to programmatically manage license headers of all our
> sources, with a single header with unified copyright date range and
> contributors list, which made things much easier. It sounds like your
> licensing situation is substantially more heterogeneous.
> 
> I do not know of any excellent licensing-related tutorials for license
> management, auditing or both. Maybe you could take the bull by the horns
> and write a guide somewhere? It would surely be of great benefit to the
> Maven community.
> 
> Regards,
> Curtis
> 
> On Mon, Sep 28, 2015 at 11:13 AM, Jim Klo <jim@sri.com> wrote:
> 
>> Hi,
>> 
>> Looking for some guidance on doing some source license auditing.  My needs
>> are two fold.  I need to track down all the licenses of all our
>> dependencies, which there seems to be an abundance of plugins. But I also
>> need to audit the licenses of our committed source, as many come from open
>> and non-open projects, I need to track the individual files as well.
>> 
>> I’ve started by using Apache RAT [1], which seems to be okay for auditing
>> the source, but given that we have a significant number of modules,
>> configuration of RAT is somewhat a pain (I have a bunch of custom license
>> definitions and matchers) which seem to have to be added to every POM file
>> (doesn’t like going into the parent POM likely because of the way we are
>> using Tycho).
>> 
>> Can anyone recommend a plugin that might be better for my use case?  I’d
>> like to be able to have a single config file (or artifact) that contains
>> the license declarations, and then be able to reference that from all my
>> modules.  The Codehaus License Maven Plugin [2] seems close to what I want,
>> but I can’t seem to figure out how to get it to show me files that are
>> missing license headers or even show me a per file license summary.  If
>> anyone can point me to some examples or tutorials that explain this that
>> would be much appreciated.
>> 
>> [1]
>> http://creadur.apache.org/rat/apache-rat-plugin/examples/custom-license.html
>> [2]
>> http://www.mojohaus.org/license-maven-plugin/examples/example-thirdparty.html
>> 
>> Thanks,
>> 
>> JK
>> 
>> *Jim KloSenior Software EngineerCenter for Software EngineeringSRI
>> International*
>> *t. @nsomnac*
>> 
>> 





License Auditing

2015-09-28 Thread Jim Klo
Hi,

Looking for some guidance on doing some source license auditing.  My needs are 
two fold.  I need to track down all the licenses of all our dependencies, which 
there seems to be an abundance of plugins. But I also need to audit the 
licenses of our committed source, as many come from open and non-open projects, 
I need to track the individual files as well.

I’ve started by using Apache RAT [1], which seems to be okay for auditing the 
source, but given that we have a significant number of modules, configuration 
of RAT is somewhat a pain (I have a bunch of custom license definitions and 
matchers) which seem to have to be added to every POM file (they don’t seem to
work from the parent POM, likely because of the way we are using Tycho).

Can anyone recommend a plugin that might be better for my use case?  I’d like 
to be able to have a single config file (or artifact) that contains the license 
declarations, and then be able to reference that from all my modules.  The 
Codehaus License Maven Plugin [2] seems close to what I want, but I can’t seem 
to figure out how to get it to show me files that are missing license headers 
or even show me a per file license summary.  If anyone can point me to some 
examples or tutorials that explain this that would be much appreciated.
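One way to avoid repeating the RAT configuration in every module, sketched under the assumption that ordinary pluginManagement inheritance works in this Tycho setup (version and excludes below are illustrative):

```xml
<!-- parent pom.xml: centralize the apache-rat-plugin configuration once;
     child POMs then only need to reference the plugin by artifactId -->
<pluginManagement>
  <plugins>
    <plugin>
      <groupId>org.apache.rat</groupId>
      <artifactId>apache-rat-plugin</artifactId>
      <version>0.11</version>
      <configuration>
        <!-- shared excludes; custom license matchers would also go here -->
        <excludes>
          <exclude>**/target/**</exclude>
        </excludes>
      </configuration>
    </plugin>
  </plugins>
</pluginManagement>
```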

[1] http://creadur.apache.org/rat/apache-rat-plugin/examples/custom-license.html
[2] http://www.mojohaus.org/license-maven-plugin/examples/example-thirdparty.html

Thanks,

JK

Jim Klo
Senior Software Engineer
Center for Software Engineering
SRI International
t.  @nsomnac





Re: AW: Help needed with a strange fixed filename

2015-06-09 Thread Jim Klo
Maybe I'm missing something, but why not add an Ant task or assembly plugin
during the pre-verify phase (or other appropriate phase) to rename the artifact
so that it has the correct name at runtime for future phases?

I do a similar trick to get a pom first dependency to play nice as a manifest 
first dependency in a tycho build in a single reactor build. In my case the jar 
file needs to be a specific name so the manifest build can locate the right 
dependency.

Jim Klo
Senior Software Engineer
SRI International
t: @nsomnac

On Jun 8, 2015, at 10:51 PM, Thomas Klöber
<thomas.kloe...@securintegration.com> wrote:

Hi Ron,

I might not have explained it right: jarfile3.jar gets turned into
jarfile3-x.x.x.jar due to the version number I have to supply when creating the
artefact in Nexus.

I agree it would be easier to either get rid of the version number at build 
time or at least change the naming to jarfile-3.jar.
But unfortunately the vendor refuses to change that.

Hi Curtis,

I fully agree that this is a terrible way of programming. But I asked the
vendor why they check the file name, and they say that some other apps would
fail if they didn't have a fixed jarfile name. Escapes me why, but again they
refuse to change that...

Bytecode patching is a no go here :)

Thanks for all your suggestions...


-Ursprüngliche Nachricht-
Von: Ron Wheeler [mailto:rwhee...@artifact-software.com]
Gesendet: Montag, 8. Juni 2015 19:03
An: users@maven.apache.org
Betreff: Re: Help needed with a strange fixed filename

Can you explain how  jarfile3.jar gets turned into jarfile-3.x.x?

Lots of jar file names have numbers as the last character without that
character getting turned into a version in Nexus.

I can see how it would get loaded into Nexus as jarfile3-1.0.0 but not
jarfile-3.1.0.0

Getting rid of the version number at the end of the file name at build
time is an easier task that changing the name.

Ron



On 08/06/2015 12:44 PM, Curtis Rueden wrote:
Hi Thomas,

it's name cannot be changed because during runtime it is checked and
if changed a runtime exception is thrown
IMHO, the fact that your third party JAR does that is incredibly terrible.

Yes, we could change the code with the filename check. But I'm loath
to do it since it is a 3rd party jar file and we had to do this every
time a new version is released...
One big hammer way to work around this, and other horrible third party
behaviors, is bytecode manipulation using a library such as Javassist or
ASM. Also called runtime patching, you can make a surgical change to the
stupid exception thrown by the 3rd party library, which will be resistant
to future upgrades of that library. It does require careful use of
ClassLoaders, though. It would be much more ideal to work with the upstream
vendor/developers to fix the problem there.

Regards,
Curtis

On Mon, Jun 8, 2015 at 8:10 AM, Thomas Klöber 
<thomas.kloe...@securintegration.com> wrote:

Hi Karl Heinz,

thanks for your answer.

Yes, we could change the code with the filename check. But I'm loath to do
it since it is a 3rd party jar file and we had to do this every time a new
version is released...

I'm just surprised that there is no other way or means to tell Maven that
a different naming scheme should be used...

Deployment at customer site is no problem, the nexus and naming issue only
affects us during development.


-Ursprüngliche Nachricht-
Von: Karl Heinz Marbaise [mailto:khmarba...@gmx.de]
Gesendet: Freitag, 5. Juni 2015 14:34
An: Maven Users List
Betreff: Re: Help needed with a strange fixed filename

Hi Thomas,


The way the file is named in Nexus is the default naming schema within a
Maven repository, so there is no chance to change it.

So first question: why not change the code which checks the filename
and follow the naming convention?

What you can do is get the appropriate artifact via a plugin (like
maven-dependency-plugin) and rename it during the packaging of your
distribution archive (which I assume you have?). Or are we talking about
an EAR file?




On 6/5/15 1:58 PM, Thomas Klöber wrote:
Hi folks'es,

I am having some problems, getting an external jar-file into my Maven
project.
Here is the issue:

· the jar file has a fixed name, lets say jarfile3.jar (digit 3
is important!)
· it's name cannot be changed because during runtime it is
checked and if changed a runtime exception is thrown
·  if I create an artefact for it in my nexus, the file name is
changed to jarfile-3.x.x
· adding this to my pom.xml as a dependency everything builds
just fine
· however, if I run my application now, it falls over with the
above runtime exception
What would be the best way of incorporating an external jar into my
project without having hard-coded pathnames?
We are using Eclipse Kepler as IDE and Maven 3

Thanks

Re: Apache Maven & ANT - VPAT

2015-04-23 Thread Jim Klo
Actually there may be some relevance.

There are some services that could be used to automate 508 and other A11y 
compliance. For example, you could execute a workflow during the test phase 
that validates content against the IDI Web Accessibility Checker 
(http://achecker.ca/checker/index.php and
https://github.com/inclusive-design/AChecker). But in this case maven doesn’t
really have any direct connection to validating 508 - it’s just a tool that 
coordinates some other service as part of a larger workflow.

Aside from specific modules that generate web content (such as Javadoc) that is 
likely the extent to which 508 would apply.

This same statement would apply to really any configuration management solution 
like Maven (CMake, Gradle, Ivy, etc).

 On Apr 23, 2015, at 7:33 AM, Benson Margulies bimargul...@apache.org wrote:
 
 I cannot imagine how anyone could expect a command-line tool for software
 developers to be relevant to 508. We've never considered the question for
 Maven and I don't think that we ever will. You might look into how people
 treat, for example, 'make' as a model.
 
 
 
 On Thu, Apr 23, 2015 at 9:34 AM, amber.l.ak...@accenture.com wrote:
 
 Hi Apache Team – Apache Maven & ANT are 2 products being used by
 Accenture for a client project. We need your assistance in verifying the
 VPAT status for these 2 products.
 
 
 
 I found the below statement in the community discussion area. Can you
 confirm this is correct and VPAT does not actually apply to Apache? Below I
 have also included a brief description of the U.S Federal VPAT purpose.
 
 
 
 “Apache doesn't have anything to do with the accessibility. It just
 provides the web pages which can be accessible or not accessible at all. 508
 compliance applies to a site (or a page), not to a server.”
 
 
 
 *The purpose of the Voluntary Product Accessibility Template, or VPAT, is
 to assist Federal contracting officials and other buyers in making
 preliminary assessments regarding the availability of commercial
 “Electronic and Information Technology” products and services with features
 that support accessibility. *
 
 
 
 *The Voluntary Product Accessibility Template (VPAT) is a document which
 evaluates how accessible a particular product is according to the Section
 508 Standards. It is a self-disclosing document produced by the vendor
 which details each aspect of the Section 508 requirements and how the
 product supports each criteria.*
 
 
 
 
 
 
 
 *Amber Akins*
 
 *Client & Sales Support*
 
 *Accenture - Alliance Services*
 
 *e-mail: amber.l.ak...@accenture.com*
 
 *Phone: 678-657-2250*
 
 *Fax: 678-657-0043*
 
 
 
 --
 
 This message is for the designated recipient only and may contain
 privileged, proprietary, or otherwise confidential information. If you have
 received it in error, please notify the sender immediately and delete the
 original. Any other use of the e-mail by you is prohibited. Where allowed
 by local law, electronic communications with Accenture and its affiliates,
 including e-mail and instant messaging (including content), may be scanned
 by our systems for the purposes of information security and assessment of
 internal compliance with Accenture policy.
 
 __
 
 www.accenture.com
 




smime.p7s
Description: S/MIME cryptographic signature


Re: Adding comments to dependencies in POM

2015-04-20 Thread Jim Klo
Comments inline below..

 On Apr 20, 2015, at 10:44 AM, Ron Wheeler rwhee...@artifact-software.com 
 wrote:
 
 RDF sounds like overkill. There is no reason why a comment could not be a URI 
 but I am not sure that you want to mandate that.
 Use Case 1: link to web resource
 <dependency comment="http://blog.artifact-software.com/tech/?p=191">

Having dealt with this sort of thing on other projects, linked comments sound 
good at first but ultimately become a very bad idea IMO. This creates a 
non-idempotent and brittle link scenario where the comment URL drifts out of 
sync with the content in the POM.


 Use Case 2: lots of in-line details
 <dependency comment="added to support PDF output">
   <groupId>org.apache.xmlgraphics</groupId>
   <artifactId>fop</artifactId>
   <version comment="Can't use version 2.x see FOP-3423">1.0</version>
   <optional comment="set to true to get text in black on white">true</optional>
 </dependency>
 
 Use case 3: reference to a full explanation in the description
 There is also the description tag which could be used to hold more details
 <dependency comment="See note 2 in description tag.">

I’m not sure I’m seeing a difference between UC 2 & 3. Unless you mean 
something more like this for UC 3:
<dependency comment_ref="note2">  <!-- or some XPath expression -->
 ...
<description comment_refid="note2" comment="This version doesn’t work 
for the following reasons:…">


 IDE's could show comment attributes on tags in the POM editor or in XML 
 outline views.
 
 It seems to be a lot more flexible than adding comment tags and probably less 
 intrusive to existing plug-ins.
 

I think comment tags should still be included.  Inline is great for short 
descriptions, but nothing really beats having a tag element that doesn’t 
require a lot of XML escaping like an attribute would need.
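Purely as an illustrative sketch - nothing below exists in the actual Maven POM schema, and the <comment> element name is my own invention - a child element avoids the attribute-escaping problem for longer text:

```xml
<!-- Hypothetical comment element; NOT valid against the real POM XSD. -->
<dependency>
  <groupId>org.apache.xmlgraphics</groupId>
  <artifactId>fop</artifactId>
  <version>1.0</version>
  <comment>
    Added to support PDF output; can't use version 2.x, see FOP-3423.
    Element text can hold "quotes" freely and, via CDATA, even raw markup:
    <![CDATA[ see <a href="...">the tracker entry</a> ]]>
  </comment>
</dependency>
```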

- Jim





Re: Adding comments to dependencies in POM

2015-04-20 Thread Jim Klo

 On Apr 20, 2015, at 2:38 PM, Ron Wheeler rwhee...@artifact-software.com 
 wrote:
 
 On 20/04/2015 4:55 PM, Jim Klo wrote:
 Comments inline below..
 
 On Apr 20, 2015, at 10:44 AM, Ron Wheeler rwhee...@artifact-software.com 
 wrote:
 
 RDF sounds like overkill. There is no reason why a comment could not be a 
 URI but I am not sure that you want to mandate that.
 Use Case 1: link to web resource
 <dependency comment="http://blog.artifact-software.com/tech/?p=191">
 
 Having dealt with this sort of thing on other projects, linked comments 
 sound good at first but ultimately become a very bad idea IMO. This creates 
 a non-idempotent and brittle link scenario where the comment url is out of 
 synch with the content in the POM.
 
 
 Use Case 2: lots of in-line details
   <dependency comment="added to support PDF output">
     <groupId>org.apache.xmlgraphics</groupId>
     <artifactId>fop</artifactId>
     <version comment="Can't use version 2.x see FOP-3423">1.0</version>
     <optional comment="set to true to get text in black on white">true</optional>
   </dependency>
 
 Use case 3: reference to a full explanation in the description
 There is also the description tag which could be used to hold more details
   <dependency comment="See note 2 in description tag.">
 
 I’m not sure I’m seeing a difference between UC 2 & 3. Unless you mean 
 something more like this for UC 3:
 <dependency comment_ref="note2">  <!-- or some XPath expression -->
 ...
 <description comment_refid="note2" comment="This version doesn’t work for 
 the following reasons:…">
 
 I was thinking that the description might be <description>Note 1: Please put 
 all dependency versions in Parent. Note 2: FOP required for PDF 
 output</description>
 
 IDE's could show comment attributes on tags in the POM editor or in XML 
 outline views.
 
 It seems to be a lot more flexible than adding comment tags and probably 
 less intrusive to existing plug-ins.
 
 
 I think comment tags should still be included.  Inline is great for short 
 descriptions, but nothing really beats having a tag element that doesn’t 
 require a lot of XML escaping like an attribute would need.
 
 - Jim
 
 You are looking for a lot more machine processing that I was thinking.
 I was just considering comments as a way to tell people about the choices 
 made.
 

Actually, I wasn’t looking for that - just some clarity, as I didn’t fully 
understand the deltas within one of your use cases, so I was just posing a 
confirming solution. :-)  I actually dislike using references as I proposed - 
they are a pain to follow without tooling IMO.  I would rather opt to have 
“embedded” javadoc annotations (@see, @link, @todo, etc.) within the comment 
text for folks who choose to go that route.

One should never discount the need to machine-process these. Consider an IDE 
or document generation at the very least. IMO, anytime you have an opportunity 
to structure a vocabulary to make it machine usable, one should do so to 
support any future automation.

 Your  XML escaping note is a good point.
 The IDE will pick up invalid text so it should not be hard to avoid but for 
 people without a good IDE, they will get an error. It should only happen once 
 to each person editing a pom and POMs are not edited a lot in most projects.
 

I work with varying degrees of IDEs daily, ranging from full-blown IntelliJ 
Ultimate down to a text editor.  I tend to lean on the expectation that the 
IDE is feature-limited; about 85% of the time I’m using a web browser to 
review POMs - and I wouldn’t think I’m alone in that categorization. Less 
reliance on an IDE is important.

 
 Can you give a comment tag solution for each of the use cases.

I think what you provided is sufficient for examples, however for a more 
globally portable solution, I would seriously consider using a namespace.  From 
there you could have a more targeted solution:

<project xmlns:cmt="http://maven.apache.org/comments/1.0.0">
…

<version cmt:note="Can’t use version XZY @see(http://example.com/foo-123)">

<dependency>
<cmt:note>
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Donec et 
vestibulum odio, at porttitor nunc. Donec pulvinar erat quis accumsan interdum. 
Nunc at viverra nunc. Mauris purus ipsum, aliquam vitae nisi eget, faucibus 
auctor leo. Duis ipsum leo, viverra eleifend viverra in, tristique vitae 
tortor. Interdum et malesuada fames ac ante ipsum primis in faucibus. 
Suspendisse tempus nisl dui, at rutrum massa porta id. Suspendisse ultricies 
est vel pretium egestas. In orci ipsum, vehicula non accumsan quis, 
pellentesque ac ipsum. Maecenas maximus ornare tempus. Fusce libero ligula, 
tempor mattis pretium et, consectetur ut sapien. Aliquam rhoncus lorem erat. 
Nullam et.
</cmt:note>

Re: Can Maven be used in an nmake environment with VPATH?

2015-04-01 Thread Jim Klo
Is there a reason you cannot just use the exec plugin?  We use that to manage 
all sorts of esoteric make-like systems that have similar problems to the ones 
you list.
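For context, a minimal sketch of what that could look like - the plugin version, phase binding, and nmake arguments here are assumptions for illustration, not something taken from this thread:

```xml
<!-- Hypothetical: drive nmake from the Maven compile phase via exec-maven-plugin. -->
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <version>1.3.2</version>
  <executions>
    <execution>
      <id>run-nmake</id>
      <phase>compile</phase>
      <goals>
        <goal>exec</goal>
      </goals>
      <configuration>
        <executable>nmake</executable>
        <workingDirectory>${project.basedir}</workingDirectory>
        <arguments>
          <argument>all</argument>
        </arguments>
      </configuration>
    </execution>
  </executions>
</plugin>
```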

Jim Klo
Senior Software Engineer
SRI International
t: @nsomnac

On Mar 31, 2015, at 12:24 PM, Steve Cohen stevec...@comcast.net wrote:

As I indicated in my post, we need a gradual transition, and changing the 
automated build part of it would be the last step rather than the first step.  
We want to integrate maven into an existing nmake-based process rather than 
change the entire automated build process first.

At some point down the road the goal would be to use something like 
Hudson/Jenkins and CI, but we'll never get there if we have to redo everything 
first.

On 03/31/2015 02:12 PM, cody.a.fy...@wellsfargo.com wrote:
Have you looked into using Jenkins or Hudson to automate those builds?

You can set up custom environment variables, different jvms, etc. etc. It can 
run any number of utilities via shell scripts, Maven, Ant, Gradle, etc. etc.

That might be a more fruitful place to start.

Cody Fyler
Lending Grid Build Team
G=Lending Grid Builds
(515) – 441 - 0814

-Original Message-
From: Steve Cohen [mailto:sco...@javactivity.org]
Sent: Tuesday, March 31, 2015 1:26 PM
To: users@maven.apache.org
Subject: Can Maven be used in an nmake environment with VPATH?

I work for an organization which uses an SCM/Build process based on the
following:

SCM: an ancient legacy horror of a system

Build: Alcatel-Lucent nmake

With this system the organization maintains a large suite of applications.  The 
system is monstrously inflexible and a pain to work with.  They do manage to 
have an automated build process with it, but no continuous integration.

A large proportion of the actual code built by this system is java.
Deployment is onto various servers using versions of containers such as 
weblogic, or sometimes standalone. This requires old JVMs, a few of which are 
as old as JDK 1.3, and none use a version of Java that is still supported by 
Oracle (>= 1.7).  Deployment is done through RPMs and in some cases Solaris 
packaging.

As you might imagine, change, in such an organization is difficult.  The main 
impediment to change is the accreted base of thousands of makefiles that have 
been created over the years.

But a few intrepid (or maybe foolhardy) souls are thinking of trying.
We'd like to use maven to handle the java portion of this process.  Its 
dependency management features would be worth the effort if we could get them.  
Since replacing the whole system is not in scope, the idea is to use maven to 
handle the java compilation, archiving into jars, wars, ears, etc., while 
leaving the packaging, deployment, source control systems as they are.  
Alcatel-Lucent nmake would invoke maven as it now invokes javac, jar, etc.  If 
we can get this far, future upgrading of other portions of the system may come 
into play, but not in step 1.
Such a transition will happen gradually or not at all.

The problem is this.  Alcatel-Lucent nmake (and other versions of make such as 
GNU make) includes the concept of the VPATH, an environment variable containing 
a path (similar to PATH, etc.) along which to search for dependent source.  If 
a necessary file is not found along the first node of the path, the second is 
searched for it, then the third, etc.
Only if the full VPATH is exhausted is the dependency not satisfied and the 
build fails.  Importantly, if the dependency IS satisfied, then nodes further 
down the path are not looked at for that dependency.

There is a little tutorial here, explaining how this works:
http://nmake.alcatel-lucent.com/tutorial/s10.html

Needless to say, this is not the way Maven works, especially the compiler 
plugin, certainly not under default settings.  There is the sourcepath setting 
which invokes the -sourcepath switch on javac, which might be part of a 
solution.  There would then be a need for something that could translate the 
$VPATH envvar to a sourcepath which would need to dig down through several 
layers of a directory tree (at least they would be identical in each node -e.g. 
$NODE/$PROJECT/src/main/java) to produce a sourcepath.
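As a rough sketch of that translation step - assuming colon-separated VPATH nodes and the $NODE/$PROJECT/src/main/java layout described above; the variable names and paths are hypothetical:

```shell
# Hypothetical: expand each VPATH node into its conventional Maven source
# directory, then re-join with ':' to form a value for javac -sourcepath.
VPATH="/build/local:/build/official"
PROJECT="myapp"
SOURCEPATH=$(echo "$VPATH" | tr ':' '\n' \
  | sed "s|\$|/$PROJECT/src/main/java|" \
  | paste -sd ':' -)
echo "$SOURCEPATH"
# /build/local/myapp/src/main/java:/build/official/myapp/src/main/java
```

This only builds the path string; it does not replicate VPATH's first-match-wins semantics, since javac's -sourcepath searches every entry.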

I don't think this will work because if I turn on the verbose debug output, I 
see that maven is putting a path to each source file on the javac command line, 
and am guessing maven is not going to do this looking over each node of the 
VPATH.

Another option would be to pull the source from the various vpath nodes in 
reverse order and then use maven in a more normal way.  But I imagine that this 
would have negative performance consequences.

Has anyone on this list ever tried anything like this?  Or is this too big a 
hill to even contemplate climbing?

-
To unsubscribe, e-mail: 
users-unsubscr

Re: Can Maven be used in an nmake environment with VPATH?

2015-04-01 Thread Jim Klo
Don’t let me discourage you on that choice if you choose it - however it sounds 
like this might be a stopgap in a transition to a more modern solution?

From my POV, which is a defensive approach towards configuration management - 
if this is just a transitionary step to removing nmake, I would not bother 
investing that time, unless:
1) you are going to make a conscious decision on embracing and maintaining your 
modified fork for a long time because this is going to be a critical part of 
your solution.
or 
2) you were going to contribute that enhancement back to the plugin projects 
and get it adopted - and I don’t know what the process is for that with the 
Maven project (I’ve done it with other Apache projects - and it’s not exactly 
an easy process).

Otherwise it would just add one more piece of forked code to your list of 
things to maintain.  We all know none of us has the time, or usually the 
desire, to manage yet another forked project. :)

I’m sure others have their own opinion on this, but as a “maven user” and not 
a “maven developer” - I personally don’t like enhancing other tools unless 
there’s an easy way for me to contribute that fix back and get it into the 
main build.

Just my 2 cents advice.

- JK


 On Apr 1, 2015, at 9:48 AM, Steve Cohen stevec...@comcast.net wrote:
 
 That sounds like it might be a possibility.  But after looking at it, my 
 initial take (which is quite possibly wrong!!) is that it might be easier to 
 extend the compiler plugin, and the jar plugin, to use the vpath.  Basically, 
 you're just writing a new SourceInclusionScanner. This seems like it would 
 have the benefit of staying within known Maven channels.
 
 On 04/01/2015 10:23 AM, Jim Klo wrote:
 Is there a reason you cannot just use the exec plugin?  We use that to 
 manage all sorts of esoteric make-like systems that have similar problems to 
 the ones you list.
 
 Jim Klo
 Senior Software Engineer
 SRI International
 t: @nsomnac




