Re: How to package multiple jars for a Hadoop job

2011-02-20 Thread Mark Kerzner
Thanks! I am using simple NetBeans scripts which I am augmenting a little, but it seems I need to use Maven anyway.

Mark

On Sun, Feb 20, 2011 at 8:22 PM, Jun Young Kim wrote:
> hi,
>
> There is a maven plugin to package for a hadoop.
> I think this is quite convenient tool to package for a had

Re: How to package multiple jars for a Hadoop job

2011-02-20 Thread Jun Young Kim
hi,

There is a Maven plugin to package for Hadoop. I think this is quite a convenient tool. If you are using it, add this one to your pom.xml (the flattened snippet in the archive appears to be this plugin block; the exact configuration element name is reconstructed):

<plugin>
  <groupId>com.github.maven-hadoop.plugin</groupId>
  <artifactId>maven-hadoop-plugin</artifactId>
  <version>0.20.1</version>
  <configuration>
    <hadoopHome>your_hadoop_home_dir</hadoopHome>
  </configuration>
</plugin>

Junyoung Kim (juneng...@gmail.com)

On 02/19

Re: How to package multiple jars for a Hadoop job

2011-02-18 Thread Mark Kerzner
Thank you,
Mark

On Fri, Feb 18, 2011 at 4:23 PM, Eric Sammer wrote:
> Mark:
>
> You have a few options. You can:
>
> 1. Package dependent jars in a lib/ directory of the jar file.
> 2. Use something like Maven's assembly plugin to build a self contained
>    jar.
>
> Either way, I'd strongly recomm

Re: How to package multiple jars for a Hadoop job

2011-02-18 Thread Eric Sammer
Mark:

You have a few options. You can:

1. Package dependent jars in a lib/ directory of the jar file.
2. Use something like Maven's assembly plugin to build a self-contained jar.

Either way, I'd strongly recommend using something like Maven to build your artifacts so they're reproducible and in
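[Editor's note: Eric's option 2 can be sketched as a pom.xml fragment using the assembly plugin's standard jar-with-dependencies descriptor. This is a minimal example, not from the thread; the main class name is a placeholder for your own job driver.]

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <!-- Standard descriptor that bundles all dependencies into one jar -->
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
    <archive>
      <manifest>
        <!-- Placeholder; substitute your job's driver class -->
        <mainClass>com.example.MyJobDriver</mainClass>
      </manifest>
    </archive>
  </configuration>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

With this bound to the package phase, `mvn package` produces an additional `*-jar-with-dependencies.jar` alongside the normal artifact, which can be passed directly to `hadoop jar`.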

How to package multiple jars for a Hadoop job

2011-02-18 Thread Mark Kerzner
Hi,

I have a script that I use to re-package all the jars (which are output in a dist directory by NetBeans), and it structures everything correctly into a single jar for running a MapReduce job. Here it is below, but I am not sure if it is the best practice. Besides, it hard-codes my paths. I am
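[Editor's note: Mark's script itself is cut off in the archive. Since jar files are plain zip archives, the repackaging he describes (the job's classes plus dependency jars under lib/, a layout Hadoop's job runner adds to the task classpath) can be sketched with Python's zipfile module. Every path and class name below is illustrative, not taken from his script.]

```python
# Sketch of assembling a Hadoop job jar with dependencies under lib/.
# All names here are made up for illustration; jar files are zip
# archives, so zipfile is enough to show the layout.
import zipfile

def build_job_jar(out_path, class_files, dep_jars):
    """Write a job jar containing class files plus dependency jars in lib/."""
    with zipfile.ZipFile(out_path, "w") as jar:
        for name, data in class_files.items():
            # e.g. "com/example/MyMapper.class" -> compiled bytes
            jar.writestr(name, data)
        for dep in dep_jars:
            # Hadoop adds jars found under lib/ inside the job jar
            # to the task classpath at runtime.
            jar.write(dep, arcname="lib/" + dep.rsplit("/", 1)[-1])

# Fake a dependency jar, then assemble the job jar around it.
with zipfile.ZipFile("dep.jar", "w") as dep:
    dep.writestr("org/example/Util.class", b"")
build_job_jar("myjob.jar", {"com/example/MyDriver.class": b""}, ["dep.jar"])

with zipfile.ZipFile("myjob.jar") as jar:
    print(sorted(jar.namelist()))
```

The same layout is what Eric's option 1 describes; a real build would copy the NetBeans dist/lib jars into lib/ instead of fabricating one.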