You can do it either way.  I think it depends on what works best for
your workflow and how you want to do upgrades.  In our case, we like
to upgrade hbase without having users notice.

On Tue, Mar 23, 2010 at 4:26 PM, Buttler, David <buttl...@llnl.gov> wrote:
> Hi Ryan,
> The web page you pointed at describes how to add jars to the map / reduce 
> classpath.  The primary use case described there is to add the hbase 
> configuration directory and hbase jars.  I totally agree that this is 
> critical, and I just assumed that everyone (who could) would do that.  You 
> could also add a lot of other jars if you felt that all of your map/reduce 
> jobs would need those exact dependencies, but I think the option you describe 
> here of putting them in the lib directory of your job jar is much more 
> appropriate.  There is a little bit more overhead on each job to copy a 
> potentially very fat jar to each of the nodes, but you save having to worry 
> about incompatible versions of libraries.  And, it is extremely simple to 
> create such a jar using ant. I use the following ant target with no issues 
> (of course all of the interesting locations are defined earlier in 
> properties):
>
>        <target name="jars" depends="clean, compile">
>                <mkdir dir="${jarVersions}"/>
>                <copy todir="${buildDir}">
>                        <fileset dir="${srcDir}" />
>                        <fileset dir="${resourceDir}" />
>                </copy>
>                <jar destfile="${jarname}" basedir="${buildDir}" />
>                <copy todir="${buildDir}/lib">
>                        <fileset dir="lib" />
>                </copy>
>                <copy todir="${buildDir}/conf">
>                        <fileset dir="${resourceDir}" />
>                </copy>
>                <jar destfile="${depjar}" basedir="${buildDir}" />
>        </target>
>
> Dave
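
A quick note for anyone adapting the ant target above: ${depjar} (the second jar, with lib/ and conf/ folded in) is the one you would hand to hadoop jar when submitting the job. If you also wanted the job jar to carry the HBase and ZooKeeper jars themselves, rather than relying on them already being on the cluster classpath as Ryan describes further down, something along these lines should work. This is only a sketch; ${hbase.home}, the target name, and the include patterns are my assumptions, not part of Dave's build:

        <target name="bundle-hbase" depends="jars">
                <!-- Pull the HBase and ZooKeeper jars into lib/ so the job jar
                     is self-contained; adjust the patterns to your versions. -->
                <copy todir="${buildDir}/lib" flatten="true">
                        <fileset dir="${hbase.home}">
                                <include name="hbase-*.jar"/>
                                <include name="lib/zookeeper-*.jar"/>
                        </fileset>
                </copy>
                <jar destfile="${depjar}" basedir="${buildDir}" update="true"/>
        </target>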
>
> -----Original Message-----
> From: Ryan Rawson [mailto:ryano...@gmail.com]
> Sent: Tuesday, March 23, 2010 2:57 PM
> To: hbase-user@hadoop.apache.org
> Subject: Re: Deployment question
>
> If you use the deployment mechanism I outlined, you won't need to put
> any deps into your job jar.  These will be automatically put on your
> classpath by the hadoop map reduce framework.  If you use additional
> dependencies (e.g. thrift, protobuf, etc.) you will have to include
> those deps in the job jar.  Put them in lib/ in your jar and you
> will get them on your classpath at run time.
>
> good luck!
> -ryan
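
To make the lib/ mechanism concrete: when the job runs, Hadoop unpacks the submitted job jar on each task node and adds any jars found under lib/ (along with your own classes) to the task's classpath. A job jar built that way looks roughly like the layout below; the class and jar names are only for illustration:

        myjob.jar
            com/example/MyDriver.class
            com/example/MyMapper.class
            lib/libthrift.jar
            lib/protobuf-java.jar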
>
> On Tue, Mar 23, 2010 at 2:54 PM, Edward Capriolo <edlinuxg...@gmail.com> 
> wrote:
>> On Tue, Mar 23, 2010 at 5:15 PM, Ryan Rawson <ryano...@gmail.com> wrote:
>>
>>> The instructions for setting up HBase to work with Mapreduce are here:
>>>
>>>
>>> http://hadoop.apache.org/hbase/docs/current/api/org/apache/hadoop/hbase/mapreduce/package-summary.html
>>>
>>> -ryan
>>>
>>> On Tue, Mar 23, 2010 at 2:12 PM, Buttler, David <buttl...@llnl.gov> wrote:
>>> > I am going to assume that you mean that you want to run an M/R job on
>>> > HBase, or just use a client connection.
>>> > I use ant for standard client programs -- it does all of the classpath
>>> magic for me.
>>> > For M/R programs, I package up my jar (using ant) with all of my jars in
>>> a subdirectory (called lib).  This seems to work just fine.
>>> >
>>> > Dave
>>> >
>>> > -----Original Message-----
>>> > From: William Kang [mailto:weliam.cl...@gmail.com]
>>> > Sent: Tuesday, March 23, 2010 2:00 PM
>>> > To: hbase-user@hadoop.apache.org
>>> > Subject: Re: Deployment question
>>> >
>>> > Hi David,
>>> > Thanks for your reply. But what do you use to configure the classpath?
>>> > It is quite tedious to configure all the libraries needed to run the jar
>>> > on the command line. Thanks.
>>> >
>>> >
>>> > William
>>> >
>>> > On Tue, Mar 23, 2010 at 8:10 AM, Buttler, David <buttl...@llnl.gov>
>>> wrote:
>>> >
>>> >> I use rsync
>>> >>
>>> >> -----Original Message-----
>>> >> From: William Kang [mailto:weliam.cl...@gmail.com]
>>> >> Sent: Monday, March 22, 2010 10:42 PM
>>> >> To: hbase-user
>>> >> Subject: Deployment question
>>> >>
>>> >> Hi,
>>> >> How do you guys deploy your Java program to the production cluster?
>>> >> I have been working in Eclipse on a local Linux machine in
>>> >> pseudo-distributed mode. What would be the best way to deploy my
>>> >> program to the production cluster? The production cluster is not in
>>> >> the same subnet as the development machine.
>>> >> Thanks a lot.
>>> >>
>>> >>
>>> >> William
>>> >>
>>> >
>>>
>>
>> Pick your poison: long classpaths or FatJars.
>> http://fjep.sourceforge.net/
>>
>
>
