Hey Paul -

I'm not familiar with this parameter. Which config file does it go in?

Instead, we tried setting the "java.library.path" parameter in "storm.yaml" 
to point to "/etc/hadoop/conf" and "/etc/hbase/conf" (the location of the 
local Hadoop/HBase config files), but it didn't work...

Noam

On 20/10/14 17:26, Paul Poulosky wrote:
Couldn’t you set topology.classpath to include the location of the files? They 
would have to be installed in the same location on every worker node.
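
Something like this at submit time, perhaps (just a rough sketch; the paths 
are examples only and would have to exist at the same place on every worker, 
and this assumes a 0.9.x-era API):

    // Sketch only: attach the local Hadoop/HBase config directories to
    // the topology's classpath via the "topology.classpath" setting.
    import backtype.storm.Config;
    import backtype.storm.StormSubmitter;
    import backtype.storm.topology.TopologyBuilder;

    public class SubmitWithLocalConfs {
        public static void main(String[] args) throws Exception {
            TopologyBuilder builder = new TopologyBuilder();
            // ... declare spouts and bolts here ...

            Config conf = new Config();
            // Example paths; every worker node needs the same layout.
            conf.put("topology.classpath", "/etc/hadoop/conf:/etc/hbase/conf");

            StormSubmitter.submitTopology("hbase-hdfs-topology", conf,
                    builder.createTopology());
        }
    }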

From: Noam Cohen <noam.co...@acuityads.com>
Reply-To: "user@storm.apache.org" <user@storm.apache.org>
Date: Monday, October 20, 2014 at 9:07 AM
To: "user@storm.apache.org" <user@storm.apache.org>
Subject: Hadoop and HBase configuration files location (inside the topology 
jar?)

Hey guys -

Our topology writes data into both HBase and HDFS. Because of that, it has to 
find the location of the Hadoop/HBase masters in the "hadoop-site.xml" and 
"hbase-site.xml" configuration files.

For the non-Storm applications that we run, we simply make sure those files 
are on the application's classpath. That way we can run the same program both 
in our lab and in our production environment, and have it locate the right 
master servers based on the configuration files.
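
Roughly, that loading looks like this (a minimal sketch; both clients pick up 
their *-site.xml files from the classpath on their own):

    // Minimal sketch of classpath-driven configuration loading.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.hbase.HBaseConfiguration;

    public class ClasspathConfigDemo {
        public static void main(String[] args) throws Exception {
            // Reads the Hadoop *-site.xml files from the classpath, so
            // the default filesystem is whichever cluster they name.
            Configuration hadoopConf = new Configuration();
            FileSystem fs = FileSystem.get(hadoopConf);

            // Reads hbase-site.xml from the classpath, including the
            // ZooKeeper quorum used to locate the HBase master.
            Configuration hbaseConf = HBaseConfiguration.create();

            System.out.println(fs.getUri() + " / "
                    + hbaseConf.get("hbase.zookeeper.quorum"));
        }
    }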

However, we couldn't find a way to do the same for Storm. We are forced to 
place the files inside the topology's JAR file and hence build two different 
JARs - one for production and one for the lab.

Is there a way to make Storm load those files from the worker servers' local 
disks instead of packing them into the topology JAR?

Thanks!
Noam
