samza environment variable on containers

2015-07-30 Thread Chen Song
Maybe a dumb question.

Is there a way to set an environment variable for Samza containers?

We want to set LD_LIBRARY_PATH to include the Hadoop native libs.
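For reference, this is roughly what we want each container host to end up with (the Hadoop install path is an assumption; adjust to your cluster layout):

```shell
# Sketch: prepend Hadoop's native-library directory to LD_LIBRARY_PATH
# before the container JVM starts. HADOOP_HOME default is an assumption.
export LD_LIBRARY_PATH="${HADOOP_HOME:-/usr/lib/hadoop}/lib/native${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
echo "$LD_LIBRARY_PATH"
```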

-- 
Chen Song


Re: samza environment variable on containers

2015-07-30 Thread Yan Fang
Hi Chen Song,

I do not think there is a way in Samza to specify environment variables for the Samza containers.

Samza does not currently read LD_LIBRARY_PATH either.

Samza only puts the files in lib/*.[jw]ar on the CLASSPATH.

-Djava.library.path might work, though it can cause Hadoop errors. :(
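If you want to try the -Djava.library.path route, Samza's `task.opts` config passes JVM flags through to the container JVM. A sketch (untested on a kerberized cluster, and the native-lib path below is an assumption about your install):

```properties
# Sketch: pass the native-library location to the container JVM via task.opts.
# /usr/lib/hadoop/lib/native is an assumed path; adjust to your cluster.
task.opts=-Djava.library.path=/usr/lib/hadoop/lib/native
```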

Thanks,

Fang, Yan
yanfang...@gmail.com



Re: samza environment variable on containers

2015-08-02 Thread Chen Song
Thanks Yan.

I ran into issues when testing jobs on a kerberized cluster. The job reads
from HDFS and worked well before. On the kerberized cluster, the Samza
container threw the exception below. I am not sure what Kerberos has to do
with this.

java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
    at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
    at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
    at org.apache.hadoop.io.compress.SnappyCodec.getDecompressorType(SnappyCodec.java:192)
    at org.apache.hadoop.io.compress.CodecPool.getDecompressor(CodecPool.java:176)
    at org.apache.hadoop.io.SequenceFile$Reader.init(SequenceFile.java:1916)
    at org.apache.hadoop.io.SequenceFile$Reader.initialize(SequenceFile.java:1811)
    at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1760)
    at com.appnexus.data.samza.hdfs.HdfsProtobufSequenceFileReaderWriter.messages(HdfsProtobufSequenceFileReaderWriter.scala:20)
    at com.appnexus.data.samza.systems.HdfsSystemConsumer$$anonfun$poll$1$$anonfun$1.apply(HdfsSystemConsumer.scala:84)
    at com.appnexus.data.samza.systems.HdfsSystemConsumer$$anonfun$poll$1$$anonfun$1.apply(HdfsSystemConsumer.scala:76)

After a lot of googling, I stumbled upon this thread:
http://stackoverflow.com/questions/22150417/hadoop-mapreduce-java-lang-unsatisfiedlinkerror-org-apache-hadoop-util-nativec

If anyone has any thoughts on this error, please advise.
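In case it helps anyone reproduce this: the error means the container JVM could not find the Snappy native library on its java.library.path. A quick sanity check on a container host (NATIVE_DIR is an assumption; point it at your Hadoop install):

```shell
# Sketch: check whether the Snappy native library is visible on this host.
# The default HADOOP_HOME below is an assumption; adjust to your cluster.
NATIVE_DIR="${HADOOP_HOME:-/usr/lib/hadoop}/lib/native"
if ls "$NATIVE_DIR"/libsnappy.so* >/dev/null 2>&1; then
  echo "snappy native lib present in $NATIVE_DIR"
else
  echo "snappy native lib missing from $NATIVE_DIR (UnsatisfiedLinkError expected)" >&2
fi
```

`hadoop checknative -a` also reports which native codecs (including snappy) the local Hadoop install can load.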

Chen




-- 
Chen Song


Re: samza environment variable on containers

2015-08-03 Thread Yan Fang
Maybe @Eli Reisman can give you some insight, since he is writing the HDFS
producer and the exception looks related to the HDFS consumer.

Thanks,

Fang, Yan
yanfang...@gmail.com
