Building Spark with native library support

2014-03-06 Thread Alan Burlison
Hi, I've successfully built Spark 0.9.0-incubating on Solaris using sbt, following the instructions at http://spark.incubator.apache.org/docs/latest/, and it seems to work OK. However, when I start it up I get an error about missing Hadoop native libraries. I can't find any mention of how to build ...
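For context: on platforms without prebuilt Hadoop native libraries, this is typically Hadoop's NativeCodeLoader warning rather than a fatal error, along these lines (exact wording varies by Hadoop version):

    WARN util.NativeCodeLoader: Unable to load native-hadoop library
    for your platform... using builtin-java classes where applicable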

Re: Building Spark with native library support

2014-03-06 Thread Matei Zaharia
Is it an error, or just a warning? In any case, you need to get those libraries from a build of Hadoop for your platform. Then add them to the SPARK_LIBRARY_PATH environment variable in conf/spark-env.sh, or to your -Djava.library.path if launching an application separately. These libraries ...
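A minimal sketch of the two options Matei describes, assuming the native libraries were built into /opt/hadoop/lib/native (that path, the jar name, and the main class below are placeholders):

    # conf/spark-env.sh: point Spark at the platform-specific native libraries
    export SPARK_LIBRARY_PATH=/opt/hadoop/lib/native

    # Or, when launching an application separately, pass the path to the JVM:
    java -Djava.library.path=/opt/hadoop/lib/native -cp my-app.jar com.example.MyApp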

Re: Building Spark with native library support

2014-03-06 Thread Alan Burlison
On 06/03/2014 18:55, Matei Zaharia wrote:
> For the native libraries, you can use an existing Hadoop build and just put
> them on the path. For linking to Hadoop, Spark grabs it through Maven, but
> you can do mvn install locally on your version of Hadoop to install it to
> your local Maven cache, and ...
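A sketch of that mvn install workflow against the Spark 0.9.x sbt build (the source path and Hadoop version below are illustrative):

    # 1. Install the locally built Hadoop into the local Maven cache
    cd /path/to/hadoop-source    # placeholder path
    mvn install -DskipTests

    # 2. Build Spark against that Hadoop version so it links to your build
    SPARK_HADOOP_VERSION=2.2.0 sbt/sbt assembly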