Hi all,
Can anyone tell me how to set the native library path in Spark?
Right now I am setting it using the "SPARK_LIBRARY_PATH" environment variable
in spark-env.sh, but still no success.
I am still seeing this in spark-shell:
NativeCodeLoader: Unable to load native-hadoop library for your platform
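For what it's worth, one common way to make the native-hadoop libraries visible is to point both SPARK_LIBRARY_PATH and LD_LIBRARY_PATH at Hadoop's native directory in spark-env.sh. A minimal sketch, assuming a hypothetical install under /opt/hadoop (adjust the path for your cluster):

```shell
# spark-env.sh — sketch; /opt/hadoop is a hypothetical install path
export HADOOP_HOME=/opt/hadoop

# Directory that holds libhadoop.so on a typical Hadoop layout
export SPARK_LIBRARY_PATH="$HADOOP_HOME/lib/native"

# Exporting LD_LIBRARY_PATH as well covers cases where the JVM is started
# without SPARK_LIBRARY_PATH being propagated to java.library.path
export LD_LIBRARY_PATH="$HADOOP_HOME/lib/native:$LD_LIBRARY_PATH"
```

After editing spark-env.sh you need to restart the daemons (and relaunch spark-shell) for the change to take effect.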
> …, only master slaves are being spun up by Mesos slaves directly.
>
> On Wed, Apr 9, 2014 at 3:08 PM, Pradeep Ch wrote:
>
>> Hi,
>>
>> I want to enable Spark Master HA in Spark. The documentation specifies
>> that we can do this with the help of ZooKeeper.
Hi,
I want to enable Spark Master HA in Spark. The documentation specifies that
we can do this with the help of ZooKeeper. But what I am worried about is how
to configure one master with the other, and similarly, how do workers know
that they have two masters? Where do you specify the multi-master information?
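In case it helps frame the question: in standalone mode the masters don't need to be configured with each other directly — they coordinate through ZooKeeper, and workers/applications are given the full master list. A sketch of the usual setup, with placeholder hostnames and ports (zk1..zk3, master1, master2 are assumptions, not from this thread):

```shell
# spark-env.sh on EACH master — sketch; hostnames/ports are placeholders
# Both masters point at the same ZooKeeper ensemble; ZooKeeper elects the
# leader and stores recovery state under the given dir.
export SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER \
  -Dspark.deploy.zookeeper.url=zk1:2181,zk2:2181,zk3:2181 \
  -Dspark.deploy.zookeeper.dir=/spark"

# Workers and applications (e.g. spark-shell) are started with a
# comma-separated list of ALL masters; they register with whichever one
# is currently the leader and fail over to the standby automatically:
#   spark://master1:7077,master2:7077
```

So the "multi-master information" lives in two places: the shared ZooKeeper URL on the masters, and the multi-host spark:// URL handed to workers and clients.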