Thanks.

I've set SPARK_HOME and SPARK_CONF_DIR appropriately in .bash_profile
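
i.e. something along these lines (the install path below is only an example,
adjust to wherever the Spark distribution actually lives):

# in ~/.bash_profile
export SPARK_HOME=$HOME/spark-1.3.0-bin-hadoop2.4   # example location
export SPARK_CONF_DIR=$SPARK_HOME/conf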

But when I start the worker like this:

spark-1.3.0-bin-hadoop2.4/sbin/start-slave.sh

I still get:

failed to launch org.apache.spark.deploy.worker.Worker:
                             Default is conf/spark-defaults.conf.
  15/04/27 11:51:33 DEBUG Utils: Shutdown hook called
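
From the usage output it looks like the worker wants more arguments. Should I be
passing the worker instance number and the master URL explicitly? e.g. something
like this (master host and port are placeholders):

spark-1.3.0-bin-hadoop2.4/sbin/start-slave.sh 1 spark://master-host:7077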

On Mon, Apr 27, 2015 at 1:15 PM, Zoltán Zvara <zoltan.zv...@gmail.com>
wrote:

> You should distribute your configuration file to workers and set the
> appropriate environment variables, like HADOOP_HOME, SPARK_HOME,
> HADOOP_CONF_DIR, SPARK_CONF_DIR.
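>
> For example, something along these lines for each worker host (the hostname
> and paths here are only placeholders):
>
> scp -r $SPARK_HOME/conf worker-host:$SPARK_HOME/
>
> and then the same SPARK_HOME/SPARK_CONF_DIR exports in each worker's
> .bash_profile.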
>
> On Mon, Apr 27, 2015 at 12:56 PM James King <jakwebin...@gmail.com> wrote:
>
>> I renamed spark-defaults.conf.template to spark-defaults.conf
>> and invoked
>>
>> spark-1.3.0-bin-hadoop2.4/sbin/start-slave.sh
>>
>> But I still get
>>
>> failed to launch org.apache.spark.deploy.worker.Worker:
>>     --properties-file FILE   Path to a custom Spark properties file.
>>                              Default is conf/spark-defaults.conf.
>>
>> But I'm thinking it should pick up the default spark-defaults.conf from
>> the conf dir.
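>>
>> Or would I need to point the worker at it explicitly with the
>> --properties-file flag shown in the usage text? e.g. (all arguments here
>> are placeholders):
>>
>> spark-1.3.0-bin-hadoop2.4/sbin/start-slave.sh 1 spark://master-host:7077 \
>>   --properties-file spark-1.3.0-bin-hadoop2.4/conf/spark-defaults.conf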
>>
>> Am I expecting or doing something wrong?
>>
>> Regards
>> jk