Re: Does Spark Driver work with HDFS in HA mode

2014-09-29 Thread Petr Novak
Thank you. HADOOP_CONF_DIR has been missing.

On Wed, Sep 24, 2014 at 4:48 PM, Matt Narrell matt.narr...@gmail.com
wrote:

 Yes, this works.  Make sure you have HADOOP_CONF_DIR set on your Spark
 machines

 mn

 On Sep 24, 2014, at 5:35 AM, Petr Novak oss.mli...@gmail.com wrote:

 Hello,
 if our Hadoop cluster is configured with HA and fs.defaultFS points to a
 namespace instead of a namenode hostname - hdfs://namespace_name/ - then
 our Spark job fails with an exception. Is there anything to configure, or is
 it not implemented?


 Exception in thread "main" org.apache.spark.SparkException: Job aborted
 due to stage failure: Task 0 in stage 1.0 failed 4 times, most recent
 failure: Lost task 0.3 in stage 1.0 (TID 4, hostname):


 java.lang.IllegalArgumentException: java.net.UnknownHostException:
 *namespace_name*


 Many thanks,
 P.





Does Spark Driver work with HDFS in HA mode

2014-09-24 Thread Petr Novak
Hello,
if our Hadoop cluster is configured with HA and fs.defaultFS points to a
namespace instead of a namenode hostname - hdfs://namespace_name/ - then
our Spark job fails with an exception. Is there anything to configure, or is
it not implemented?


Exception in thread "main" org.apache.spark.SparkException: Job aborted due
to stage failure: Task 0 in stage 1.0 failed 4 times, most recent failure:
Lost task 0.3 in stage 1.0 (TID 4, hostname):


java.lang.IllegalArgumentException: java.net.UnknownHostException:
*namespace_name*


Many thanks,
P.
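
For context on why this fails: an HA nameservice URI like hdfs://namespace_name/ is not resolved through DNS at all. The HDFS client maps the logical nameservice to real namenode addresses using entries in hdfs-site.xml, so if Spark never loads that file, the nameservice falls through to a DNS lookup and surfaces as UnknownHostException. A minimal sketch of the client-side entries involved (the namenode ids and hostnames below are illustrative, not from this thread):

```xml
<!-- hdfs-site.xml on the Spark machines: illustrative HA client config -->
<property>
  <name>dfs.nameservices</name>
  <value>namespace_name</value>
</property>
<property>
  <name>dfs.ha.namenodes.namespace_name</name>
  <value>nn1,nn2</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.namespace_name.nn1</name>
  <value>namenode1.example.com:8020</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.namespace_name.nn2</name>
  <value>namenode2.example.com:8020</value>
</property>
<!-- tells the client how to pick the active namenode -->
<property>
  <name>dfs.client.failover.proxy.provider.namespace_name</name>
  <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
</property>
```

Spark picks these up only when the directory containing hdfs-site.xml is on its Hadoop configuration path, which is what HADOOP_CONF_DIR provides.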


Re: Does Spark Driver work with HDFS in HA mode

2014-09-24 Thread Matt Narrell
Yes, this works. Make sure you have HADOOP_CONF_DIR set on your Spark machines.

mn
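
Concretely, that usually means exporting the variable in the shell (or in conf/spark-env.sh) before submitting. A sketch, assuming the common /etc/hadoop/conf layout and a hypothetical job jar:

```shell
# Point Spark's Hadoop client at the directory holding core-site.xml
# and hdfs-site.xml with the HA nameservice definitions.
# /etc/hadoop/conf is a common location but an assumption here.
export HADOOP_CONF_DIR=/etc/hadoop/conf

# Sanity check: without hdfs-site.xml here, the nameservice will not
# resolve and the UnknownHostException above follows.
ls "$HADOOP_CONF_DIR"/hdfs-site.xml

# Hypothetical submission; any path under hdfs://namespace_name/
# should now resolve through the HA failover proxy.
spark-submit --master spark://master:7077 our-job.jar hdfs://namespace_name/input
```

Setting it in spark-env.sh has the advantage that every driver and executor launched on that machine inherits it, rather than relying on each user's shell environment.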

On Sep 24, 2014, at 5:35 AM, Petr Novak oss.mli...@gmail.com wrote:

 Hello,
 if our Hadoop cluster is configured with HA and fs.defaultFS points to a
 namespace instead of a namenode hostname - hdfs://namespace_name/ - then
 our Spark job fails with an exception. Is there anything to configure, or is
 it not implemented?
 
 
 Exception in thread "main" org.apache.spark.SparkException: Job aborted due
 to stage failure: Task 0 in stage 1.0 failed 4 times, most recent failure:
 Lost task 0.3 in stage 1.0 (TID 4, hostname):
 
 
 
 java.lang.IllegalArgumentException: java.net.UnknownHostException: 
 namespace_name
 
 
 
 Many thanks,
 P.