The version I'm using was already pre-built for Hadoop 2.3.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-on-Yarn-java-lang-IllegalArgumentException-Invalid-rule-tp21382p21485.html
Sent from the Apache Spark User List mailing list archive at
Then your Spark is not built for YARN. Try building with:
sbt/sbt -Dhadoop.version=2.3.0 -Pyarn assembly
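One quick sanity check (the jar path below is an assumption based on the default Spark 1.2 build layout, so adjust it to your output) is to confirm the resulting assembly actually contains the YARN backend classes:

```shell
# Assumed path; adjust to your build output directory and Scala version.
jar tf assembly/target/scala-2.10/spark-assembly-1.2.0-hadoop2.3.0.jar \
  | grep 'org/apache/spark/deploy/yarn/Client'
```

If the grep finds nothing, the assembly was built without the -Pyarn profile.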
Thanks, Ted. Kerberos is enabled on the cluster.
I'm new to the world of Kerberos, so please excuse my ignorance here. Do you
know if there are any additional steps I need to take in addition to
setting HADOOP_CONF_DIR? For instance, does hadoop.security.auth_to_local
require any specific setting?
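For reference, a minimal sketch of how that property is typically set in core-site.xml (the realm XXXCOMPANY.COM and the rule itself are the placeholders from the error in this thread, not a verified configuration), with each rule kept on its own single line:

```xml
<property>
  <name>hadoop.security.auth_to_local</name>
  <value>
RULE:[2:$1@$0](.*@XXXCOMPANY.COM)s/(.*)@XXXCOMPANY.COM/$1/L
DEFAULT
  </value>
</property>
```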
Thanks, Siddardha. I did, but got the same error. Kerberos is enabled on my
cluster, and I may be missing a configuration step somewhere.
Caused by: java.lang.IllegalArgumentException: Invalid rule: L
RULE:[2:$1@$0](.*@XXXCOMPANY.COM)s/(.*)@XXXCOMPANY.COM/$1/L
DEFAULT
Can you put the rule on a single line? (I'm not sure whether there is a
newline or a space between L and DEFAULT.)
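To make it concrete, here is a rough Python sketch (an illustration only, not Hadoop's actual rule parser; the user and realm names are placeholders) of the mapping the rule is trying to express once it is on one line:

```python
import re

# Rough illustration of what the auth_to_local rule
#   RULE:[2:$1@$0](.*@XXXCOMPANY.COM)s/(.*)@XXXCOMPANY.COM/$1/L
# is meant to do. "[2:$1@$0]" formats a two-component principal
# (user/host@REALM) as "user@REALM"; "(...)" must match that string;
# "s/../../" rewrites it; the trailing "L" lowercases the result.

def apply_rule(user, realm):
    candidate = "%s@%s" % (user, realm)                  # [2:$1@$0]
    if not re.fullmatch(r".*@XXXCOMPANY\.COM", candidate):
        return None                                      # rule does not apply
    short = re.sub(r"(.*)@XXXCOMPANY\.COM", r"\1", candidate)
    return short.lower()                                 # the "L" flag

print(apply_rule("JDoe", "XXXCOMPANY.COM"))   # -> jdoe
print(apply_rule("JDoe", "OTHER.REALM"))      # -> None
```

A stray newline in the middle of a rule makes Hadoop see a fragment like "L" as a rule of its own, which is exactly the "Invalid rule: L" in the stack trace.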
Looks
All,
I recently tried to build Spark 1.2 on my enterprise server (which runs Hadoop
2.3 with YARN). Here are the steps I followed for the build:
$ mvn -Pyarn -Phadoop-2.3 -Dhadoop.version=2.3.0 -DskipTests clean package
$ export SPARK_HOME=/path/to/spark/folder
$ export