I'm trying to run Spark on a Hadoop cluster, and I have created this
script to try it:

#!/bin/bash

export HADOOP_CONF_DIR=/etc/hadoop/conf
SPARK_CLASSPATH=""
for lib in /user/local/etc/lib/*.jar
do
        SPARK_CLASSPATH=$SPARK_CLASSPATH:$lib
done
/home/spark-1.1.1-bin-hadoop2.4/bin/spark-submit --name "Streaming" \
    --master yarn-cluster --class com.sparkstreaming.Executor \
    --jars $SPARK_CLASSPATH --executor-memory 10g \
    /user/local/etc/lib/my-spark-streaming-scala.jar

When I execute the script I get this error:

Spark assembly has been built with Hive, including Datanucleus jars on classpath
Exception in thread "main" java.net.URISyntaxException: Expected scheme name at index 0:
:/user/local/etc/lib/akka-actor_2.10-2.2.3-shaded-protobuf.jar:/user/local/etc/lib/akka-remote_2.10-..
....
....
-maths-1.2.2a.jar:/user/local/etc/lib/xmlenc-0.52.jar:/user/local/etc/lib/zkclient-0.3.jar:/user/local/etc/lib/zookeeper-3.4.5.jar
        at java.net.URI$Parser.fail(URI.java:2829)
        at java.net.URI$Parser.failExpecting(URI.java:2835)
        at java.net.URI$Parser.parse(URI.java:3027)
        at java.net.URI.<init>(URI.java:595)
        at org.apache.spark.util.Utils$.resolveURI(Utils.scala:1396)
        at org.apache.spark.util.Utils$$anonfun$resolveURIs$1.apply(Utils.scala:1419)
        at org.apache.spark.util.Utils$$anonfun$resolveURIs$1.apply(Utils.scala:1419)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
        at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:108)
        at org.apache.spark.util.Utils$.resolveURIs(Utils.scala:1419)
        at org.apache.spark.deploy.SparkSubmitArguments.parse$1(SparkSubmitArguments.scala:308)
        at org.apache.spark.deploy.SparkSubmitArguments.parseOpts(SparkSubmitArguments.scala:221)
        at org.apache.spark.deploy.SparkSubmitArguments.<init>(SparkSubmitArguments.scala:65)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:70)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)



Why do I get this error? I have no idea. Any clue?
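One thing I noticed while staring at the error: --jars takes a comma-separated list of jars, while my loop builds a colon-separated one that starts with ":" (the first SPARK_CLASSPATH is empty), so the first "entry" is an empty string. In case that's relevant, here is a sketch of how I could build the list with commas instead; join_jars is just an illustrative helper I made up, not something from Spark:

```shell
#!/bin/bash

# Hypothetical helper: join all arguments with commas, without
# emitting a leading separator for the first element.
join_jars() {
    local out=""
    local lib
    for lib in "$@"; do
        if [ -z "$out" ]; then
            out="$lib"
        else
            out="$out,$lib"
        fi
    done
    printf '%s\n' "$out"
}

# Usage with the same lib directory as in my script:
#   JARS=$(join_jars /user/local/etc/lib/*.jar)
#   .../bin/spark-submit ... --jars "$JARS" ...
```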
