[ https://issues.apache.org/jira/browse/SPARK-16560?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Chaitanya closed SPARK-16560.
-----------------------------
    Resolution: Information Provided

> Spark-submit fails without an error
> -----------------------------------
>
>                 Key: SPARK-16560
>                 URL: https://issues.apache.org/jira/browse/SPARK-16560
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 1.6.2
>         Environment: Raspbian Jessie
>            Reporter: Chaitanya
>
> I used the following command to run the Spark Java word-count example:
>
> time spark-submit --deploy-mode cluster --master spark://192.168.0.7:7077 --class org.apache.spark.examples.JavaWordCount /home/pi/Desktop/example/new/target/javaword.jar /books_500.txt
>
> I copied the same jar file to the same location on every node. (Copying it into HDFS didn't work for me.) When I run it, the following is the output:
>
> Running Spark using the REST application submission protocol.
> 16/07/14 16:32:18 INFO rest.RestSubmissionClient: Submitting a request to launch an application in spark://192.168.0.7:7077.
> 16/07/14 16:32:30 WARN rest.RestSubmissionClient: Unable to connect to server spark://192.168.0.7:7077.
> Warning: Master endpoint spark://192.168.0.7:7077 was not a REST server. Falling back to legacy submission gateway instead.
> 16/07/14 16:32:30 WARN util.Utils: Your hostname, master02 resolves to a loopback address: 127.0.1.1; using 192.168.0.7 instead (on interface wlan0)
> 16/07/14 16:32:30 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
> 16/07/14 16:32:31 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> It just stops there, quits the job, and waits for the next command at the terminal. I can't diagnose this failure because there is no error message. Help needed, please!
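A note on the log output above: the "Your hostname, master02 resolves to a loopback address" warning is Spark itself suggesting a workaround, namely setting SPARK_LOCAL_IP so the driver binds to the cluster-facing interface instead of 127.0.1.1. A minimal sketch of retrying the submission with that variable set, assuming the same master address and jar path as in the report (the address and paths are taken from the log, not verified independently):

```shell
# Sketch, not a confirmed fix: bind Spark to the wlan0 address the log
# reports (192.168.0.7) rather than the loopback address 127.0.1.1.
export SPARK_LOCAL_IP=192.168.0.7

# Re-run the same submission; --verbose makes spark-submit print its
# resolved configuration, which helps when it exits with no error message.
spark-submit --verbose \
  --deploy-mode cluster \
  --master spark://192.168.0.7:7077 \
  --class org.apache.spark.examples.JavaWordCount \
  /home/pi/Desktop/example/new/target/javaword.jar /books_500.txt
```

In cluster deploy mode the driver runs on a worker, so its stdout/stderr land in that worker's work directory and the master web UI (port 8080 by default), not in the submitting terminal; the apparent silence here is consistent with that behavior.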
--
This message was sent by Atlassian JIRA (v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org