Hi folks, 

I have just upgraded to Spark 1.1.0 and tried some of the examples, e.g.: 
    ./run-example SparkPageRank pagerank_data.txt 5 

It turns out that Spark keeps trying to connect to my name node and read the
file from HDFS rather than the local FS: 
    Client: Retrying connect to server: Node1/192.168.0.101:9000. Already
tried 0 time(s) 

Even when I prefix the data file path with "file://", the issue persists.
This does not happen in spark-shell. 
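Concretely, the variants I have tried look like this (the local path here is
just illustrative, not my actual directory layout):

```shell
# Plain relative path -- Spark resolves it against the default filesystem
# (HDFS in my setup), which triggers the connect retries to Node1:9000:
./run-example SparkPageRank pagerank_data.txt 5

# Fully qualified local path -- I expected this to force the local FS,
# but it produces the same retry messages:
./run-example SparkPageRank file:///home/max/pagerank_data.txt 5
```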

Is there anything I am missing in my configuration, or in the way I specify
the path? 

Thanks, 

Max



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Running-Example-in-local-mode-with-input-files-tp16186.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
