Hi,

I'd call it a known issue on Windows. I don't have a real solution, only a workaround: set SPARK_LOCAL_HOSTNAME or SPARK_LOCAL_IP before starting the pyspark shell.
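A minimal sketch of what I mean, in Python (assumptions: the `socket`-based lookup below is only a rough analogue of the JVM-side `java.net.InetAddress.getLocalHost()` call that Spark actually makes, and `127.0.0.1` is just a placeholder value — use a reachable address of your own machine):

```python
import os
import socket

# Diagnostic: approximate, from Python, the hostname lookup Spark performs
# in the JVM. If this resolves to a loopback or link-local address, you will
# see the same "couldn't find any external IP address" warning.
hostname = socket.gethostname()
resolved = socket.gethostbyname(hostname)
print(hostname, "->", resolved)

# Workaround: pin the address Spark binds to. In practice set this in the
# environment of the shell that launches pyspark, e.g. on Windows:
#     set SPARK_LOCAL_IP=192.168.1.23
# (placeholder IP) before running bin\pyspark. Setting it here only helps
# if done before PySpark starts its JVM.
os.environ["SPARK_LOCAL_IP"] = "127.0.0.1"  # placeholder; use your LAN IP
# or equivalently:
# os.environ["SPARK_LOCAL_HOSTNAME"] = "localhost"
```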
I wish I had access to Win7 to work on it longer and find a decent solution (not a workaround). If you have a Scala REPL, execute java.net.InetAddress.getLocalHost(), which Spark runs under the covers before hitting the network-related issue.

Regards,
Jacek

--
Jacek Laskowski | https://medium.com/@jaceklaskowski/ | http://blog.jaceklaskowski.pl
Mastering Spark https://jaceklaskowski.gitbooks.io/mastering-apache-spark/
Follow me at https://twitter.com/jaceklaskowski
Upvote at http://stackoverflow.com/users/1305344/jacek-laskowski

On Thu, Nov 26, 2015 at 8:33 AM, Shuo Wang <shuo.x.w...@gmail.com> wrote:
> I am not sure if my message is getting through the mailing list.
>
> After running these two lines from the Quick Start example in Spark's
> Python shell on Windows 7:
>
> >>> textFile = sc.textFile("README.md")
> >>> textFile.count()
>
> I am getting the following error:
>
> >>> textFile.count()
> 15/11/25 19:57:01 WARN : Your hostname, oh_t-PC resolves to a
> loopback/non-reachable address: fe80:0:0:0:84b:213f:3f57:fef6%net5, but we
> couldn't find any external IP address!
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
>   File "C:\spark-1.5.2-bin-hadoop2.6\python\pyspark\rdd.py", line 1006,
>   in count
>   ........
>
> Any idea what is going wrong here?
> --
> 王硕
> Email: shuo.x.w...@gmail.com
> Whatever your journey, keep walking.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org