My best guess would be a networking issue: it looks like the Python socket library can't resolve the hostname the PySpark shell tries to bind to.
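
For what it's worth, the failing call in your traceback is PySpark binding its accumulator server to ("localhost", 0) (accumulators.py line 251), so you can test this without Spark at all. Here's a minimal sketch, assuming the problem is name resolution on the machine you launch pyspark from:

import socket

# PySpark 1.1.0 binds a SocketServer to ("localhost", 0); if "localhost"
# itself cannot be resolved, that bind raises the same gaierror as in
# the traceback below.
try:
    socket.getaddrinfo("localhost", 0)   # name resolution only
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.bind(("localhost", 0))             # the same bind PySpark attempts
    print "localhost resolves; bound to port %d" % s.getsockname()[1]
    s.close()
except socket.gaierror as e:
    print "resolution failed, same as the PySpark error:", e

If getaddrinfo fails here too, the usual culprit is a missing "127.0.0.1 localhost" entry in /etc/hosts on that machine.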

On 11/18/14 9:10 AM, amin mohebbi wrote:
Hi there,

I have already downloaded the pre-built spark-1.1.0 and want to run pyspark by typing ./bin/pyspark, but I get the error shown below.

The Scala shell comes up and works fine:

hduser@master:~/Downloads/spark-1.1.0$ ./bin/spark-shell
Java HotSpot(TM) Client VM warning: ignoring option MaxPermSize=128m; support was removed in 8.0
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
.....
.....
14/11/18 04:33:13 INFO AkkaUtils: Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@master:34937/user/HeartbeatReceiver
14/11/18 04:33:13 INFO SparkILoop: Created spark context..
Spark context available as sc.

scala>
hduser@master:~/Downloads/spark-1.1.0$


But the Python shell does not work:

hduser@master:~/Downloads/spark-1.1.0$ ./bin/pyspark
Python 2.7.3 (default, Feb 27 2014, 20:00:17)
[GCC 4.6.3] on linux2
Type "help", "copyright", "credits" or "license" for more information.
Java HotSpot(TM) Client VM warning: ignoring option MaxPermSize=128m; support was removed in 8.0
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
14/11/18 04:36:06 INFO SecurityManager: Changing view acls to: hduser,
14/11/18 04:36:06 INFO SecurityManager: Changing modify acls to: hduser,
14/11/18 04:36:06 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hduser, ); users with modify permissions: Set(hduser, )
14/11/18 04:36:06 INFO Slf4jLogger: Slf4jLogger started
14/11/18 04:36:06 INFO Remoting: Starting remoting
14/11/18 04:36:06 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@master:52317]
14/11/18 04:36:06 INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkDriver@master:52317]
14/11/18 04:36:06 INFO Utils: Successfully started service 'sparkDriver' on port 52317.
14/11/18 04:36:06 INFO SparkEnv: Registering MapOutputTracker
14/11/18 04:36:06 INFO SparkEnv: Registering BlockManagerMaster
14/11/18 04:36:06 INFO DiskBlockManager: Created local directory at /tmp/spark-local-20141118043606-c346
14/11/18 04:36:07 INFO Utils: Successfully started service 'Connection manager for block manager' on port 47507.
14/11/18 04:36:07 INFO ConnectionManager: Bound socket to port 47507 with id = ConnectionManagerId(master,47507)
14/11/18 04:36:07 INFO MemoryStore: MemoryStore started with capacity 267.3 MB
14/11/18 04:36:07 INFO BlockManagerMaster: Trying to register BlockManager
14/11/18 04:36:07 INFO BlockManagerMasterActor: Registering block manager master:47507 with 267.3 MB RAM
14/11/18 04:36:07 INFO BlockManagerMaster: Registered BlockManager
14/11/18 04:36:07 INFO HttpFileServer: HTTP File server directory is /tmp/spark-8b29544a-c74b-4a3e-88e0-13801c8dcc65
14/11/18 04:36:07 INFO HttpServer: Starting HTTP Server
14/11/18 04:36:07 INFO Utils: Successfully started service 'HTTP file server' on port 40029.
14/11/18 04:36:12 INFO Utils: Successfully started service 'SparkUI' on port 4040.
14/11/18 04:36:12 INFO SparkUI: Started SparkUI at http://master:4040
14/11/18 04:36:12 INFO AkkaUtils: Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@master:52317/user/HeartbeatReceiver
14/11/18 04:36:12 INFO SparkUI: Stopped Spark web UI at http://master:4040
14/11/18 04:36:12 INFO DAGScheduler: Stopping DAGScheduler
14/11/18 04:36:13 INFO MapOutputTrackerMasterActor: MapOutputTrackerActor stopped!
14/11/18 04:36:13 INFO ConnectionManager: Selector thread was interrupted!
14/11/18 04:36:13 INFO ConnectionManager: ConnectionManager stopped
14/11/18 04:36:13 INFO MemoryStore: MemoryStore cleared
14/11/18 04:36:13 INFO BlockManager: BlockManager stopped
14/11/18 04:36:13 INFO BlockManagerMaster: BlockManagerMaster stopped
14/11/18 04:36:13 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
14/11/18 04:36:13 INFO SparkContext: Successfully stopped SparkContext
14/11/18 04:36:13 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
14/11/18 04:36:13 INFO Remoting: Remoting shut down
14/11/18 04:36:13 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
Traceback (most recent call last):
  File "/home/hduser/Downloads/spark-1.1.0/python/pyspark/shell.py", line 44, in <module>
    sc = SparkContext(appName="PySparkShell", pyFiles=add_files)
  File "/home/hduser/Downloads/spark-1.1.0/python/pyspark/context.py", line 107, in __init__
    conf)
  File "/home/hduser/Downloads/spark-1.1.0/python/pyspark/context.py", line 159, in _do_init
    self._accumulatorServer = accumulators._start_update_server()
  File "/home/hduser/Downloads/spark-1.1.0/python/pyspark/accumulators.py", line 251, in _start_update_server
    server = AccumulatorServer(("localhost", 0), _UpdateRequestHandler)
  File "/usr/lib/python2.7/SocketServer.py", line 408, in __init__
    self.server_bind()
  File "/usr/lib/python2.7/SocketServer.py", line 419, in server_bind
    self.socket.bind(self.server_address)
  File "/usr/lib/python2.7/socket.py", line 224, in meth
    return getattr(self._sock,name)(*args)
socket.gaierror: [Errno -5] No address associated with hostname
>>> sc.parallelize(range(1000)).count()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
NameError: name 'sc' is not defined
>>> sc
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
NameError: name 'sc' is not defined
>>> spark
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
NameError: name 'spark' is not defined
>>>

Best Regards

.......................................................

Amin Mohebbi

PhD candidate in Software Engineering at University of Malaysia

Tel : +60 18 2040 017



E-Mail : tp025...@ex.apiit.edu.my

              amin_...@me.com
