I am running Python 2.7.3 and IPython Notebook 2.1.0. I installed Spark in my home directory. When I print sys.path, it shows:

['/home/felix/spark-1.1.0/python/lib/py4j-0.8.1-src.zip',
 '/home/felix/spark-1.1.0/python',
 '',
 '/opt/bluekai/python/src/bk',
 '/usr/local/lib/python2.7/dist-packages/setuptools-6.1-py2.7.egg',
 '/usr/lib/python2.7',
 '/usr/lib/python2.7/plat-linux2',
 '/usr/lib/python2.7/lib-tk',
 '/usr/lib/python2.7/lib-old',
 '/usr/lib/python2.7/lib-dynload',
 '/usr/local/lib/python2.7/dist-packages',
 '/usr/lib/python2.7/dist-packages',
 '/usr/lib/python2.7/dist-packages/PIL',
 '/usr/lib/python2.7/dist-packages/gtk-2.0',
 '/usr/lib/pymodules/python2.7',
 '/usr/local/lib/python2.7/dist-packages/IPython/extensions',
 '/home/felix/.ipython']
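In case it helps, the path setup I followed looks roughly like this (SPARK_HOME here is just where I unpacked Spark; adjust for your install):

```python
import os
import sys

# Where Spark was unpacked; this is my location, yours may differ.
SPARK_HOME = os.path.expanduser("~/spark-1.1.0")

# Both the pyspark package and the bundled py4j zip need to be on sys.path,
# or imports inside pyspark (e.g. accumulators) can fail.
sys.path.insert(0, os.path.join(SPARK_HOME, "python", "lib", "py4j-0.8.1-src.zip"))
sys.path.insert(0, os.path.join(SPARK_HOME, "python"))

# With the paths in place, this import should succeed:
# from pyspark import SparkContext
```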
When I try to import SparkContext, I get:

---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
<ipython-input-3-0e976f4d3617> in <module>()
----> 1 from pyspark import SparkContext as sc
      2
      3 print sc

/home/felix/spark-1.1.0/python/pyspark/__init__.py in <module>()
     61
     62 from pyspark.conf import SparkConf
---> 63 from pyspark.context import SparkContext
     64 from pyspark.sql import SQLContext
     65 from pyspark.rdd import RDD

/home/felix/spark-1.1.0/python/pyspark/context.py in <module>()
     23 from collections import namedtuple
     24
---> 25 from pyspark import accumulators
     26 from pyspark.accumulators import Accumulator
     27 from pyspark.broadcast import Broadcast

ImportError: cannot import name accumulators

I followed the instructions at http://blog.cloudera.com/blog/2014/08/how-to-use-ipython-notebook-with-apache-spark/

Thanks,
Felix

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/cannot-import-name-accumulators-in-python-2-7-tp18015.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.