I'm using this system:

Hadoop 1.0.4
Scala 2.9.3
Hive 0.9.0


along with Spark 1.1.0. When I import pyspark, I get this error:

>>> from pyspark.sql import *
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
  File "/<path>/spark-1.1.0/python/pyspark/__init__.py", line 63, in ?
    from pyspark.context import SparkContext
  File "/<path>/spark-1.1.0/python/pyspark/context.py", line 209
    with SparkContext._lock:
                    ^
SyntaxError: invalid syntax
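
For what it's worth, the failing line uses Python's "with" statement, which only became plain syntax in Python 2.6, and as far as I can tell Spark 1.1.0's PySpark expects Python 2.6 or newer, so this may be a Python version problem on my end. This is how I'm checking which interpreter the shell picks up (assuming the plain python on my PATH is the same one PySpark launches):

>>> import sys
>>> sys.version          # shows which Python version and build this shell is running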

How do I fix it?

Thank you,
