SparkContext._lock Error

2014-11-05 Thread Pagliari, Roberto
I'm using this system:

Hadoop 1.0.4
Scala 2.9.3
Hive 0.9.0


With Spark 1.1.0. When importing pyspark, I get this error:

>>> from pyspark.sql import *
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
  File "/path/spark-1.1.0/python/pyspark/__init__.py", line 63, in ?
    from pyspark.context import SparkContext
  File "/path/spark-1.1.0/python/pyspark/context.py", line 209
    with SparkContext._lock:
                           ^
SyntaxError: invalid syntax

How do I fix it?

Thank you,


Re: SparkContext._lock Error

2014-11-05 Thread Davies Liu
What's the version of Python? 2.4?
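
The "with" statement only exists from Python 2.5 on, and PySpark 1.1
requires Python 2.6+, so an older system Python fails to even parse
context.py at that line. A minimal check, assuming it is run with the
same interpreter that bin/pyspark picks up:

    import sys
    # PySpark 1.1 needs Python 2.6 or newer; the "with" statement at
    # context.py line 209 is a SyntaxError on Python 2.4.
    print sys.version_info

If a newer interpreter is installed alongside the system one, pointing
PYSPARK_PYTHON at it before launching bin/pyspark (e.g. export
PYSPARK_PYTHON=python2.6, assuming that binary is on your PATH) should
get past the SyntaxError.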

Davies

On Wed, Nov 5, 2014 at 4:21 PM, Pagliari, Roberto
rpagli...@appcomsci.com wrote:
> I'm using this system:
>
> Hadoop 1.0.4
> Scala 2.9.3
> Hive 0.9.0
>
> With Spark 1.1.0. When importing pyspark, I get this error:
>
> >>> from pyspark.sql import *
> Traceback (most recent call last):
>   File "<stdin>", line 1, in ?
>   File "/path/spark-1.1.0/python/pyspark/__init__.py", line 63, in ?
>     from pyspark.context import SparkContext
>   File "/path/spark-1.1.0/python/pyspark/context.py", line 209
>     with SparkContext._lock:
>                            ^
> SyntaxError: invalid syntax
>
> How do I fix it?
>
> Thank you,
