Re: SparkContext._lock Error

2014-11-05 Thread Davies Liu
PySpark requires Python 2.6 or 2.7.
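For context on the traceback below: the `with` statement only became standard syntax in Python 2.6 (it required `from __future__ import with_statement` in 2.5 and does not exist at all in 2.4), so on Python 2.4 the interpreter rejects `pyspark/context.py` at parse time, before any Spark code runs. A minimal sketch of a version guard you could run before importing pyspark (the helper name here is my own illustration, not part of PySpark):

```python
import sys

def supports_pyspark_110(version_info=sys.version_info):
    """Return True if this interpreter meets PySpark 1.1.0's
    requirement of Python 2.6 or 2.7."""
    # Compare only (major, minor); the micro version does not matter here.
    return tuple(version_info[:2]) in ((2, 6), (2, 7))

# A Python 2.4 interpreter fails this check, which matches the
# SyntaxError seen when importing pyspark there.
```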

On Wed, Nov 5, 2014 at 5:32 PM, Pagliari, Roberto
 wrote:
> I'm not on the cluster now so I cannot check. What is the minimum requirement 
> for Python?
>
> Thanks,
>
> 
> From: Davies Liu [dav...@databricks.com]
> Sent: Wednesday, November 05, 2014 7:41 PM
> To: Pagliari, Roberto
> Cc: user@spark.apache.org
> Subject: Re: SparkContext._lock Error
>
> What's the version of Python? 2.4?
>
> Davies
>
> On Wed, Nov 5, 2014 at 4:21 PM, Pagliari, Roberto
>  wrote:
>> I’m using this system
>>
>>
>>
>> Hadoop 1.0.4
>>
>> Scala 2.9.3
>>
>> Hive 0.9.0
>>
>>
>>
>>
>>
>> with Spark 1.1.0. When importing pyspark, I’m getting this error:
>>
>>
>>
>>>>> from pyspark.sql import *
>>
>> Traceback (most recent call last):
>>
>>   File "<stdin>", line 1, in ?
>>
>>   File "//spark-1.1.0/python/pyspark/__init__.py", line 63, in ?
>>
>> from pyspark.context import SparkContext
>>
>>   File "//spark-1.1.0/python/pyspark/context.py", line 209
>>
>> with SparkContext._lock:
>>
>> ^
>>
>> SyntaxError: invalid syntax
>>
>>
>>
>> How do I fix it?
>>
>>
>>
>> Thank you,

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org


