Re: Pyspark not using all cores

2015-03-10 Thread Akhil Das
> sconf = SparkConf().setAppName('Calculating PI')
>
> # local
> # sc = SparkContext(conf=sconf)
>
> # standalone
> sc = SparkContext("spark://:7077", conf=sconf)
>
> # yarn
> # sc = SparkContext("yarn-client", conf=sconf)
>
> rdd_pies = sc.parallelize(range(1), 1000)
> rdd_pies.map(lambda x: pi()).collect()
> sc.stop()
>
> =====
>
> Does anyone have any suggestions, or know of any config we should be
> looking at that could solve this problem? Does anyone else see the same
> problem?
>
> Any help is appreciated, Thanks.
>
>
>
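A minimal, self-contained sketch of one way to keep every core busy on a job shaped like the quoted one. The monte_carlo_pi() helper, the local[*] master, and the parallelism numbers below are illustrative assumptions, not taken from the original post:

import random

from pyspark import SparkConf, SparkContext


def monte_carlo_pi(samples=100000):
    # Estimate pi by sampling random points in the unit square and
    # counting the share that lands inside the quarter circle.
    inside = sum(1 for _ in range(samples)
                 if random.random() ** 2 + random.random() ** 2 <= 1.0)
    return 4.0 * inside / samples


sconf = (SparkConf()
         .setAppName('Calculating PI')
         .setMaster('local[*]')                    # all local cores; use spark://host:7077 for standalone
         .set('spark.default.parallelism', '16'))  # illustrative value
sc = SparkContext(conf=sconf)

# 1000 elements spread over 16 partitions: every partition carries real work,
# so the scheduler has enough non-empty tasks to occupy all available cores.
estimates = sc.parallelize(range(1000), 16).map(lambda _: monte_carlo_pi()).collect()
print(sum(estimates) / len(estimates))
sc.stop()

In standalone mode, spark.cores.max (or the --total-executor-cores flag to spark-submit) is the setting that caps how many cores the application can take.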


Pyspark not using all cores

2015-03-10 Thread htailor
Does anyone have any suggestions, or know of any config we should be
looking at that could solve this problem? Does anyone else see the same problem?

Any help is appreciated, Thanks.
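
A small diagnostic sketch, assuming a live SparkContext named sc as in the quoted script near the top of this page. The configuration keys are standard Spark properties; the last two lines only show how the quoted parallelize call spreads its data across partitions:

conf = sc.getConf()
print(sc.master)                                    # master URL the context is bound to
print(sc.defaultParallelism)                        # default partition count for new RDDs
print(conf.get('spark.cores.max', 'not set'))       # standalone: cap on total cores per application
print(conf.get('spark.executor.cores', 'not set'))  # cores granted to each executor

rdd = sc.parallelize(range(1), 1000)                # the call as quoted in the thread
print(rdd.getNumPartitions())                       # 1000 partitions are created,
print(rdd.glom().map(len).filter(lambda n: n > 0).count())  # but only the non-empty ones do real work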



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Pyspark-not-using-all-cores-tp21989.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org