Hi,

Thanks for your interest in PySpark.

The first thing is to have a look at the "how to contribute" guide
https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark and
filter the JIRAs using the PySpark label.

If you have your own improvement in mind, you can file a JIRA, discuss it
there, and then send a Pull Request.

HTH

Regards.

On Fri, Jun 12, 2015 at 9:36 AM, Usman Ehtesham <uehtesha...@gmail.com>
wrote:

> Hello,
>
> I am currently taking a course on Apache Spark via edX (
> https://www.edx.org/course/introduction-big-data-apache-spark-uc-berkeleyx-cs100-1x)
> and at the same time I am trying to look at the code for PySpark too. I
> wanted to ask: if I would like to contribute to PySpark specifically, how
> can I do that? I do not intend to contribute to core Apache Spark any time
> soon (mainly because I do not know Scala), but I am very comfortable in
> Python.
>
> Any tips on how to contribute specifically to PySpark without being
> affected by other parts of Spark would be greatly appreciated.
>
> P.S.: I ask this because there is a small change/improvement I would like
> to propose. Also, since I have just started learning Spark, I would like
> to read and understand the PySpark code as I learn. :)
>
> Hope to hear from you soon.
>
> Usman Ehtesham Gul
> https://github.com/ueg1990
>



-- 
Godspeed,
Manoj Kumar,
http://manojbits.wordpress.com
http://github.com/MechCoder
