[ https://issues.apache.org/jira/browse/SPARK-13489?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15178718#comment-15178718 ]

Joseph K. Bradley commented on SPARK-13489:
-------------------------------------------

+1 for expanding the Python and R APIs.  [~MechCoder]'s work was great last 
year, and the expanded APIs help a lot of users.

[~vectorijk] I'd recommend doing some searching for the ML + PySpark component 
tags: 
[https://issues.apache.org/jira/issues/?jql=project%20%3D%20SPARK%20AND%20status%20in%20(Open%2C%20%22In%20Progress%22%2C%20Reopened)%20AND%20component%20in%20(ML%2C%20MLlib)%20AND%20component%20in%20(PySpark)]
as well as ML + SparkR tags: 
[https://issues.apache.org/jira/issues/?jql=project%20%3D%20SPARK%20AND%20status%20in%20(Open%2C%20%22In%20Progress%22%2C%20Reopened)%20AND%20component%20in%20(ML%2C%20MLlib)%20AND%20component%20in%20(SparkR)]
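For reference, the URL-encoded queries in the two links above decode to the following JQL (pasting either into the JIRA issue search box should give the same results):

```
-- Open ML/MLlib issues also tagged PySpark
project = SPARK AND status in (Open, "In Progress", Reopened)
  AND component in (ML, MLlib) AND component in (PySpark)

-- Open ML/MLlib issues also tagged SparkR
project = SPARK AND status in (Open, "In Progress", Reopened)
  AND component in (ML, MLlib) AND component in (SparkR)
```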

For Python, I'd recommend finding some major missing items, but also listing 
one general item in the proposal: overall Python API coverage.
For R, there are lots of missing items, so I'd recommend picking the most 
important models which are missing.

> GSoC 2016 project ideas for MLlib
> ---------------------------------
>
>                 Key: SPARK-13489
>                 URL: https://issues.apache.org/jira/browse/SPARK-13489
>             Project: Spark
>          Issue Type: Brainstorming
>          Components: ML
>            Reporter: Xiangrui Meng
>            Assignee: Xiangrui Meng
>            Priority: Minor
>
> I want to use this JIRA to collect some GSoC project ideas for MLlib. 
> Ideally, the student should have contributed to Spark. And the content of the 
> project could be divided into small functional pieces so that it won't get 
> stalled if the mentor is temporarily unavailable.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
