[jira] [Commented] (SPARK-40334) Implement `GroupBy.prod`.

2022-09-18 Thread Apache Spark (Jira)


[ https://issues.apache.org/jira/browse/SPARK-40334?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17606345#comment-17606345 ]

Apache Spark commented on SPARK-40334:
--

User 'ayudovin' has created a pull request for this issue:
https://github.com/apache/spark/pull/37923

> Implement `GroupBy.prod`.
> -
>
> Key: SPARK-40334
> URL: https://issues.apache.org/jira/browse/SPARK-40334
> Project: Spark
>  Issue Type: Sub-task
>  Components: Pandas API on Spark
>Affects Versions: 3.4.0
>Reporter: Haejoon Lee
>Assignee: Artsiom Yudovin
>Priority: Major
>
> We should implement `GroupBy.prod` to increase pandas API coverage.
> pandas docs: 
> https://pandas.pydata.org/docs/reference/api/pandas.core.groupby.GroupBy.prod.html
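
For reference, here is a minimal sketch of the behavior being mirrored; it is not part of the ticket, and the column names and data are made up for illustration. The pandas calls exist today, while the pandas-on-Spark call at the end shows the expected usage once this ticket is resolved (Spark 3.4+).

    import pandas as pd
    import pyspark.pandas as ps

    pdf = pd.DataFrame({"group": ["a", "a", "b", "b"], "value": [2, 3, 4, 5]})

    # pandas reference behavior: multiply the non-NA values within each group.
    print(pdf.groupby("group")["value"].prod())  # a -> 6, b -> 20

    # min_count follows the usual pandas semantics: a group with fewer than
    # min_count non-NA values yields NaN instead of a product.
    print(pdf.groupby("group")["value"].prod(min_count=3))

    # Expected pandas-on-Spark equivalent once GroupBy.prod is implemented
    # (illustrative only; it should mirror the pandas result above).
    psdf = ps.from_pandas(pdf)
    print(psdf.groupby("group")["value"].prod())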






[jira] [Commented] (SPARK-40334) Implement `GroupBy.prod`.

2022-09-14 Thread Artsiom Yudovin (Jira)


[ https://issues.apache.org/jira/browse/SPARK-40334?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17604841#comment-17604841 ]

Artsiom Yudovin commented on SPARK-40334:
-

Got you, thank you so much!







[jira] [Commented] (SPARK-40334) Implement `GroupBy.prod`.

2022-09-13 Thread Haejoon Lee (Jira)


[ https://issues.apache.org/jira/browse/SPARK-40334?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17603611#comment-17603611 ]

Haejoon Lee commented on SPARK-40334:
-

[~ayudovin] No worries! Please keep working on it!

FYI: to avoid conflicts, you can leave a comment like "I'm working on this" before you start :)







[jira] [Commented] (SPARK-40334) Implement `GroupBy.prod`.

2022-09-13 Thread Artsiom Yudovin (Jira)


[ https://issues.apache.org/jira/browse/SPARK-40334?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17603463#comment-17603463 ]

Artsiom Yudovin commented on SPARK-40334:
-

Hi [~itholic], I started working on this ticket two days ago. Does it make sense to continue, or should I choose another ticket?



