[ https://issues.apache.org/jira/browse/SPARK-17055?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15432427#comment-15432427 ]

Sean Owen commented on SPARK-17055:
-----------------------------------

From a comment in the PR, I get it. This is not actually about labels, but
about some arbitrary attribute or function of each example. The purpose is to
group examples into train/test such that examples with the same attribute
value always go into the same data set. So maybe you want all examples for one
customer ID to go into train, or all into test, but not split across both.
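For concreteness, a minimal sketch of those semantics using scikit-learn's
GroupKFold (the name LabelKFold was eventually renamed to); the data and
group values below are made up for illustration:

{code:python}
from sklearn.model_selection import GroupKFold
import numpy as np

X = np.arange(8).reshape(8, 1)                # 8 examples, 1 feature each
y = np.array([0, 1, 0, 1, 0, 1, 0, 1])        # class labels (irrelevant to the split)
groups = np.array([1, 1, 2, 2, 3, 3, 4, 4])   # e.g. hypothetical customer IDs

gkf = GroupKFold(n_splits=2)
for train_idx, test_idx in gkf.split(X, y, groups=groups):
    # No group value ever appears on both sides of the split.
    assert set(groups[train_idx]).isdisjoint(groups[test_idx])
    print(groups[train_idx], groups[test_idx])
{code}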

This needs a different name, I think, because 'label' has a specific and
different meaning, and even scikit-learn says they want to rename it. The idea
is coherent, but I still don't know how useful it is. It would need to be
reconstructed for Spark ML.
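As a rough illustration of what a reconstruction for Spark ML could look like
(not an existing CrossValidator feature): deterministically assign each group
to a fold by hashing the group column, so all rows sharing a group value land
in the same fold. The "customerId" column and the hashing scheme are
assumptions for this sketch:

{code:python}
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame(
    [(1, 0.5), (1, 0.7), (2, 0.1), (3, 0.9)],
    ["customerId", "feature"],
)

k = 3
# All rows with the same customerId get the same fold ID.
# Note: fold sizes may be unbalanced, since groups are hashed, not counted.
df = df.withColumn("fold", F.abs(F.hash("customerId")) % k)

for i in range(k):
    train = df.filter(F.col("fold") != i)
    test = df.filter(F.col("fold") == i)
    # ... fit the estimator on `train`, evaluate on `test` ...
{code}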

> add labelKFold to CrossValidator
> --------------------------------
>
>                 Key: SPARK-17055
>                 URL: https://issues.apache.org/jira/browse/SPARK-17055
>             Project: Spark
>          Issue Type: New Feature
>          Components: MLlib
>            Reporter: Vincent
>            Priority: Minor
>
> Current CrossValidator only supports k-fold, which randomly divides all the 
> samples into k groups. But when data is gathered from different subjects 
> and we want to avoid over-fitting, we want to hold out all samples with 
> certain labels from the training data and put them into the validation 
> fold, i.e. we want to ensure that the same label does not appear in both 
> the training and test sets.
> Mainstream packages like scikit-learn already support such a cross-validation 
> method. 
> (http://scikit-learn.org/stable/modules/generated/sklearn.cross_validation.LabelKFold.html#sklearn.cross_validation.LabelKFold)


