[jira] [Commented] (SPARK-20708) Make `addExclusionRules` up-to-date
[ https://issues.apache.org/jira/browse/SPARK-20708?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16032475#comment-16032475 ]

Burak Yavuz commented on SPARK-20708:
-------------------------------------

Resolved by https://github.com/apache/spark/pull/17947

> Make `addExclusionRules` up-to-date
> -----------------------------------
>
>                 Key: SPARK-20708
>                 URL: https://issues.apache.org/jira/browse/SPARK-20708
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.0.2, 2.1.1
>            Reporter: Dongjoon Hyun
>            Assignee: Dongjoon Hyun
>             Fix For: 2.3.0
>
> Since SPARK-9263, `resolveMavenCoordinates` ignores Spark and Spark's dependencies by using `addExclusionRules`. This PR aims to bring `addExclusionRules` up to date, because it currently fails to exclude some components, as shown below.
> *mllib (correct)*
> {code}
> $ bin/spark-shell --packages org.apache.spark:spark-mllib_2.11:2.1.1
> ...
> ---------------------------------------------------------------------
> |                  |            modules            ||   artifacts   |
> |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
> ---------------------------------------------------------------------
> |      default     |   0   |   0   |   0   |   0   ||   0   |   0   |
> ---------------------------------------------------------------------
> {code}
> *mllib-local (wrong)*
> {code}
> $ bin/spark-shell --packages org.apache.spark:spark-mllib-local_2.11:2.1.1
> ...
> ---------------------------------------------------------------------
> |                  |            modules            ||   artifacts   |
> |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
> ---------------------------------------------------------------------
> |      default     |   15  |   2   |   2   |   0   ||   15  |   2   |
> ---------------------------------------------------------------------
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
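For context, the exclusion mechanism the issue describes can be sketched roughly as follows against the Apache Ivy API. This is an illustrative sketch, not Spark's actual code: the helper name `addSparkExclusions` and the component list are assumptions, and the point of SPARK-20708 is precisely that such a hard-coded list goes stale when modules like `mllib-local` are added.

```scala
import org.apache.ivy.core.module.descriptor.{DefaultExcludeRule, DefaultModuleDescriptor}
import org.apache.ivy.core.module.id.{ArtifactId, ModuleId}
import org.apache.ivy.plugins.matcher.GlobPatternMatcher

object SparkExclusions {
  // Sketch: register a glob-based exclude rule per Spark module so that
  // Spark artifacts pulled in transitively by --packages are not re-downloaded.
  // NOTE: the list below is hypothetical; SPARK-20708 arose because newer
  // modules such as "mllib-local" were missing from the real list.
  def addSparkExclusions(md: DefaultModuleDescriptor, ivyConfName: String): Unit = {
    val components = Seq("catalyst", "core", "graphx", "hive", "mllib",
      "mllib-local", "repl", "sql", "streaming", "yarn")
    components.foreach { comp =>
      // Match any org.apache.spark artifact whose name starts with spark-<comp>.
      val artifactId = new ArtifactId(
        new ModuleId("org.apache.spark", s"spark-$comp*"), "*", "*", "*")
      val rule = new DefaultExcludeRule(artifactId, GlobPatternMatcher.INSTANCE, null)
      rule.addConfiguration(ivyConfName)
      md.addExcludeRule(rule)
    }
  }
}
```

With a list like this, resolving `spark-mllib-local_2.11` would report 0 downloaded artifacts, as in the "correct" table above, because every matching module is excluded before resolution.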
[jira] [Commented] (SPARK-20708) Make `addExclusionRules` up-to-date
[ https://issues.apache.org/jira/browse/SPARK-20708?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16006079#comment-16006079 ]

Apache Spark commented on SPARK-20708:
--------------------------------------

User 'dongjoon-hyun' has created a pull request for this issue:
https://github.com/apache/spark/pull/17947

> Make `addExclusionRules` up-to-date
> -----------------------------------
>
>                 Key: SPARK-20708
>                 URL: https://issues.apache.org/jira/browse/SPARK-20708
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell, Spark Submit
>    Affects Versions: 2.0.2, 2.1.1
>            Reporter: Dongjoon Hyun