[ 
https://issues.apache.org/jira/browse/SPARK-21544?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16726156#comment-16726156
 ] 

Matt Foley commented on SPARK-21544:
------------------------------------

The problem described in detail in the Description above — the spark-streaming 
module failing to deploy due to duplicate attachment of the shaded test jar — 
was not fixed by the accepted patch (which nevertheless DID fix the different 
but similar-looking problems with sql/catalyst and sql/core).  It still happens 
with v2.4.0 when attempting to deploy to an internal repository.

Unlike the other problems fixed here, the problem with spark-streaming is not 
due to bugs in the Spark project poms. Rather, its root cause is MSHADE-259, 
which I'm in the process of providing a PR for.  I was also unable to find a 
workaround that could be applied from within the Spark project poms.
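For context, the duplicate attachment comes from the interaction of the 
maven-jar-plugin test-jar goal with the shade plugin's shadeTestJar option. A 
minimal sketch of the two plugin configurations involved — illustrative only, 
not Spark's exact pom:

{code:xml}
<!-- maven-jar-plugin attaches the plain test jar (classifier "tests") -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-jar-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <goal>test-jar</goal>
      </goals>
    </execution>
  </executions>
</plugin>

<!-- maven-shade-plugin with shadeTestJar=true then attaches a second
     artifact with the same "tests" classifier, so install/deploy sees
     two attached artifacts pointing at the same file -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <shadeTestJar>true</shadeTestJar>
  </configuration>
</plugin>
{code}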

For those users needing to deploy shaded test jars to internal repos, hopefully 
the maven-shade-plugin team will publish a fixed release soon.  Until then, you 
might consider using a private build of maven-shade-plugin with the patch from 
MSHADE-259 applied.  It turns out to be pretty simple to do.
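If you do build a patched plugin, pinning it in pluginManagement is enough for 
the whole build to pick it up — a sketch, where the version number is a 
hypothetical label for your private build installed to the local repo or an 
internal mirror:

{code:xml}
<pluginManagement>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <!-- hypothetical version of the privately built plugin
           carrying the MSHADE-259 patch -->
      <version>3.2.1-mshade259-SNAPSHOT</version>
    </plugin>
  </plugins>
</pluginManagement>
{code}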

> Test jar of some module should not install or deploy twice
> ----------------------------------------------------------
>
>                 Key: SPARK-21544
>                 URL: https://issues.apache.org/jira/browse/SPARK-21544
>             Project: Spark
>          Issue Type: Improvement
>          Components: Deploy
>    Affects Versions: 1.6.1, 2.1.0
>            Reporter: zhoukang
>            Assignee: zhoukang
>            Priority: Minor
>             Fix For: 2.3.0
>
>
> For the modules below:
> common/network-common
> streaming
> sql/core
> sql/catalyst
> the tests.jar will be installed or deployed twice, like:
> {code:java}
> [INFO] Installing 
> /home/mi/Work/Spark/scala2.11/spark/streaming/target/spark-streaming_2.11-2.1.0-mdh2.1.0.1-SNAPSHOT-tests.jar
>  to 
> /home/mi/.m2/repository/org/apache/spark/spark-streaming_2.11/2.1.0-mdh2.1.0.1-SNAPSHOT/spark-streaming_2.11-2.1.0-mdh2.1.0.1-SNAPSHOT-tests.jar
> [DEBUG] Writing tracking file 
> /home/mi/.m2/repository/org/apache/spark/spark-streaming_2.11/2.1.0-mdh2.1.0.1-SNAPSHOT/_remote.repositories
> [DEBUG] Installing 
> org.apache.spark:spark-streaming_2.11:2.1.0-mdh2.1.0.1-SNAPSHOT/maven-metadata.xml
>  to 
> /home/mi/.m2/repository/org/apache/spark/spark-streaming_2.11/2.1.0-mdh2.1.0.1-SNAPSHOT/maven-metadata-local.xml
> [DEBUG] Installing org.apache.spark:spark-streaming_2.11/maven-metadata.xml 
> to 
> /home/mi/.m2/repository/org/apache/spark/spark-streaming_2.11/maven-metadata-local.xml
> [INFO] Installing 
> /home/mi/Work/Spark/scala2.11/spark/streaming/target/spark-streaming_2.11-2.1.0-mdh2.1.0.1-SNAPSHOT-tests.jar
>  to 
> /home/mi/.m2/repository/org/apache/spark/spark-streaming_2.11/2.1.0-mdh2.1.0.1-SNAPSHOT/spark-streaming_2.11-2.1.0-mdh2.1.0.1-SNAPSHOT-tests.jar
> [DEBUG] Skipped re-installing 
> /home/mi/Work/Spark/scala2.11/spark/streaming/target/spark-streaming_2.11-2.1.0-mdh2.1.0.1-SNAPSHOT-tests.jar
>  to 
> /home/mi/.m2/repository/org/apache/spark/spark-streaming_2.11/2.1.0-mdh2.1.0.1-SNAPSHOT/spark-streaming_2.11-2.1.0-mdh2.1.0.1-SNAPSHOT-tests.jar,
>  seems unchanged
> {code}
> The reason is that the test jar is attached twice:
> {code:java}
> [DEBUG]   (f) artifact = 
> org.apache.spark:spark-streaming_2.11:jar:2.1.0-mdh2.1.0.1-SNAPSHOT
> [DEBUG]   (f) attachedArtifacts = 
> [org.apache.spark:spark-streaming_2.11:test-jar:tests:2.1.0-mdh2.1.0.1-SNAPSHOT,
>  org.apache.spark:spark-streaming_2.11:jar:tests:2.1.0-mdh2.1.0.1-SNAPSHOT, 
> org.apache.spark:spark
> -streaming_2.11:java-source:sources:2.1.0-mdh2.1.0.1-SNAPSHOT, 
> org.apache.spark:spark-streaming_2.11:java-source:test-sources:2.1.0-mdh2.1.0.1-SNAPSHOT,
>  org.apache.spark:spark-streaming_2.11:javadoc:javadoc:2.1.0
> -mdh2.1.0.1-SNAPSHOT]
> {code}
> When executing 'mvn deploy' to Nexus during a release, the deploy will fail, 
> since artifacts in a release Nexus repository cannot be overwritten.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
