See https://github.com/sbt/sbt-assembly#merge-strategy
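For reference, a minimal sketch of the two files that address both errors in this thread (the unresolved plugin version and the duplicate META-INF entry). Note the merge strategy only takes effect for the `assembly` task, not plain `package`:

```scala
// project/plugins.sbt
// sbt-assembly 0.13.8 was never released; 0.14.3 resolves fine
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")
```

```scala
// build.sbt (relevant fragment)
// Discard duplicate META-INF entries (e.g. MANIFEST.MF) that trigger
// java.util.zip.ZipException: duplicate entry, keep the first copy of
// everything else
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _                             => MergeStrategy.first
}
```

Then run `sbt assembly` (not `sbt package`) to build the fat jar with the merge strategy applied.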

Best regards,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Fri, Jul 22, 2016 at 4:23 PM, janardhan shetty
<janardhan...@gmail.com> wrote:
> Changed to sbt-assembly 0.14.3 and it gave:
>
> [info] Packaging
> /Users/jshetty/sparkApplications/MainTemplate/target/scala-2.11/maintemplate_2.11-1.0.jar
> ...
> java.util.zip.ZipException: duplicate entry: META-INF/MANIFEST.MF
>     at java.util.zip.ZipOutputStream.putNextEntry(ZipOutputStream.java:233)
>
> Do we need to create an assembly.sbt file inside the project directory? If
> so, what should its contents be for this config?
>
> On Fri, Jul 22, 2016 at 5:42 AM, janardhan shetty <janardhan...@gmail.com>
> wrote:
>>
>> Is the Scala version also the culprit? 2.10 vs 2.11.8
>>
>> Also, can you give the steps to run `sbt package` from within IntelliJ,
>> just like `mvn install`, to create a jar file in the target directory?
>>
>> On Jul 22, 2016 5:16 AM, "Jacek Laskowski" <ja...@japila.pl> wrote:
>>>
>>> Hi,
>>>
>>> There has never been a 0.13.8 release of sbt-assembly AFAIK. Use
>>> 0.14.3 and start over. See
>>>
>>> https://github.com/jaceklaskowski/spark-workshop/tree/master/solutions/spark-external-cluster-manager
>>> for a sample Scala/sbt project with Spark 2.0 RC5.
>>>
>>> Best regards,
>>> Jacek Laskowski
>>> ----
>>> https://medium.com/@jaceklaskowski/
>>> Mastering Apache Spark http://bit.ly/mastering-apache-spark
>>> Follow me at https://twitter.com/jaceklaskowski
>>>
>>>
>>> On Fri, Jul 22, 2016 at 2:08 PM, janardhan shetty
>>> <janardhan...@gmail.com> wrote:
>>> > Hi,
>>> >
>>> > I was setting up my development environment.
>>> >
>>> > Local Mac laptop setup
>>> > IntelliJ IDEA 14CE
>>> > Scala
>>> > Sbt (Not maven)
>>> >
>>> > Error:
>>> > $ sbt package
>>> > [warn]     ::::::::::::::::::::::::::::::::::::::::::::::
>>> > [warn]     ::          UNRESOLVED DEPENDENCIES         ::
>>> > [warn]     ::::::::::::::::::::::::::::::::::::::::::::::
>>> > [warn]     :: com.eed3si9n#sbt-assembly;0.13.8: not found
>>> > [warn]     ::::::::::::::::::::::::::::::::::::::::::::::
>>> > [warn]
>>> > [warn]     Note: Some unresolved dependencies have extra attributes.
>>> > Check
>>> > that these dependencies exist with the requested attributes.
>>> > [warn]         com.eed3si9n:sbt-assembly:0.13.8 (scalaVersion=2.10,
>>> > sbtVersion=0.13)
>>> > [warn]
>>> > [warn]     Note: Unresolved dependencies path:
>>> > [warn]         com.eed3si9n:sbt-assembly:0.13.8 (scalaVersion=2.10,
>>> > sbtVersion=0.13)
>>> >
>>> > (/Users/jshetty/sparkApplications/MainTemplate/project/plugins.sbt#L2-3)
>>> > [warn]           +- default:maintemplate-build:0.1-SNAPSHOT
>>> > (scalaVersion=2.10, sbtVersion=0.13)
>>> > sbt.ResolveException: unresolved dependency:
>>> > com.eed3si9n#sbt-assembly;0.13.8: not found
>>> > sbt.ResolveException: unresolved dependency:
>>> > com.eed3si9n#sbt-assembly;0.13.8: not found
>>> >     at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:291)
>>> >     at
>>> > sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:188)
>>> >     at
>>> > sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:165)
>>> >     at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:155)
>>> >     at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:155)
>>> >     at sbt.IvySbt$$anonfun$withIvy$1.apply(Ivy.scala:132)
>>> >     at sbt.IvySbt.sbt$IvySbt$$action$1(Ivy.scala:57)
>>> >     at sbt.IvySbt$$anon$4.call(Ivy.scala:65)
>>> >     at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:93)
>>> >
>>> >
>>> >
>>> > build.sbt:
>>> >
>>> > name := "MainTemplate"
>>> > version := "1.0"
>>> > scalaVersion := "2.11.8"
>>> > libraryDependencies ++= {
>>> >   val sparkVersion = "2.0.0-preview"
>>> >   Seq(
>>> >     "org.apache.spark" %% "spark-core" % sparkVersion,
>>> >     "org.apache.spark" %% "spark-sql" % sparkVersion,
>>> >     "org.apache.spark" %% "spark-streaming" % sparkVersion,
>>> >     "org.apache.spark" %% "spark-mllib" % sparkVersion
>>> >   )
>>> > }
>>> >
>>> > assemblyMergeStrategy in assembly := {
>>> >   case PathList("META-INF", xs @ _*) => MergeStrategy.discard
>>> >   case x => MergeStrategy.first
>>> > }
>>> >
>>> >
>>> > plugins.sbt
>>> >
>>> > logLevel := Level.Warn
>>> > addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.8")
>>> >
>
>

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org
