The canonical build of Spark is done using Maven, not sbt, and Maven and
sbt do things a bit differently. To get Maven and sbt to each build Spark
in much the same way the other does, both builds are driven through a
customization script -- build/mvn and build/sbt respectively. A lot of
work has gone into those scripts and their configurations. They work.
Using an arbitrary sbt version outside of the defined Spark build process
is going to lead to a lot of issues that are completely avoided by using
the documented Spark build procedure. So, back to my original question:
Why? Why do something that doesn't work when there is a well-defined,
well-maintained, documented way to build Spark with either Maven or sbt?
https://spark.apache.org/docs/latest/building-spark.html
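
For reference, the documented entry points look roughly like this (the
exact profiles and flags depend on your Hadoop version and are described
on the page above):

    # from the top of a Spark source checkout
    ./build/mvn -DskipTests clean package    # the canonical Maven build
    ./build/sbt package                      # the sbt build, via the wrapper

Both wrappers download and use the exact build-tool versions that the
Spark build expects -- which is precisely what invoking a locally
installed sbt does not do.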

On Mon, Oct 16, 2017 at 12:26 AM, patel kumar <patel.kumar...@gmail.com>
wrote:

>  This is not the correct way to build Spark with sbt. Why?
>
>
> On Sun, Oct 15, 2017 at 11:54 PM, Mark Hamstra <m...@clearstorydata.com>
> wrote:
>
>> I am building Spark using build.sbt.
>>
>>
>> Which just gets me back to my original question: Why? This is not the
>> correct way to build Spark with sbt.
>>
>> On Sun, Oct 15, 2017 at 11:40 PM, patel kumar <patel.kumar...@gmail.com>
>> wrote:
>>
>>> I am building Spark using build.sbt. Details are mentioned in the
>>> original mail chain.
>>>
>>> When I was using Spark 1.6 with Scala 2.10, everything worked fine.
>>>
>>> The issue arose when I updated my code to make it compatible with Spark
>>> 2.1.
>>>
>>> It fails while running sbt assembly.
>>>
>>>
>>>
>>> On Sun, Oct 15, 2017 at 11:29 PM, Mark Hamstra <m...@clearstorydata.com>
>>> wrote:
>>>
>>>> I don't understand. Are you building Spark or something else?
>>>>
>>>> If you are trying to build Spark, it doesn't look like you are doing it
>>>> the right way. As I mentioned before, and as is explained in detail in
>>>> the link I provided, building Spark with sbt is done via build/sbt, not
>>>> by directly invoking your choice of an sbt version.
>>>>
>>>> On Sun, Oct 15, 2017 at 11:26 PM, patel kumar <patel.kumar...@gmail.com> wrote:
>>>>
>>>>> Because earlier I was using sbt 0.13.13.1, and I was getting a different
>>>>> version conflict, i.e.:
>>>>>
>>>>> [error] Modules were resolved with conflicting cross-version suffixes in {file:/D:/Tools/scala_ide/test_workspace/test/NewSparkTest/}newsparktest:
>>>>> [error]    org.json4s:json4s-ast _2.10, _2.11
>>>>> [error]    org.json4s:json4s-core _2.10, _2.11
>>>>> [trace] Stack trace suppressed: run last *:update for the full output.
>>>>> [error] (*:update) Conflicting cross-version suffixes in: org.json4s:json4s-ast, org.json4s:json4s-core
>>>>>
>>>>>
>>>>> Actually, the issue comes from how sbt itself is compiled: sbt 0.13.x
>>>>> is compiled with Scala 2.10 and adds cross-build support for 2.11.x.
>>>>>
>>>>> When I run sbt, the default jars are downloaded with the suffix _2.10
>>>>> (for Scala 2.10). When I run assembly, json4s jars for 2.11 are
>>>>> downloaded as well.
>>>>>
>>>>> I upgraded sbt in the hope that the new version might fix this issue.
>>>>>
>>>>> sbt 1.0.2 introduced the same problem for the XML-related jars, since
>>>>> sbt 1.0.2 is compiled with Scala 2.12.
>>>>>
>>>>> I looked for an sbt version that would download 2.11.x jars by default,
>>>>> but could not find one.
>>>>>
>>>>> So now I am looking for a way to override the jars that sbt downloads
>>>>> by default.
>>>>>
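>>>>> What I have in mind is something along these lines (an untested
>>>>> sketch; the json4s version here is a guess, not taken from the Spark
>>>>> POM):
>>>>>
>>>>>     // exclude the transitively resolved json4s and pin the _2.11 builds
>>>>>     libraryDependencies += ("org.apache.spark" % "spark-core_2.11" % "2.1.1" % "provided")
>>>>>       .excludeAll(ExclusionRule(organization = "org.json4s"))
>>>>>     libraryDependencies += "org.json4s" % "json4s-ast_2.11" % "3.2.11"
>>>>>     libraryDependencies += "org.json4s" % "json4s-core_2.11" % "3.2.11"
>>>>>
>>>>> But that would only help if the _2.10 copies really arrive
>>>>> transitively. I also wonder whether the sbtPlugin := true line in my
>>>>> build.sbt (quoted below) is part of this, since that ties the project
>>>>> to sbt's own Scala version.
>>>>>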
>>>>>
>>>>> On Sun, Oct 15, 2017 at 11:03 PM, Mark Hamstra <m...@clearstorydata.com> wrote:
>>>>>
>>>>>> sbt version is 1.0.2.
>>>>>>
>>>>>>
>>>>>> Why?
>>>>>>
>>>>>> Building Spark with sbt is done via build/sbt, which will give you
>>>>>> sbt 0.13.11 when building Spark 2.1.0.
>>>>>>
>>>>>> https://spark.apache.org/docs/2.1.0/building-spark.html#building-with-sbt
>>>>>>
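>>>>>> For example, from a Spark 2.1.0 source checkout (the wrapper
>>>>>> bootstraps the pinned sbt launcher on first use):
>>>>>>
>>>>>>     ./build/sbt sbtVersion   # should report the sbt the build pins (0.13.11 here)
>>>>>>     ./build/sbt package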
>>>>>>
>>>>>> On Sun, Oct 15, 2017 at 10:42 PM, patel kumar <patel.kumar...@gmail.com> wrote:
>>>>>>
>>>>>>> Hi,
>>>>>>>
>>>>>>> I am using a CDH cluster with Spark 2.1 and Scala version 2.11.8.
>>>>>>> The sbt version is 1.0.2.
>>>>>>>
>>>>>>> While doing assembly, I am getting this error:
>>>>>>> [error] java.lang.RuntimeException: Conflicting cross-version suffixes in: org.scala-lang.modules:scala-xml, org.scala-lang.modules:scala-parser-combinators
>>>>>>>
>>>>>>> I tried to override the version mismatch using dependencyOverrides
>>>>>>> and force(), but neither worked.
>>>>>>>
>>>>>>> Please help me resolve this version conflict.
>>>>>>>
>>>>>>> Details of the configuration are mentioned below :-
>>>>>>>
>>>>>>> build.sbt
>>>>>>>
>>>>>>> ***********************************************************************
>>>>>>> name := "newtest"
>>>>>>> version := "0.0.2"
>>>>>>>
>>>>>>> scalaVersion := "2.11.8"
>>>>>>>
>>>>>>> sbtPlugin := true
>>>>>>>
>>>>>>> val sparkVersion = "2.1.0"
>>>>>>>
>>>>>>> mainClass in (Compile, run) := Some("com.testpackage.sq.newsparktest")
>>>>>>>
>>>>>>> assemblyJarName in assembly := "newtest.jar"
>>>>>>>
>>>>>>>
>>>>>>> libraryDependencies ++= Seq(
>>>>>>>   "org.apache.spark" % "spark-core_2.11" % "2.1.1" % "provided",
>>>>>>>   "org.apache.spark" % "spark-sql_2.11" % "2.1.1" % "provided",
>>>>>>>   "com.databricks" % "spark-avro_2.11" % "3.2.0",
>>>>>>>   "org.apache.spark" % "spark-hive_2.11" % "2.1.1" % "provided"
>>>>>>> )
>>>>>>>
>>>>>>>
>>>>>>> libraryDependencies +=
>>>>>>>      "log4j" % "log4j" % "1.2.15" excludeAll(
>>>>>>>        ExclusionRule(organization = "com.sun.jdmk"),
>>>>>>>        ExclusionRule(organization = "com.sun.jmx"),
>>>>>>>        ExclusionRule(organization = "javax.jms")
>>>>>>>      )
>>>>>>>
>>>>>>> resolvers += "SparkPackages" at "https://dl.bintray.com/spark-
>>>>>>> packages/maven/"
>>>>>>> resolvers += Resolver.url("bintray-sbt-plugins", url("
>>>>>>> http://dl.bintray.com/sbt/sbt-plugin-releases";))(Resolv
>>>>>>> er.ivyStylePatterns)
>>>>>>>
>>>>>>> assemblyMergeStrategy in assembly := {
>>>>>>>   case PathList("META-INF", xs @ _*) => MergeStrategy.discard
>>>>>>>   case x => MergeStrategy.first
>>>>>>> }
>>>>>>>
>>>>>>> ***************************************************
>>>>>>> plugins.sbt
>>>>>>>
>>>>>>> ************************************
>>>>>>>
>>>>>>> dependencyOverrides += ("org.scala-lang.modules" % "scala-xml_2.11" % "1.0.4")
>>>>>>> dependencyOverrides += ("org.scala-lang.modules" % "scala-parser-combinators_2.11" % "1.0.4")
>>>>>>> addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")
>>>>>>> resolvers += Resolver.url("bintray-sbt-plugins", url("https://dl.bintray.com/eed3si9n/sbt-plugins/"))(Resolver.ivyStylePatterns)
>>>>>>>
>>>>>>> *****************************************************
>>>>>>> Error message after assembly
>>>>>>>
>>>>>>> **************************************************
>>>>>>> [error] Modules were resolved with conflicting cross-version suffixes in {file:/D:/Tools/scala_ide/test_workspace/test/NewSparkTest/}newsparktest:
>>>>>>> [error]    org.scala-lang.modules:scala-xml _2.11, _2.12
>>>>>>> [error]    org.scala-lang.modules:scala-parser-combinators _2.11, _2.12
>>>>>>> [error] java.lang.RuntimeException: Conflicting cross-version suffixes in: org.scala-lang.modules:scala-xml, org.scala-lang.modules:scala-parser-combinators
>>>>>>> [error]         at scala.sys.package$.error(package.scala:27)
>>>>>>> [error]         at sbt.librarymanagement.ConflictWarning$.processCrossVersioned(ConflictWarning.scala:39)
>>>>>>> [error]         at sbt.librarymanagement.ConflictWarning$.apply(ConflictWarning.scala:19)
>>>>>>> [error]         at sbt.Classpaths$.$anonfun$ivyBaseSettings$64(Defaults.scala:1971)
>>>>>>> [error]         at scala.Function1.$anonfun$compose$1(Function1.scala:44)
>>>>>>> [error]         at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:42)
>>>>>>> [error]         at sbt.std.Transform$$anon$4.work(System.scala:64)
>>>>>>> [error]         at sbt.Execute.$anonfun$submit$2(Execute.scala:257)
>>>>>>> [error]         at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:16)
>>>>>>> [error]         at sbt.Execute.work(Execute.scala:266)
>>>>>>> [error]         at sbt.Execute.$anonfun$submit$1(Execute.scala:257)
>>>>>>> [error]         at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:167)
>>>>>>> [error]         at sbt.CompletionService$$anon$2.call(CompletionService.scala:32)
>>>>>>> [error]         at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>>>>>>> [error]         at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>>>>>>> [error]         at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>>>>>>> [error]         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>>>>>>> [error]         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>>>>>>> [error]         at java.lang.Thread.run(Thread.java:748)
>>>>>>> [error] (*:update) Conflicting cross-version suffixes in: org.scala-lang.modules:scala-xml, org.scala-lang.modules:scala-parser-combinators
>>>>>>> [error] Total time: 413 s, completed Oct 12, 2017 3:28:02 AM
>>>>>>>
>>>>>>> **************************************************
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>
