sam created SPARK-4491:
--------------------------

             Summary: Using sbt assembly with spark as dep requires Phd in sbt
                 Key: SPARK-4491
                 URL: https://issues.apache.org/jira/browse/SPARK-4491
             Project: Spark
          Issue Type: Question
            Reporter: sam


I get the dreaded deduplicate error from sbt. I resolved the issue (I think; at 
least I managed to run the SimpleApp example) as described here: 
http://stackoverflow.com/a/27018691/1586965

My question is: is this wise? What is wrong with changing the `deduplicate` 
strategy to `first`? Why isn't that the default?
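For reference, the kind of override I mean looks roughly like this. This is a 
sketch, not taken from my actual build file; the key name is 
`assemblyMergeStrategy` in recent sbt-assembly releases, while older releases 
spell it `mergeStrategy in assembly`, so adjust for your plugin version:

```scala
// build.sbt -- sketch: force every file-level conflict to resolve by
// keeping the first copy found on the classpath, instead of failing
// with sbt-assembly's default "deduplicate" strategy.
assemblyMergeStrategy in assembly := {
  case _ => MergeStrategy.first
}
```

The obvious worry is that blindly taking the first copy could silently pick 
the wrong version of a conflicting file, which may be why it is not the 
default.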

If this isn't the right way to make it work, could someone please explain the 
correct way, with .sbt examples? Having googled, every example I see is 
different, because it changes depending on which dependencies the person has 
... surely there has to be an automagic way of doing it (if my way isn't it)?
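For comparison, the approach I most often see recommended is to keep Spark out 
of the assembly entirely by marking it as "provided" (it is already on the 
cluster at runtime), and to discard conflicting META-INF metadata rather than 
deduplicating it. A sketch, with assumed version numbers and the newer 
`assemblyMergeStrategy` key name:

```scala
// build.sbt -- sketch of the commonly recommended setup.
// Spark is "provided": it never enters the fat jar, which avoids
// most of the deduplicate conflicts in the first place.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0" % "provided"

assemblyMergeStrategy in assembly := {
  // META-INF signatures/manifests conflict across jars and are
  // not needed in an assembly, so drop them.
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  // For everything else, keep the first copy found.
  case _                             => MergeStrategy.first
}
```

If that is indeed the intended pattern, it would be good to have it spelled 
out somewhere authoritative.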

One final point: SBT seems to blame Spark for causing the problem in the 
sbt-assembly documentation: https://github.com/sbt/sbt-assembly. Is this fair? 
Is Spark doing something wrong in the way it builds its jars? Or should SBT be 
renamed CBT (Complicated Build Tool that will make you need Cognitive 
Behavioural Therapy after use)?

NOTE: Satire JFF, really I love both SBT & Spark :)



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
