Update:
The issue in my previous post was solved:
I had to change the sbt file name from project_name.sbt to build.sbt.
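For reference, a minimal `build.sbt` along these lines might look as follows (the project name, Scala version, and library versions are assumptions; adjust them to your setup):

```scala
// build.sbt — a minimal sketch; names and version numbers are assumptions
name := "my-app"

version := "0.1"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-mllib" % "1.0.2",
  "com.github.scopt" %% "scopt" % "3.2.0"
)
```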
Thanks!
-Caron
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/scopt-OptionParser-tp8436p20581.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
I was using `sbt package` when I got this error. Switching to `sbt assembly` solved the issue. To run `sbt assembly`, you need a file called `plugins.sbt` in the `project/` directory under the project root, containing the following line:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")
Thanks for posting the solution! You can also append `% provided` to
the `spark-mllib` dependency line and remove `spark-core` (because
spark-mllib already depends on spark-core) to make the assembly jar
smaller. -Xiangrui
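Concretely, the dependency section of `build.sbt` would then reduce to a single line along these lines (the version number is an assumption):

```scala
// build.sbt fragment — a sketch; the version number is an assumption.
// spark-mllib pulls in spark-core transitively, so a separate spark-core
// line is unnecessary, and the "provided" scope keeps Spark itself out
// of the assembly jar (the cluster supplies it at runtime).
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.0.2" % "provided"
```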
On Fri, Aug 8, 2014 at 10:05 AM, SK &lt;skrishna...@gmail.com&gt; wrote:
Hi,
I tried to develop some code that uses Logistic Regression, following BinaryClassification.scala in examples/mllib. My code compiles, but at runtime it fails with an error saying the scopt/OptionParser class cannot be found. I have the following import statement in my code:
import scopt.OptionParser
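Since the class is missing at runtime rather than at compile time, scopt has to be on the runtime classpath, not just the compile classpath. A sketch of the relevant `build.sbt` dependency line (the version number is an assumption):

```scala
// build.sbt fragment — scopt must be packaged with the application
// (e.g. in an assembly jar), because spark-submit does not provide it
libraryDependencies += "com.github.scopt" %% "scopt" % "3.2.0"
```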