The following ticket:
https://issues.apache.org/jira/browse/SPARK-1812
for supporting Scala 2.11 has been marked as fixed in 1.2,
but the docs on the Spark site still say that 2.10 is required.
Thanks,
Jon
Hi all,
TLDR: running Spark locally through the IntelliJ IDEA Scala Console results
in a java.lang.ClassNotFoundException
Long version:
I'm an algorithms developer at SupersonicAds, an ad network. We are
building a major new big data project, and we are now in the process of
selecting our tech stack.
outputStrategy := Some(StdoutOutput)

console := {
  (runMain in Compile).toTask(" org.apache.spark.repl.Main -usejavacp").value
}
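For context, here is a fuller sketch of how that console override might sit in a complete build.sbt. This is a hypothetical configuration, not the exact one from the thread: the Spark version and sbt 0.13-era syntax are assumptions, and the spark-repl dependency is only needed so that org.apache.spark.repl.Main is on the classpath.

```scala
// Hypothetical build.sbt sketch (sbt 0.13-era syntax); the Spark version
// below is an assumption, not taken from the thread.
name := "spark-console-sketch"

scalaVersion := "2.10.4"

// spark-repl provides org.apache.spark.repl.Main for the console override.
libraryDependencies += "org.apache.spark" %% "spark-repl" % "1.2.0"

// Send forked run output straight to stdout so the REPL is interactive.
outputStrategy := Some(StdoutOutput)

// Replace `sbt console` with the Spark REPL; -usejavacp tells the REPL to
// use the JVM classpath rather than starting with an empty one, which is
// one way to avoid the ClassNotFoundException described above.
console := {
  (runMain in Compile).toTask(" org.apache.spark.repl.Main -usejavacp").value
}
```

With this in place, `sbt console` would launch the Spark REPL instead of the plain Scala REPL.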
On Sat, Apr 26, 2014 at 1:05 PM, Jonathan Chayat
jonatha...@supersonicads.com wrote:
Hi Michael, thanks for your prompt reply.
It seems like IntelliJ Scala Console actually runs the Scala