I recently started using Spark 1.3.0 in standalone mode (with Scala
2.10.3), and I ran into an odd problem. I was loading data from a file
with sc.textFile, doing some conversion of the data, and then clustering
it. With a small file (10 lines, 9 KB) it worked fine. It turned out I was
building with sbt, and I found that I had actually specified Spark 0.9.1
there. Once I upgraded my sbt config to use 1.3.0, and Scala to 2.10.4,
the problem went away.
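
For reference, the relevant lines in the corrected sbt config would look
roughly like this (a sketch assuming a standard build.sbt; the
organization/artifact names are the usual Spark coordinates, and
"provided" scope is one common choice for standalone deployments):

```scala
// build.sbt -- sketch of the corrected dependency settings
name := "my-spark-job"        // hypothetical project name

scalaVersion := "2.10.4"      // matches the Scala version Spark 1.3.0 was built for

// %% appends the Scala binary version (_2.10) to the artifact name
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0" % "provided"
```

Mixing an old Spark artifact (0.9.1) into a cluster running 1.3.0 can fail
in confusing ways at runtime rather than at compile time, which is why the
symptom only showed up once real work was submitted.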
Michael