Yeah, it looks like this upgrade will require a little fix.
Basically, now they need implicits to WritableFactory instead of an implicit
conversion to Writable.
It will be code compatible as long as people name the types explicitly (like
RDD[(Int, Int)]), but if it's a generic type parameter, I don't think it will
be code compatible.
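Something like this sketch shows the difference (my own illustration, assuming
Spark 1.3's WritableFactory-based implicits; saveConcrete/saveGeneric are names
I made up):

import scala.reflect.ClassTag
import org.apache.spark.WritableFactory
import org.apache.spark.rdd.RDD

// Concrete element types: Spark ships a WritableFactory[Int], so the
// implicit conversion to SequenceFileRDDFunctions resolves on its own.
def saveConcrete(rdd: RDD[(Int, Int)], path: String): Unit =
  rdd.saveAsSequenceFile(path)

// Generic key type: no WritableFactory[K] can be found automatically,
// so the caller has to thread one through as an implicit parameter.
def saveGeneric[K: ClassTag](rdd: RDD[(K, Int)], path: String)(
    implicit kwf: WritableFactory[K]): Unit =
  rdd.saveAsSequenceFile(path)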
Let me read this issue really quick.
This looks like a redundant double contract. Why require implicit
conversions if they already require explicit types? And vice versa.
On Sun, Mar 22, 2015 at 10:17 AM, Pat Ferrel wrote:
> Due to a bug in Spark we have a nasty workaround for Spark 1.2.1.
The error that I'm getting is:
[ERROR]
/home/andy/sandbox/mahout/spark/src/main/scala/org/apache/mahout/sparkbindings/drm/CheckpointedDrmSpark.scala:169:
error: value saveAsSequenceFile is not a member of
org.apache.mahout.sparkbindings.DrmRdd[K]
[INFO] rdd.saveAsSequenceFile(path)
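One possible way around it (a hypothetical sketch, not necessarily the right
fix for Mahout; saveWithTextKeys is an invented name) is to force the keys to
a concrete Writable first, so the 1.3 implicits can resolve:

import scala.reflect.ClassTag
import org.apache.hadoop.io.{Text, Writable}
import org.apache.spark.rdd.RDD

// Map the generic key K to a concrete Text key. (Text, V) are then both
// Writables with ClassTags, so Spark's writableWritableFactory kicks in
// and saveAsSequenceFile compiles again.
def saveWithTextKeys[K, V <: Writable : ClassTag](
    rdd: RDD[(K, V)], path: String): Unit =
  rdd.map { case (k, v) => (new Text(k.toString), v) }
     .saveAsSequenceFile(path)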
Due to a bug in Spark we have a nasty workaround for Spark 1.2.1, so I'm
trying 1.3.0.
However, they have redesigned rdd.saveAsSequenceFile in
SequenceFileRDDFunctions. The class now expects the K and V Writable classes
to be supplied in the constructor:
class SequenceFileRDDFunctions[K <% Writable: ClassTag, V <% Writable: ClassTag](
    self: RDD[(K, V)],
    _keyWritableClass: Class[_ <: Writable],
    _valueWritableClass: Class[_ <: Writable])
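If I'm reading the 1.3 source right, the implicit conversion that builds this
class now pulls everything from WritableFactory instances:

implicit def rddToSequenceFileRDDFunctions[K, V](rdd: RDD[(K, V)])
    (implicit kt: ClassTag[K], vt: ClassTag[V],
              keyWritableFactory: WritableFactory[K],
              valueWritableFactory: WritableFactory[V])
  : SequenceFileRDDFunctions[K, V]

which is why a bare generic K with no factory in scope kills the conversion.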