I'm using a pre-built Spark; I'm not trying to compile Spark.
The compile error appears when I try to register HighlyCompressedMapStatus
in my program:
kryo.register(classOf[org.apache.spark.scheduler.HighlyCompressedMapStatus])
If I don't register it, I get a runtime error saying that it needs to be registered with kryo.
Giving a bit more detail on the error would make it a lot easier for others
to help you out. E.g., in this case, it would have helped if you'd included
your actual compile error.
In any case, I'm assuming your issue is because that class is private to
Spark. You can sneak around that by using Class.forName.
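Since the class is package-private to Spark, `classOf[...]` won't compile from user code, but runtime reflection has no such restriction. A minimal sketch of the `Class.forName` approach inside a custom registrator (the class name `MyRegistrator` is illustrative, and this assumes Spark and Kryo are on the classpath):

```scala
import com.esotericsoftware.kryo.Kryo
import org.apache.spark.serializer.KryoRegistrator

class MyRegistrator extends KryoRegistrator {
  override def registerClasses(kryo: Kryo): Unit = {
    // Class.forName resolves the class at runtime, so the compiler
    // never needs access to the private type.
    kryo.register(Class.forName("org.apache.spark.scheduler.HighlyCompressedMapStatus"))
  }
}
```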
The error is in the original post.
Here's the recipe that worked for me:
kryo.register(Class.forName("org.roaringbitmap.RoaringArray$Element"))
kryo.register(classOf[Array[org.roaringbitmap.RoaringArray$Element]])
kryo.register(classOf[Array[Short]])
I'm not sure what you mean. Are you asking how you can recompile all of
Spark and deploy it, instead of using one of the pre-built versions?
https://spark.apache.org/docs/latest/building-spark.html
Or are you seeing compile problems specifically with
HighlyCompressedMapStatus? The code
Does anyone know how to get the HighlyCompressedMapStatus to compile?
I will try turning off kryo in 1.2.0 and hope things don't break. I want
to benefit from the MapOutputTracker fix in 1.2.0.
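Note that rather than turning off kryo entirely, kryo only insists that every class be registered when `spark.kryo.registrationRequired` is enabled; leaving it at its default avoids the registration error while keeping kryo's speed. A hedged sketch of both options via `SparkConf`:

```scala
import org.apache.spark.SparkConf

val conf = new SparkConf()

// Option 1: keep kryo but allow unregistered classes
// (spark.kryo.registrationRequired defaults to false).
conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
conf.set("spark.kryo.registrationRequired", "false")

// Option 2: fall back to Java serialization entirely.
// conf.set("spark.serializer", "org.apache.spark.serializer.JavaSerializer")
```

Unregistered classes cost extra space (kryo writes the full class name per object) but still serialize correctly.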
On Tue, Mar 3, 2015 at 5:41 AM, Imran Rashid iras...@cloudera.com wrote:
The Scala syntax for arrays is Array[T], not T[], so you want to use
something like:
kryo.register(classOf[Array[org.roaringbitmap.RoaringArray$Element]])
kryo.register(classOf[Array[Short]])
Nonetheless, Spark should take care of this itself. I'll look into it
later today.
On Mon, Mar 2, 2015
I think this is a Java vs. Scala syntax issue. Will check.
On Thu, Feb 26, 2015 at 8:17 PM, Arun Luthra arun.lut...@gmail.com wrote:
Problem is noted here: https://issues.apache.org/jira/browse/SPARK-5949
I tried this as a workaround:
import org.apache.spark.scheduler._
import org.roaringbitmap._
...
kryo.register(classOf[org.roaringbitmap.RoaringBitmap])
kryo.register(classOf[org.roaringbitmap.RoaringArray])
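Putting the pieces from this thread together, a complete registrator might look like the sketch below. This is untested against 1.2.0; the `MapStatusRegistrator` name is illustrative, and the RoaringBitmap inner class is registered via reflection since its accessibility from Scala is uncertain:

```scala
import com.esotericsoftware.kryo.Kryo
import org.apache.spark.serializer.KryoRegistrator
import org.roaringbitmap.{RoaringArray, RoaringBitmap}

class MapStatusRegistrator extends KryoRegistrator {
  override def registerClasses(kryo: Kryo): Unit = {
    // Private Spark class: register via reflection.
    kryo.register(Class.forName("org.apache.spark.scheduler.HighlyCompressedMapStatus"))
    // RoaringBitmap internals used by HighlyCompressedMapStatus.
    kryo.register(classOf[RoaringBitmap])
    kryo.register(classOf[RoaringArray])
    val element = Class.forName("org.roaringbitmap.RoaringArray$Element")
    kryo.register(element)
    // Register the array-of-Element class without naming the inner type.
    kryo.register(java.lang.reflect.Array.newInstance(element, 0).getClass)
    kryo.register(classOf[Array[Short]])
  }
}
```

It would then be wired up with `conf.set("spark.kryo.registrator", "MapStatusRegistrator")` (using the fully qualified name if the class lives in a package).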