Giving a bit more detail on the error would make it a lot easier for others
to help you out. E.g., in this case, it would have helped if you had included
your actual compile error.

In any case, I'm assuming your issue is because that class is private to
Spark. You can sneak around that by using
Class.forName("stringOfClassName") instead:

scala> classOf[org.apache.spark.scheduler.HighlyCompressedMapStatus]
<console>:8: error: class HighlyCompressedMapStatus in package scheduler
cannot be accessed in package org.apache.spark.scheduler
              classOf[org.apache.spark.scheduler.HighlyCompressedMapStatus]
                                                 ^

scala> Class.forName("org.apache.spark.scheduler.HighlyCompressedMapStatus")
res1: Class[_] = class org.apache.spark.scheduler.HighlyCompressedMapStatus
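
If you register it from your own KryoRegistrator, the same trick applies.
Here's a minimal sketch (the registrator class name is hypothetical; only the
Spark and Kryo APIs shown are real):

import com.esotericsoftware.kryo.Kryo
import org.apache.spark.serializer.KryoRegistrator

// Class.forName looks the class up by name at runtime, so it sidesteps the
// package-private access check that makes classOf[...] fail at compile time.
class MyKryoRegistrator extends KryoRegistrator {
  override def registerClasses(kryo: Kryo) {
    kryo.register(
      Class.forName("org.apache.spark.scheduler.HighlyCompressedMapStatus"))
  }
}

Then set spark.kryo.registrator to that class name in your SparkConf.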



hope this helps,
Imran


On Thu, Mar 12, 2015 at 12:47 PM, Arun Luthra <arun.lut...@gmail.com> wrote:

> I'm using a pre-built Spark; I'm not trying to compile Spark.
>
> The compile error appears when I try to register HighlyCompressedMapStatus
> in my program:
>
> kryo.register(classOf[org.apache.spark.scheduler.HighlyCompressedMapStatus])
>
> If I don't register it, I get a runtime error saying that it needs to be
> registered (the error only occurs when Kryo is turned on).
>
> However, the code runs smoothly with Kryo turned off.
>
> On Wed, Mar 11, 2015 at 5:38 PM, Imran Rashid <iras...@cloudera.com>
> wrote:
>
>> I'm not sure what you mean. Are you asking how you can recompile all of
>> Spark and deploy it, instead of using one of the pre-built versions?
>>
>> https://spark.apache.org/docs/latest/building-spark.html
>>
>> Or are you seeing compile problems specifically with
>> HighlyCompressedMapStatus? The code compiles fine, so I'm not sure what
>> problem you are running into -- we'd need a lot more info to help.
>>
>> On Tue, Mar 10, 2015 at 6:54 PM, Arun Luthra <arun.lut...@gmail.com>
>> wrote:
>>
>>> Does anyone know how to get the HighlyCompressedMapStatus registration to compile?
>>>
>>> I will try turning off Kryo in 1.2.0 and hope things don't break. I
>>> want to benefit from the MapOutputTracker fix in 1.2.0.
>>>
>>> On Tue, Mar 3, 2015 at 5:41 AM, Imran Rashid <iras...@cloudera.com>
>>> wrote:
>>>
>>>> The Scala syntax for arrays is Array[T], not T[], so you want to use
>>>> something like:
>>>>
>>>> kryo.register(classOf[Array[org.roaringbitmap.RoaringArray$Element]])
>>>> kryo.register(classOf[Array[Short]])
>>>>
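>>>> For instance, a quick REPL sketch (exact res numbering and output
>>>> formatting may vary):
>>>>
>>>> scala> classOf[Array[Short]]
>>>> res0: Class[Array[Short]] = class [S
>>>>
>>>> which is the same class object Java code would spell as short[].class.
>>>>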
>>>> Nonetheless, Spark should take care of this itself; I'll look into
>>>> it later today.
>>>>
>>>>
>>>> On Mon, Mar 2, 2015 at 2:55 PM, Arun Luthra <arun.lut...@gmail.com>
>>>> wrote:
>>>>
>>>>> I think this is a Java vs. Scala syntax issue. Will check.
>>>>>
>>>>> On Thu, Feb 26, 2015 at 8:17 PM, Arun Luthra <arun.lut...@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> Problem is noted here:
>>>>>> https://issues.apache.org/jira/browse/SPARK-5949
>>>>>>
>>>>>> I tried this as a workaround:
>>>>>>
>>>>>> import org.apache.spark.scheduler._
>>>>>> import org.roaringbitmap._
>>>>>>
>>>>>> ...
>>>>>>
>>>>>>
>>>>>>     kryo.register(classOf[org.roaringbitmap.RoaringBitmap])
>>>>>>     kryo.register(classOf[org.roaringbitmap.RoaringArray])
>>>>>>     kryo.register(classOf[org.roaringbitmap.ArrayContainer])
>>>>>>     kryo.register(classOf[org.apache.spark.scheduler.HighlyCompressedMapStatus])
>>>>>>     kryo.register(classOf[org.roaringbitmap.RoaringArray$Element])
>>>>>>     kryo.register(classOf[org.roaringbitmap.RoaringArray$Element[]])
>>>>>>     kryo.register(classOf[short[]])
>>>>>>
>>>>>>
>>>>>> in build file:
>>>>>>
>>>>>> libraryDependencies += "org.roaringbitmap" % "RoaringBitmap" %
>>>>>> "0.4.8"
>>>>>>
>>>>>>
>>>>>> This fails to compile:
>>>>>>
>>>>>> ...:53: identifier expected but ']' found.
>>>>>>
>>>>>> [error]
>>>>>> kryo.register(classOf[org.roaringbitmap.RoaringArray$Element[]])
>>>>>>
>>>>>> also:
>>>>>>
>>>>>> :54: identifier expected but ']' found.
>>>>>>
>>>>>> [error]     kryo.register(classOf[short[]])
>>>>>> also:
>>>>>>
>>>>>> :51: class HighlyCompressedMapStatus in package scheduler cannot be
>>>>>> accessed in package org.apache.spark.scheduler
>>>>>> [error]
>>>>>> kryo.register(classOf[org.apache.spark.scheduler.HighlyCompressedMapStatus])
>>>>>>
>>>>>>
>>>>>> Suggestions?
>>>>>>
>>>>>> Arun
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>
>
