On Fri, Apr 8, 2016 at 4:22 PM, Holden Karau <hol...@pigscanfly.ca> wrote:
> It seems like the union function on RDDs might be what you are looking
> for, or was there something else you were trying to achieve?
>
>
> On Thursday, April 7, 2016, Tenghuan He <tenghua...@gmail.com> wrote:
>
> On Thu, Apr 7, 2016 at 5:52 AM, Tenghuan He <tenghua...@gmail.com> wrote:
> > Hi all,
> >
> > I want to create an empty rdd and partition it
> >
> > val buffer: RDD[(K, (V, Int))] = base.context.emptyRDD[(K, (V,
> > Int))].partitionBy(new HashPartitioner(5))
Hi all,
I want to create an empty rdd and partition it
val buffer: RDD[(K, (V, Int))] = base.context.emptyRDD[(K, (V,
Int))].partitionBy(new HashPartitioner(5))
but got Error: No ClassTag available for K
scala needs at runtime to have information about K , but how to solve this?
Thanks in advance
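The usual fix for "No ClassTag available for K" is to add a ClassTag context bound on K (and V) wherever the generic method or class is declared, so the compiler can pass the tag through to `emptyRDD`. Here is a minimal plain-Scala sketch of the same mechanism; `KeyBuffer` is a hypothetical stand-in, not Spark API:

```scala
import scala.reflect.ClassTag

// Without the `K: ClassTag` context bound, `Array.empty[K]` below fails
// to compile with the same error: "No ClassTag available for K".
class KeyBuffer[K: ClassTag] {
  var keys: Array[K] = Array.empty[K]
  def add(k: K): Unit = { keys = keys :+ k }
}

val kb = new KeyBuffer[String]
kb.add("a")
kb.add("b")
```

In the Spark code above, the same idea would mean declaring the enclosing class or method as `[K: ClassTag, V: ClassTag]` (or taking an implicit `ClassTag[K]` parameter) so that `emptyRDD[(K, (V, Int))]` compiles.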
On Mon, Mar 28, 2016 at 11:01 AM, Tenghuan He <tenghua...@gmail.com> wrote:
> Thanks very much Ted
>
> I added MyRDD.scala to the spark source code and rebuilt the whole spark
> project, using myrdd.asInstanceOf[MyRDD] doesn't work. It seems that MyRDD
> is not exposed to the spark-shell.
Hi Wenchao,
I used the steps described in the page and it works great; you can give it a try :)
http://danielnee.com/2015/01/setting-up-intellij-for-spark/
On Mon, Mar 28, 2016 at 9:38 AM, 吴文超 wrote:
> for the simplest word count,
> val wordCounts = textFile.flatMap(line =>
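The quoted line is cut off; the standard word count from the Spark quick start continues by splitting each line, mapping words to (word, 1) pairs, and reducing by key. A plain-Scala-collections sketch of the same chain (a local Seq stands in for the RDD, and groupBy stands in for reduceByKey):

```scala
// Local stand-in for an RDD of lines.
val textFile = Seq("to be or", "not to be")

// Same shape as the Spark version:
// textFile.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _)
val wordCounts = textFile
  .flatMap(line => line.split(" "))
  .map(word => (word, 1))
  .groupBy(_._1)
  .map { case (word, pairs) => (word, pairs.size) }
```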
> You can extend RDD and include your custom logic in the subclass.
>
> On Sun, Mar 27, 2016 at 10:14 AM, Tenghuan He <tenghua...@gmail.com>
> wrote:
>
>> Thanks Ted,
>>
>> but I have a doubt that as the code above (line 4) in the spark-shell
>>
>> ...declare MyRDD as the return type.
>> Or, you can cast myrdd as MyRDD in spark-shell.
>>
>> BTW I don't think it is good practice to add custom method to base RDD.
>>
>> On Sun, Mar 27, 2016 at 9:44 AM, Tenghuan He <tenghua...@gmail.com>
>> wrote:
>>
> ...intended to declare MyRDD as the return type.
> Or, you can cast myrdd as MyRDD in spark-shell.
>
> BTW I don't think it is good practice to add custom method to base RDD.
>
> On Sun, Mar 27, 2016 at 9:44 AM, Tenghuan He <tenghua...@gmail.com> wrote:
>
>> Hi
> ...your MyRDD ?
>
> Thanks
>
> On Sun, Mar 27, 2016 at 9:22 AM, Tenghuan He <tenghua...@gmail.com> wrote:
>
>> Hi everyone,
>>
>> I am creating a custom RDD which extends RDD and adds a custom method;
>> however, the custom method cannot be found.
>>
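Ted's two suggestions come down to static typing: in the shell the value's declared type is the base RDD, so subclass-only methods are invisible until you either declare the subclass as the type or cast. A plain-Scala sketch of that (BaseRDD and MyRDD here are hypothetical stand-ins, not the real Spark classes):

```scala
class BaseRDD
class MyRDD extends BaseRDD {
  def myCustomMethod(): String = "custom"
}

// Static type is BaseRDD, so myCustomMethod is not visible:
val asBase: BaseRDD = new MyRDD
// asBase.myCustomMethod()  // would not compile

// The cast Ted suggests restores access to the subclass method.
val asMy = asBase.asInstanceOf[MyRDD]
```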
Hi everyone,
I am creating a custom RDD which extends RDD and adds a custom method;
however, the custom method cannot be found.
The custom RDD looks like the following:
class MyRDD[K, V](
    var base: RDD[(K, V)],
    part: Partitioner
  ) extends RDD[(K, V)](base.context, Nil) {
  def ...
Do I have to rebuild the whole spark project instead of just the spark-core
submodule to make the changes work?
Rebuilding the whole project is too time consuming; is there any better
choice?
Thanks & Best Regards
Tenghuan He
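On the rebuild question: the Spark build documentation describes building a single sub-module with Maven's `-pl` option, which avoids recompiling the whole project. A sketch of that invocation; the exact module name and Scala-version suffix (`spark-core_2.10` here) depend on your branch, so treat them as assumptions:

```shell
# Build and locally install only the core module, skipping tests.
# Run from the root of the Spark source tree.
./build/mvn -pl :spark-core_2.10 -DskipTests clean install
```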
> schema: org.apache.spark.sql.types.StructType =
> StructType(StructField(A,StringType,true), StructField(B,StringType,true),
> StructField(C,StringType,true), StructField(num,IntegerType,false))
>
> scala> val rdd1 = rdd0.filter(r => !idList.contains(r(3)))
> rdd1: org.
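The truncated transcript filters out rows whose fourth column (`num`, index 3) appears in `idList`. A plain-Scala sketch of the same predicate; a `Seq[String]` stands in for the Row, and the data is made up:

```scala
val idList = Seq("7", "9")

// Each inner Seq plays the role of a Row with columns A, B, C, num.
val rdd0 = Seq(
  Seq("a1", "b1", "c1", "7"),
  Seq("a2", "b2", "c2", "8")
)

// Keep rows whose fourth field (index 3) is NOT in idList.
val rdd1 = rdd0.filter(r => !idList.contains(r(3)))
```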