I think these two ways are both OK for writing a streaming job. `transform` is a more general way to transform from one DStream to another when there is no related DStream API (but there is a related RDD API), while using `map` may be more straightforward and easier to understand.
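As an illustration of the distinction drawn above, here is a minimal sketch (a hypothetical example, not from the original thread; the socket source and port are assumptions): `map` covers the per-element case directly, while `transform` exposes the underlying RDD of each batch so that RDD-only operations, such as `sortBy`, can still be applied.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object MapVsTransform {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[2]").setAppName("MapVsTransform")
    val ssc  = new StreamingContext(conf, Seconds(1))

    val lines = ssc.socketTextStream("localhost", 9999)

    // map: element-by-element, has a direct DStream API.
    val lengths = lines.map(_.length)

    // transform: operates on the RDD of each batch, so RDD-only
    // operations (here sortBy, which has no DStream equivalent) work too.
    val sorted = lines.transform(rdd => rdd.sortBy(identity))

    lengths.print()
    sorted.print()
    ssc.start()
    ssc.awaitTermination()
  }
}
```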

Thanks
Jerry

From: madhu phatak [mailto:phatak@gmail.com]
Sent: Monday, March 16, 2015 4:32 PM
To: user@spark.apache.org
Subject: MappedStream vs Transform API
Hi,

The current implementation of the map function in Spark Streaming looks as below.

def map[U: ClassTag](mapFunc: T => U): DStream[U] = {
  new MappedDStream(this, context.sparkContext.clean(mapFunc))
}

It creates an instance of MappedDStream, which is a subclass of DStream.
The same function can also be implemented using the transform API.
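The equivalence the question points at can be sketched as follows (a hypothetical fragment, assuming a `lines: DStream[String]` built elsewhere, e.g. from a socket or file source):

```scala
import org.apache.spark.streaming.dstream.DStream

// Assumed to exist: lines: DStream[String]

// Via the dedicated DStream API, which constructs a MappedDStream:
val viaMap: DStream[Int] = lines.map(_.length)

// Via transform, applying the RDD-level map to each batch:
val viaTransform: DStream[Int] = lines.transform(rdd => rdd.map(_.length))
```

Both produce the same per-batch results; `transform` routes through a generic TransformedDStream rather than a MappedDStream.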