>>> reRDD.map(row => method1(row, sc)).saveAsTextFile(outputDir)
>>>
>>> Method1 needs the Spark context to query Cassandra, but I see the error below:
>>>
>>> java.io.NotSerializableException: org.apache.spark.SparkContext
>>>
>>> Is there a way we can fix this?
>>>
>>> Thanks
>>>
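The thread itself does not show a fix, but the standard one is worth sketching: a SparkContext lives only on the driver and can never be captured by a closure that runs on executors. Instead, ship something serializable, such as the spark-cassandra-connector's CassandraConnector, and open one session per partition inside mapPartitions. In this sketch, `reRDD`, the `id` field, the keyspace/table/column names, and the lookup standing in for `method1(row, sc)` are all placeholder assumptions:

```scala
import com.datastax.spark.connector.cql.CassandraConnector

// CassandraConnector is serializable (unlike SparkContext), so executor-side
// closures can capture it and open their own Cassandra sessions.
val connector = CassandraConnector(sc.getConf)

reRDD.mapPartitions { rows =>
  connector.withSessionDo { session =>
    rows.map { row =>
      // Hypothetical per-row lookup replacing method1(row, sc); keyspace,
      // table, and the `id` field are placeholders.
      val rs = session.execute(
        "SELECT value FROM my_keyspace.my_table WHERE id = ?", row.id)
      (row, Option(rs.one()).map(_.getString("value")))
    }.toList // materialize before withSessionDo closes the session;
             // a lazy iterator would outlive it
  }.iterator
}.saveAsTextFile(outputDir)
```

One session is opened per partition rather than per row, which keeps the connection overhead proportional to the number of partitions, not the number of records.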
>> --------------
>> If you reply to this email, your message will be added to the discussion
>> below:
>>
>> http://apache-spark-user-list.1001560.n3.nabble.com/using-spark-context-in-map-funciton-TASk-not-serilizable-error-tp25998.html
Is there a way we can fix this?
Thanks
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/using-spark-context-in-map-funciton-TASk-not-serilizable-error-tp25998.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
On Mon, Jan 18, 2016 at 12:29 PM, gpatcham <gpatc...@gmail.com> wrote:
>>>>>>
>>>>>> Hi,
>>>>>>
>>>>>> I have a use case where I need to pass sparkcontext in map function
>>>>>>
>>>>>> reRDD.map(row => method1(row, sc)).saveAsTextFile(outputDir)
>>>>>>
>>>>>> Method1 needs spark context to query cassandra. But I see below error
>>>>>>
>>>>>> java.io.NotSerializableException: org.apache.spark.SparkContext
>>>>>>
>>>>>> Is there a way we can fix this?
>>>>>>
>>>>>> Thanks
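When the per-row query is a simple key lookup, an alternative sketch (again assuming the spark-cassandra-connector, with `reRDD`, the `id` field, and keyspace/table names as placeholders) avoids manual session handling entirely: key the RDD and let `joinWithCassandraTable` fetch the matching rows, so no SparkContext is ever referenced on executors.

```scala
import com.datastax.spark.connector._

// Key the RDD by the Cassandra partition key, then let the connector pull
// the matching rows for each key on the executors themselves.
// `reRDD`, the `id` field, and the keyspace/table names are placeholders.
reRDD
  .map(row => Tuple1(row.id))
  .joinWithCassandraTable("my_keyspace", "my_table")
  .saveAsTextFile(outputDir)
```

This pushes the lookup into the connector, which batches and parallelizes the reads per partition, and it sidesteps the serialization problem by construction.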