Do you have the full code example?

I think this would be similar to the mapPartitions code flow, something
like flatMap(_.toList) (or equivalently flatMap(identity)).

I haven't tested this out yet, but that's how I'd try it first.
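A minimal sketch of the idea, using plain Scala collections to mirror what RDD.flatMap does (MyCaseClass here is a stand-in for the actual case class; a Seq stands in for the RDD, since flatMap has the same flattening semantics on both):

```scala
// Stand-in for the user's case class (assumption).
case class MyCaseClass(id: Int)

// Mirrors RDD[Iterable[MyCaseClass]] with a plain Seq.
val nested: Seq[Iterable[MyCaseClass]] =
  Seq(Seq(MyCaseClass(1), MyCaseClass(2)), Seq(MyCaseClass(3)))

// flatMap(_.toList) flattens one level of nesting, yielding the
// equivalent of RDD[MyCaseClass]; flatMap(identity) works the same way.
val flat: Seq[MyCaseClass] = nested.flatMap(_.toList)

println(flat) // List(MyCaseClass(1), MyCaseClass(2), MyCaseClass(3))
```

On an actual RDD the same call applies unchanged — rdd.flatMap(_.toList) — and no new SparkContext is needed; the flattened RDD can then go through toDS() as usual.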

On Sat, 1 Dec 2018 at 01:02, James Starks <suse...@protonmail.com.invalid>
wrote:

> When processing data, I create an instance of RDD[Iterable[MyCaseClass]]
> and I want to convert it to RDD[MyCaseClass] so that it can be further
> converted to a dataset or dataframe with the toDS() function. But I run
> into a problem: a SparkContext cannot be instantiated inside a map
> function because one already exists, even with allowMultipleContexts set
> to true.
>
>     val sc = new SparkConf()
>     sc.set("spark.driver.allowMultipleContexts", "true")
>     new SparkContext(sc).parallelize(seq)
>
> How can I fix this?
>
> Thanks.
>


-- 
Chris
