if you are willing to use a kryo encoder i think you can keep your original
Dataset<List<Map<String,Object>>>

for example, in scala i create an intermediate Dataset[Any] here:

scala> Seq(1,2,3).toDS.map(x => if (x % 2 == 0) x else
x.toString)(org.apache.spark.sql.Encoders.kryo[Any]).map{ (x: Any) => x
match { case i: Int => i.toString; case s: String => s }}.show
+-----+
|value|
+-----+
|    1|
|    2|
|    3|
+-----+
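the second map above just pattern-matches the Any values back to a single
type. here is that step on its own as a plain-scala sketch (no spark needed);
the mixed Seq stands in for what the kryo-encoded Dataset[Any] would hold
after the first map:

```scala
// After the kryo step the collection holds values of mixed runtime types.
// Evens stayed Int, odds became String (mirroring the example above).
val mixed: Seq[Any] = Seq("1", 2, "3")

// Recover a single concrete type with a pattern match, as in the second map.
val normalized: Seq[String] = mixed.map {
  case i: Int    => i.toString
  case s: String => s
}
// normalized == Seq("1", "2", "3")
```

the same match would fail at runtime with a MatchError if an unexpected type
slipped in, so in real code you may want a catch-all case.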

On Mon, Oct 9, 2017 at 2:38 PM, kant kodali <kanth...@gmail.com> wrote:

> Hi Koert,
>
> Thanks! If I have this Dataset<Seq<Map<String, X>>>, what would be the
> encoder? Is it Encoders.kryo(Seq.class)?
>
> Also shouldn't List be supported? Should I create a ticket for this?
>
>
> On Mon, Oct 9, 2017 at 6:10 AM, Koert Kuipers <ko...@tresata.com> wrote:
>
>> it supports Dataset<Seq<Map<String, X>>> where X must also be a supported
>> type. Object is not a supported type.
>>
>> On Mon, Oct 9, 2017 at 7:36 AM, kant kodali <kanth...@gmail.com> wrote:
>>
>>> Hi All,
>>>
>>> I am wondering if spark supports Dataset<List<Map<String,Object>>> ?
>>>
>>> when I do the following, it says no map function is available:
>>>
>>> Dataset<List<Map<String,Object>>> resultDs = ds.map(lambda,
>>> Encoders.bean(List.class));
>>>
>>> Thanks!
>>>
>>>
>>>
>>
>
