I tried the following:
dataset.map(new MapFunction<String, List<Map<String, Object>>>() {
    @Override
    public List<Map<String, Object>> call(String input) throws Exception {
        List<Map<String, Object>> temp = new ArrayList<>();
        temp.add(new HashMap<String, Object>());
        return temp;
    }
}, Encoders.kryo(List.class));

This doesn't even compile:

error: no suitable method found for map(<anonymous MapFunction<String,List<Map<String,Object>>>>,Encoder<List>)
    limDS.map(new MapFunction<String, List<Map<String, Object>>>() {
         ^
  method Dataset.<U#1>map(Function1<String,U#1>,Encoder<U#1>) is not applicable
    (cannot infer type-variable(s) U#1
      (argument mismatch; <anonymous MapFunction<String,List<Map<String,Object>>>> cannot be converted to Function1<String,U#1>))
  method Dataset.<U#2>map(MapFunction<String,U#2>,Encoder<U#2>) is not applicable
    (inferred type does not conform to equality constraint(s)
      inferred: List
      equality constraint(s): List, List<Map<String,Object>>)
  where U#1, T, U#2 are type-variables:
    U#1 extends Object declared in method <U#1>map(Function1<T,U#1>,Encoder<U#1>)
    T extends Object declared in class Dataset
    U#2 extends Object declared in method <U#2>map(MapFunction<T,U#2>,Encoder<U#2>)

On Mon, Oct 9, 2017 at 11:48 AM, Koert Kuipers <ko...@tresata.com> wrote:

> if you are willing to use the kryo encoder you can do your original
> Dataset<List<Map<String, Object>>> i think
>
> for example in scala i create here an intermediate Dataset[Any]:
>
> scala> Seq(1,2,3).toDS.map(x => if (x % 2 == 0) x else x.toString)(org.apache.spark.sql.Encoders.kryo[Any]).map{ (x: Any) => x match { case i: Int => i.toString; case s: String => s }}.show
> +-----+
> |value|
> +-----+
> |    1|
> |    2|
> |    3|
> +-----+
>
> On Mon, Oct 9, 2017 at 2:38 PM, kant kodali <kanth...@gmail.com> wrote:
>
>> Hi Koert,
>>
>> Thanks! If I have this Dataset<Seq<Map<String, X>>>, what would be the
>> encoder? Is it Encoders.kryo(Seq.class)?
>>
>> Also, shouldn't List be supported? Should I create a ticket for this?
>>
>> On Mon, Oct 9, 2017 at 6:10 AM, Koert Kuipers <ko...@tresata.com> wrote:
>>
>>> it supports Dataset<Seq<Map<String, X>>> where X must be a supported
>>> type also. Object is not a supported type.
>>>
>>> On Mon, Oct 9, 2017 at 7:36 AM, kant kodali <kanth...@gmail.com> wrote:
>>>
>>>> Hi All,
>>>>
>>>> I am wondering if spark supports Dataset<List<Map<String, Object>>>?
>>>>
>>>> when I do the following it says no map function available?
>>>>
>>>> Dataset<List<Map<String, Object>>> resultDs = ds.map(lambda, Encoders.bean(List.class));
>>>>
>>>> Thanks!
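[Editor's note] The compile error in the first message is a raw-type mismatch, not a missing overload: `Encoders.kryo(List.class)` takes a `Class<List>` and therefore returns `Encoder<List>` (raw `List`), which javac cannot unify with the `List<Map<String, Object>>` produced by the `MapFunction`. A common workaround is an unchecked double cast on the encoder before passing it to `map`. The sketch below uses a stand-in `Encoder` interface and `kryo` factory (not Spark's real ones, which live in `org.apache.spark.sql`) purely to show that the cast satisfies the compiler; with real Spark the same cast applies to `Encoders.kryo(List.class)`.

```java
import java.util.List;
import java.util.Map;

public class EncoderCastSketch {
    // Stand-in for org.apache.spark.sql.Encoder<T>, to illustrate the generics only.
    interface Encoder<T> {}

    // Stand-in for Encoders.kryo(Class<T>): with List.class, T is inferred as raw List.
    static <T> Encoder<T> kryo(Class<T> clazz) {
        return new Encoder<T>() {};
    }

    public static void main(String[] args) {
        // kryo(List.class) is Encoder<List>; the unchecked double cast restores
        // the full parameterized type so Dataset.map(...) could infer U cleanly.
        @SuppressWarnings("unchecked")
        Encoder<List<Map<String, Object>>> enc =
            (Encoder<List<Map<String, Object>>>) (Encoder<?>) kryo(List.class);
        System.out.println(enc != null); // prints true
    }
}
```

With Spark itself, the equivalent would be casting `Encoders.kryo(List.class)` to `Encoder<List<Map<String, Object>>>` via `(Encoder<?>)` in the same way, then passing that encoder to `map`.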
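[Editor's note] Besides kryo, Spark also provides `Encoders.javaSerialization(Class<T>)`, which handles any `java.io.Serializable` payload; `ArrayList` and `HashMap` both qualify, so a `List<Map<String, Object>>` record can round-trip that way (the same raw-type cast as above would still be needed on the encoder). The stdlib-only sketch below just confirms the nested structure survives Java serialization; the Spark calls themselves are omitted since they need a SparkSession.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class SerializableCheck {
    public static void main(String[] args) throws Exception {
        List<Map<String, Object>> rows = new ArrayList<>();
        Map<String, Object> row = new HashMap<>();
        row.put("count", 42);   // heterogeneous values, as in the question
        row.put("name", "abc");
        rows.add(row);

        // Serialize the whole nested structure, mirroring what a
        // java-serialization-based encoder would do per record.
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(rows);
        }

        // Deserialize and read a value back out.
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            @SuppressWarnings("unchecked")
            List<Map<String, Object>> back =
                (List<Map<String, Object>>) in.readObject();
            System.out.println(back.get(0).get("count")); // prints 42
        }
    }
}
```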