Unable to get results of intermediate dataset
Hi Team,

I am new to Spark. My requirement: I have a huge list which is converted to a Spark Dataset. I need to operate on this Dataset, store the computed values in another object/Dataset, and keep them in memory for further processing.

The approach I tried: the list is retrieved from a third party in a loop. I convert this list to a Dataset and, using a function, I try to iterate over it and store the results in another Dataset.

The problem I am facing: I am not able to see any data in the newly computed Dataset.

Kindly help me sort out this issue, and please let me know if there is a better approach.

Sample code:

    class Person implements Serializable {
        private static final long serialVersionUID = 1L;
        private String name;
        private PersonId id;
        // getters and setters
    }

    class PersonId {
        private int deptId;
        // getters and setters
    }

    class PersonDetails implements Serializable {
        private static final long serialVersionUID = 1L;
        private int deptId;
        private BigDecimal sal;
        private String name;
        // getters and setters
    }

In another class I have the below template code:

    List<PersonDetails> personDtlsList = new ArrayList<>();
    final Encoder<Person> encoder = Encoders.bean(Person.class);
    final Encoder<PersonDetails> personDtlsEncoder = Encoders.bean(PersonDetails.class);

    // here I hit the third-party interface and get person information in a list
    List<Person> personList = getPersonInformation(/* passing few parameters */);
    Dataset<Person> personDS = sqlContext.createDataset(personList, encoder);
    Dataset<PersonDetails> personDtlsDS = sqlContext.createDataset(personDtlsList, personDtlsEncoder);

    JavaRDD<PersonDetails> personDtlsRDD = personDS.toDF().toJavaRDD().map(new Function<Row, PersonDetails>() {
        private static final long serialVersionUID = 2L;
        @Override
        public PersonDetails call(Row row) throws Exception {
            PersonDetails personDetails = new PersonDetails();
            // setters for personDetails - name, sal and others
            personDetails.setName(row.getString(0));
            personDetails.setSal(new BigDecimal(1));
            personDtlsDS.union(sqlContext.createDataset(
                    new ArrayList<PersonDetails>() {{ add(personDetails); }}, personDtlsEncoder));
            return personDetails;
        }
    });
    personDtlsDS.count();

Regards,
Sunitha
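A note on the snippet above: `Dataset.union` returns a new Dataset rather than modifying `personDtlsDS` in place, and the call runs inside a mapper on an executor, so its result is simply discarded. In addition, `personDtlsRDD` is never materialized by an action, so `call` may never even run. The usual fix is to express the whole computation as a single transformation, roughly `personDS.map(mapper, personDtlsEncoder)` in Spark. Below is a minimal plain-Java sketch of that shape (no Spark dependency; the class and field names are simplified stand-ins for the beans in the post):

```java
import java.io.Serializable;
import java.math.BigDecimal;
import java.util.List;
import java.util.function.Function;
import java.util.stream.Collectors;

public class DerivedDatasetSketch {
    // Minimal stand-ins for the Person / PersonDetails beans from the post.
    static class Person implements Serializable {
        private static final long serialVersionUID = 1L;
        String name;
        Person(String name) { this.name = name; }
    }

    static class PersonDetails implements Serializable {
        private static final long serialVersionUID = 1L;
        String name;
        BigDecimal sal;
    }

    // The mapper as a pure function: it RETURNS the new record instead of
    // trying to union it into an outer dataset from inside the closure.
    static final Function<Person, PersonDetails> TO_DETAILS = p -> {
        PersonDetails d = new PersonDetails();
        d.name = p.name;
        d.sal = new BigDecimal(1);
        return d;
    };

    public static void main(String[] args) {
        List<Person> personList = List.of(new Person("a"), new Person("b"));
        // In Spark, one transformation replaces the union-inside-map attempt:
        //   Dataset<PersonDetails> personDtlsDS =
        //       personDS.map((MapFunction<Person, PersonDetails>) TO_DETAILS::apply,
        //                    personDtlsEncoder);
        List<PersonDetails> details =
                personList.stream().map(TO_DETAILS).collect(Collectors.toList());
        System.out.println(details.size()); // 2
    }
}
```

The key design point is that the derived dataset is the return value of the transformation, not a side effect performed inside it.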
Re: Help Required on Spark - Convert DataFrame to List without using collect
Hi,

Thank you all. Here is my requirement: I have a DataFrame which contains a list of rows retrieved from an Oracle table. I need to iterate over the DataFrame, fetch each record, and call a common function, passing a few parameters.

The issue I am facing is that I am not able to call the common function:

    JavaRDD<Person> personRDD = person_dataframe.toJavaRDD().map(new Function<Row, Person>() {
        @Override
        public Person call(Row row) throws Exception {
            Person person = new Person();
            person.setId(row.getDecimal(0).longValue());
            person.setName(row.getString(1));
            personLst.add(person);
            return person;
        }
    });

    personRDD.foreach(new VoidFunction<Person>() {
        private static final long serialVersionUID = 1123456L;
        @Override
        public void call(Person person) throws Exception {
            System.out.println(person.getId());
            // here I tried to call the common function
        }
    });

I am able to print data in the foreach loop; however, if I try to call the common function it gives me the below error.

Error message: org.apache.spark.SparkException: Task not serializable

I kindly request you to share some ideas (sample code / links to refer to) on how to call a common function/interface method, passing the values in each record of the DataFrame.

Regards,
Sunitha

On Tue, Dec 19, 2017 at 1:20 PM, Weichen Xu <weichen...@databricks.com> wrote:

> Hi Sunitha,
>
> In the mapper function, you cannot update outer variables such as
> `personLst.add(person)`; this won't work, so that's the reason you got an
> empty list.
>
> You can use `rdd.collect()` to get a local list of `Person` objects
> first, then you can safely iterate on the local list and do any update you
> want.
>
> Thanks.
>
> On Tue, Dec 19, 2017 at 2:16 PM, Sunitha Chennareddy <
> chennareddysuni...@gmail.com> wrote:
>
>> Hi Deepak,
>>
>> I am able to map a row to the Person class; the issue is I want to call
>> another method. I tried converting to a list and it is not working
>> without using collect.
>>
>> Regards,
>> Sunitha
>>
>> [earlier quoted messages trimmed]
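The `Task not serializable` error above means the closure passed to `foreach` captures something Spark cannot serialize, most likely the object that owns the common function: Spark must serialize the function together with everything it references before shipping it to an executor. The stdlib-only sketch below reproduces that failure mode and the usual fix, without Spark; the `CommonService` and `SerializableService` names are invented for illustration.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.function.Consumer;

public class ClosureSerializationSketch {
    // A service that is NOT Serializable: capturing it in a shipped closure
    // is exactly what triggers "Task not serializable" in Spark.
    static class CommonService {
        void process(long id) { System.out.println("processing " + id); }
    }

    // The usual fix: make the captured helper Serializable (alternatives in
    // Spark: construct it inside the closure, or use a broadcast variable).
    static class SerializableService implements Serializable {
        private static final long serialVersionUID = 1L;
        void process(long id) { System.out.println("processing " + id); }
    }

    // A Consumer that is also Serializable, like Spark's function interfaces.
    interface SerConsumer<T> extends Consumer<T>, Serializable {}

    // Serialize an object the way Spark serializes a task closure.
    static byte[] serialize(Object o) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(o);
        }
        return bos.toByteArray();
    }

    public static void main(String[] args) throws Exception {
        CommonService bad = new CommonService();
        SerConsumer<Long> badClosure = id -> bad.process(id); // captures `bad`
        try {
            serialize(badClosure);
            System.out.println("unexpected: serialized");
        } catch (NotSerializableException e) {
            System.out.println("NotSerializableException, as Spark would report");
        }

        SerializableService good = new SerializableService();
        SerConsumer<Long> goodClosure = id -> good.process(id);
        serialize(goodClosure); // succeeds: everything captured is Serializable
        System.out.println("serializable closure shipped fine");
    }
}
```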
Re: Help Required on Spark - Convert DataFrame to List without using collect
Hi Jörn,

In my case I have to call a common interface function, passing the values of each RDD record. So I have tried iterating, but I was not able to trigger the common function from the call method, as commented in the code snippet in my earlier mail. I request you to please share your views.

Regards,
Sunitha

On Tuesday, December 19, 2017, Jörn Franke <jornfra...@gmail.com> wrote:

> This is correct behavior. If you need to call another method, simply
> append another map, flatMap or whatever you need.
>
> Depending on your use case you may also use reduce and reduceByKey.
> However, you should never (!) use a global variable as in your snippet.
> This cannot work, because you are working in a distributed setting.
> The code will probably fail on a cluster or at random.
>
> On 19. Dec 2017, at 07:16, Sunitha Chennareddy <
> chennareddysuni...@gmail.com> wrote:
>
> Hi Deepak,
>
> I am able to map a row to the Person class; the issue is I want to call
> another method. I tried converting to a list and it is not working
> without using collect.
>
> Regards,
> Sunitha
>
> [earlier quoted messages trimmed]
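Jörn's suggestion can be sketched with `java.util.stream` standing in for the RDD API: append another `map` that applies the common function per record, instead of mutating outer state from inside `call`. In Spark this would be roughly `personRDD.map(p -> commonFunction(p))`; the names below are assumed for illustration.

```java
import java.io.Serializable;
import java.util.List;
import java.util.stream.Collectors;

public class ChainedMapSketch {
    // Minimal stand-in for the Person bean from the thread.
    static class Person implements Serializable {
        private static final long serialVersionUID = 1L;
        long id;
        String name;
        Person(long id, String name) { this.id = id; this.name = name; }
    }

    // The "common function" from the thread, written as a pure per-record
    // step; the name, signature and return type are assumed.
    static String commonFunction(Person p) {
        return p.id + ":" + p.name;
    }

    public static void main(String[] args) {
        List<Person> rows = List.of(new Person(1, "a"), new Person(2, "b"));
        // In Spark: personRDD.map(ChainedMapSketch::commonFunction) -- i.e.
        // append another map rather than touching outer variables in call().
        List<String> results = rows.stream()
                .map(ChainedMapSketch::commonFunction)
                .collect(Collectors.toList());
        System.out.println(results); // [1:a, 2:b]
    }
}
```

Because each step only consumes its input record and returns a value, the chained function serializes and distributes cleanly.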
Re: Help Required on Spark - Convert DataFrame to List without using collect
Hi Deepak,

I am able to map a row to the Person class; the issue is I want to call another method. I tried converting to a list and it is not working without using collect.

Regards,
Sunitha

On Tuesday, December 19, 2017, Deepak Sharma <deepakmc...@gmail.com> wrote:

> I am not sure about Java, but in Scala it would be something like
> df.rdd.map{ x => MyClass(x.getString(0), ...) }
>
> HTH
>
> --Deepak
>
> On Dec 19, 2017 09:25, "Sunitha Chennareddy" <chennareddysunitha@.com
> <chennareddysuni...@gmail.com>> wrote:
>
> [original message trimmed; quoted in full below]
Help Required on Spark - Convert DataFrame to List without using collect
Hi All,

I am new to Spark. I want to convert a DataFrame to a List without using collect().

The main requirement is that I need to iterate through the rows of the DataFrame and call another function, passing a column value of each row (person.getId()).

Here is the snippet I have tried. Kindly help me to resolve the issue; personLst is returning 0:

    List<Person> personLst = new ArrayList<>();
    JavaRDD<Person> personRDD = person_dataframe.toJavaRDD().map(new Function<Row, Person>() {
        public Person call(Row row) throws Exception {
            Person person = new Person();
            person.setId(row.getDecimal(0).longValue());
            person.setName(row.getString(1));

            personLst.add(person);
            // here I tried to call another function but control never passed
            return person;
        }
    });
    logger.info("personLst size == " + personLst.size());
    logger.info("personRDD count === " + personRDD.count());

    // output is
    // personLst size == 0
    // personRDD count === 3
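Why `personLst size == 0` while `personRDD count === 3`: Spark serializes the mapper closure and runs deserialized copies of it on the executors, so `personLst.add(person)` updates a copy of the list, never the driver's original. The stdlib-only demonstration below simulates that ship-to-executor round trip with plain Java serialization (no Spark dependency).

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class ClosureCopySketch {
    // A Consumer that is also Serializable, like Spark's function interfaces.
    interface SerConsumer<T> extends Consumer<T>, Serializable {}

    // Round-trip an object through Java serialization, the way Spark does
    // when it ships a task closure to an executor.
    @SuppressWarnings("unchecked")
    static <T> T shipToExecutor(T closure) throws IOException, ClassNotFoundException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(closure);
        }
        try (ObjectInputStream ois =
                new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()))) {
            return (T) ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        List<String> personLst = new ArrayList<>();   // the "driver-side" list
        SerConsumer<String> mapper = personLst::add;  // closure captures the list

        SerConsumer<String> executorCopy = shipToExecutor(mapper);
        executorCopy.accept("person-1");              // "runs on the executor"
        executorCopy.accept("person-2");
        executorCopy.accept("person-3");

        // The executor's deserialized copy of the list grew; the driver's did not.
        System.out.println("driver list size = " + personLst.size()); // 0
    }
}
```

This is exactly why the advice in the replies is either to `collect()` the RDD to a local list on the driver and iterate there, or to express the per-record work as another transformation whose results Spark returns to you.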