[ 
https://issues.apache.org/jira/browse/SPARK-19751?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Wenchen Fan resolved SPARK-19751.
---------------------------------
       Resolution: Fixed
    Fix Version/s: 2.2.0

Issue resolved by pull request 17188
[https://github.com/apache/spark/pull/17188]

> Create Data frame API fails with a self referencing bean
> --------------------------------------------------------
>
>                 Key: SPARK-19751
>                 URL: https://issues.apache.org/jira/browse/SPARK-19751
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.1.0
>            Reporter: Avinash Venkateshaiah
>            Priority: Minor
>             Fix For: 2.2.0
>
>
> The createDataset API throws a StackOverflowError when we try creating a
> Dataset using a bean encoder whose bean type is self-referencing.
> BEAN:
> public class HierObj implements Serializable {
>     String name;
>     List<HierObj> children;
>     public String getName() {
>         return name;
>     }
>     public void setName(String name) {
>         this.name = name;
>     }
>     public List<HierObj> getChildren() {
>         return children;
>     }
>     public void setChildren(List<HierObj> children) {
>         this.children = children;
>     }
> }
> // create a self-referencing object
>         HierObj hierObj = new HierObj();
>         hierObj.setName("parent");
>         List<HierObj> children = new ArrayList<>();
>         HierObj child1 = new HierObj();
>         child1.setName("child1");
>         HierObj child2 = new HierObj();
>         child2.setName("child2");
>         children.add(child1);
>         children.add(child2);
>         hierObj.setChildren(children);
> // create a dataset
>         Dataset<HierObj> ds = sparkSession().createDataset(Arrays.asList(hierObj),
>                 Encoders.bean(HierObj.class));
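The overflow arises because bean-encoder schema inference walks a bean's property types recursively; when a property's type leads back to the bean's own class, that walk never bottoms out. A minimal, Spark-free sketch of the mechanism (the helper name isSelfReferencing and the cycle-detection approach are illustrative, not Spark's actual implementation) using standard JavaBeans introspection:

```java
import java.beans.IntrospectionException;
import java.beans.Introspector;
import java.beans.PropertyDescriptor;
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class SelfRefSketch {
    // Same shape as the bean from the bug report.
    public static class HierObj implements java.io.Serializable {
        String name;
        List<HierObj> children;
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
        public List<HierObj> getChildren() { return children; }
        public void setChildren(List<HierObj> children) { this.children = children; }
    }

    // Returns true if walking the bean's property types would revisit a class
    // already on the current path -- i.e. naive recursion would never terminate.
    public static boolean isSelfReferencing(Class<?> cls, Set<Class<?>> path) {
        if (!path.add(cls)) {
            return true; // cycle: this class is already being inferred
        }
        try {
            for (PropertyDescriptor pd :
                    Introspector.getBeanInfo(cls, Object.class).getPropertyDescriptors()) {
                Class<?> t = pd.getPropertyType();
                if (t != null && List.class.isAssignableFrom(t) && pd.getReadMethod() != null) {
                    // Recurse into the List's element type, taken from the getter's
                    // generic return type (e.g. List<HierObj> -> HierObj).
                    Type rt = pd.getReadMethod().getGenericReturnType();
                    if (rt instanceof ParameterizedType) {
                        Type arg = ((ParameterizedType) rt).getActualTypeArguments()[0];
                        if (arg instanceof Class && isSelfReferencing((Class<?>) arg, path)) {
                            return true;
                        }
                    }
                } else if (t != null && !t.isPrimitive() && t != String.class) {
                    if (isSelfReferencing(t, path)) {
                        return true;
                    }
                }
            }
        } catch (IntrospectionException e) {
            throw new RuntimeException(e);
        }
        path.remove(cls); // backtrack: cls is fine on other paths
        return false;
    }

    public static void main(String[] args) {
        // HierObj's "children" property is List<HierObj>, so the walk
        // immediately revisits HierObj and the check reports a cycle.
        System.out.println(isSelfReferencing(HierObj.class, new HashSet<>()));
    }
}
```

A real encoder must either track visited types like this and fail fast with a clear error, or cap the recursion depth; without one of those guards the schema walk recurses until the stack is exhausted, which matches the reported StackOverflowError.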



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
