[ https://issues.apache.org/jira/browse/SPARK-15577?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15563910#comment-15563910 ]
Jakob Odersky edited comment on SPARK-15577 at 10/10/16 11:41 PM:
------------------------------------------------------------------

This was considered and the trade-offs were actively discussed, but ultimately the type alias was chosen over subclassing. I think the principal argument in favor of aliasing was to avoid incompatibilities in future libraries, e.g. a utility function is written to accept a {{DataFrame}}, but I want to pass in a {{Dataset\[Row\]}}. [This email thread|http://apache-spark-developers-list.1001551.n3.nabble.com/discuss-DataFrame-vs-Dataset-in-Spark-2-0-td16445.html] contains the whole discussion.

> Java can't import DataFrame type alias
> --------------------------------------
>
>                 Key: SPARK-15577
>                 URL: https://issues.apache.org/jira/browse/SPARK-15577
>             Project: Spark
>          Issue Type: Improvement
>          Components: Java API, SQL
>    Affects Versions: 2.0.0
>            Reporter: holdenk
>
> After SPARK-13244, all Java code needs to be updated to use Dataset<Row>
> instead of DataFrame as we used a type alias. Should we consider adding a
> DataFrame to the Java API which just extends Dataset<Row> for compatibility?
> cc [~liancheng] ?

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
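A minimal sketch of the interoperability argument above, using simplified stand-in types rather than Spark's actual classes (the names {{Row}}, {{Dataset}}, and {{utility}} here are illustrative only): because {{type DataFrame = Dataset\[Row\]}} is an alias, the two names denote the identical type, so a function written against one accepts the other with no conversion. A subclass would only give one direction of compatibility.

```scala
// Hypothetical, simplified model of the alias-vs-subclass trade-off.
object AliasVsSubclass {
  final case class Row(values: Seq[Any])
  class Dataset[T](val data: Seq[T])

  // Aliasing: DataFrame *is* Dataset[Row]; both names are interchangeable.
  type DataFrame = Dataset[Row]

  // A library utility written against DataFrame...
  def utility(df: DataFrame): Int = df.data.size

  def main(args: Array[String]): Unit = {
    // ...accepts a value typed as Dataset[Row] directly.
    val ds: Dataset[Row] = new Dataset(Seq(Row(Seq(1)), Row(Seq(2))))
    println(utility(ds)) // prints 2: no conversion needed

    // Had DataFrame instead been `class DataFrame extends Dataset[Row]`,
    // the call above would not compile: a plain Dataset[Row] is not a
    // DataFrame, so every such call site would need an explicit wrap.
  }
}
```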