[ https://issues.apache.org/jira/browse/SPARK-5426?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Kuldeep updated SPARK-5426:
---------------------------
    Description: 
DataFrame, previously SchemaRDD, is not directly Java compatible. But

  (was: Here is a sample JUnit test for JavaApplySchemaSuite.java:

{code:title=JavaApplySchemaSuite.java|borderStyle=solid}
@Test
public void schemaRDDOperations() {
  List<Person> personList = new ArrayList<Person>(2);
  Person person1 = new Person();
  person1.setName("Michael");
  person1.setAge(29);
  personList.add(person1);
  Person person2 = new Person();
  person2.setName("Yin");
  person2.setAge(28);
  personList.add(person2);

  // Build a JavaRDD<Row> from the bean list.
  JavaRDD<Row> rowRDD = javaCtx.parallelize(personList).map(
      new Function<Person, Row>() {
        public Row call(Person person) throws Exception {
          return RowFactory.create(person.getName(), person.getAge());
        }
      });

  List<StructField> fields = new ArrayList<StructField>(2);
  fields.add(DataTypes.createStructField("name", DataTypes.StringType, false));
  fields.add(DataTypes.createStructField("age", DataTypes.IntegerType, false));
  StructType schema = DataTypes.createStructType(fields);

  // applySchema takes a Scala RDD, so the JavaRDD has to be unwrapped with rdd().
  SchemaRDD schemaRDD = javaSqlCtx.applySchema(rowRDD.rdd(), schema);
  schemaRDD.registerTempTable("people");

  // The Scala SchemaRDD returned by sql() has to be bridged back to a JavaRDD
  // before a Java Function can be mapped over it.
  List<String> actual = javaSqlCtx.sql("SELECT * FROM people").toJavaRDD().map(
      new Function<Row, String>() {
        public String call(Row row) {
          // age is an IntegerType column, so read it with getInt, not getString.
          return row.getString(0) + "_" + row.getInt(1);
        }
      }).collect();

  List<String> expected = new ArrayList<String>(2);
  expected.add("Michael_29");
  expected.add("Yin_28");
  Assert.assertEquals(expected, actual);
}
{code})


> SQL Java API helper methods
> ---------------------------
>
>     Key: SPARK-5426
>     URL: https://issues.apache.org/jira/browse/SPARK-5426
>     Project: Spark
>     Issue Type: Bug
>     Components: SQL
>     Affects Versions: 1.3.0
>     Reporter: Kuldeep
>     Priority: Minor
>
> DataFrame, previously SchemaRDD, is not directly Java compatible. But
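For comparison, below is a minimal sketch of the same scenario written against the 1.3 Java-facing entry points (SQLContext.createDataFrame(JavaRDD<Row>, StructType) and DataFrame.collectAsList()). It is illustrative only, not the helper methods this issue asks for; the class name and the local-mode context setup are made up for the example.

{code:title=DataFrameFromJavaSketch.java|borderStyle=solid}
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.SQLContext;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.StructField;
import org.apache.spark.sql.types.StructType;

public class DataFrameFromJavaSketch {
  public static void main(String[] args) {
    // Hypothetical local-mode contexts; a real application would obtain these however it normally does.
    JavaSparkContext jsc = new JavaSparkContext("local", "DataFrameFromJavaSketch");
    SQLContext sqlCtx = new SQLContext(jsc);

    JavaRDD<Row> rowRDD = jsc.parallelize(Arrays.asList(
        RowFactory.create("Michael", 29),
        RowFactory.create("Yin", 28)));

    List<StructField> fields = new ArrayList<StructField>(2);
    fields.add(DataTypes.createStructField("name", DataTypes.StringType, false));
    fields.add(DataTypes.createStructField("age", DataTypes.IntegerType, false));
    StructType schema = DataTypes.createStructType(fields);

    // createDataFrame accepts a JavaRDD<Row> directly, so no rdd() unwrapping is needed.
    DataFrame people = sqlCtx.createDataFrame(rowRDD, schema);
    people.registerTempTable("people");

    // collectAsList() hands back a java.util.List<Row> rather than a Scala Array.
    for (Row row : sqlCtx.sql("SELECT name, age FROM people").collectAsList()) {
      System.out.println(row.getString(0) + "_" + row.getInt(1));
    }

    jsc.stop();
  }
}
{code}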