Perfect. The API in Java is a bit clumsy, though.

What I ended up doing in Java (the val is from Lombok, if anyone's
wondering):

        val attributes = JavaConversions.asJavaCollection(schema.toAttributes())
                .stream()
                .map(Attribute::toAttribute)
                .collect(Collectors.toList());
        val encoder = RowEncoder.apply(schema)
                .resolveAndBind(ScalaUtils.scalaSeq(attributes), SimpleAnalyzer$.MODULE$);
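Putting the two replies together, a complete round trip would look roughly like the untested sketch below. It is written against the Spark 2.x catalyst developer API (which may change between releases); the unchecked cast is one way to stand in for the `Attribute::toAttribute` mapping above, since Scala's default arguments and `Seq` covariance aren't visible from Java:

```java
// Untested sketch against the Spark 2.x catalyst API (a developer-internal
// surface that may change between releases; requires spark-catalyst on the
// classpath).
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.catalyst.InternalRow;
import org.apache.spark.sql.catalyst.analysis.SimpleAnalyzer$;
import org.apache.spark.sql.catalyst.encoders.ExpressionEncoder;
import org.apache.spark.sql.catalyst.encoders.RowEncoder;
import org.apache.spark.sql.catalyst.expressions.Attribute;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.StructType;
import scala.collection.Seq;

public class RowRoundTrip {
    @SuppressWarnings("unchecked")
    public static void main(String[] args) {
        StructType schema = new StructType().add("id", DataTypes.IntegerType);

        // Scala's default arguments on resolveAndBind are not callable from
        // Java, so the attributes and analyzer are passed explicitly. The
        // double cast works around Java not seeing Scala's covariance:
        // toAttributes() returns Seq<AttributeReference>, not Seq<Attribute>.
        Seq<Attribute> attributes = (Seq<Attribute>) (Seq<?>) schema.toAttributes();
        ExpressionEncoder<Row> encoder = RowEncoder.apply(schema)
                .resolveAndBind(attributes, SimpleAnalyzer$.MODULE$);

        // fromRow() only works after the encoder has been resolved and bound.
        InternalRow internalRow = encoder.toRow(RowFactory.create(1));
        Row roundTrip = encoder.fromRow(internalRow);
        System.out.println("Round trip: " + roundTrip.getInt(0));
    }
}
```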


-------
Regards,
Andy

On Thu, Jan 5, 2017 at 2:53 AM, Liang-Chi Hsieh <vii...@gmail.com> wrote:

>
> You need to resolve and bind the encoder.
>
> ExpressionEncoder<Row> encoder = RowEncoder.apply(struct).resolveAndBind();
>
>
> Andy Dang wrote
> > Hi all,
> > (cc-ing dev since I've hit some developer API corner)
> >
> > What's the best way to convert an InternalRow to a Row if I've got an
> > InternalRow and the corresponding schema?
> >
> > Code snippet:
> >     @Test
> >     public void foo() throws Exception {
> >         Row row = RowFactory.create(1);
> >         StructType struct = new StructType().add("id", DataTypes.IntegerType);
> >         ExpressionEncoder<Row> encoder = RowEncoder.apply(struct);
> >         InternalRow internalRow = encoder.toRow(row);
> >         System.out.println("Internal row size: " + internalRow.numFields());
> >         Row roundTrip = encoder.fromRow(internalRow);
> >         System.out.println("Round trip: " + roundTrip.size());
> >     }
> >
> > The code fails at the line encoder.fromRow() with the exception:
> >> Caused by: java.lang.UnsupportedOperationException: Cannot evaluate
> > expression: getcolumnbyordinal(0, IntegerType)
> >
> > -------
> > Regards,
> > Andy
>
>
>
>
>
> -----
> Liang-Chi Hsieh | @viirya
> Spark Technology Center
> http://www.spark.tc/
>
>
