Hi Sung,

Private fields are only supported if you also specify corresponding getters and setters. Otherwise you need to use `Row.class` and perform the mapping manually in a subsequent map() function (e.g. via reflection).
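For example, the POJO could look like the following untested sketch (assuming the `name`/`value` fields from your class P). The second snippet shows the `Row.class` alternative and reuses the `tEnv` and `aa` variables from your example:

—————
import java.io.Serializable;

// POJO with private fields: the Table API needs a public no-argument
// constructor plus public getters and setters for every field.
public class P implements Serializable {
    private String name;
    private Integer value;

    public P() {}

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    public Integer getValue() { return value; }
    public void setValue(Integer value) { this.value = value; }
}
—————
// Alternative: convert to Row first and map to the POJO manually.
DataStream<Row> rows = tEnv.toAppendStream(aa, Row.class);
DataStream<P> stream = rows
        .map(row -> {
            P p = new P();
            p.setName((String) row.getField(0));
            p.setValue((Integer) row.getField(1));
            return p;
        })
        .returns(P.class); // lambdas lose generic type info, so declare the result type
—————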

Regards,
Timo


Am 29.04.19 um 15:44 schrieb Sung Gon Yi:
According to https://ci.apache.org/projects/flink/flink-docs-release-1.8/dev/table/common.html#convert-a-table-into-a-datastream-or-dataset,
the POJO data type can be used when converting a Table into a DataStream.

I would like to use a POJO class with private fields. I wonder whether this is officially possible or not; currently it does not work.

Code:
—————
CsvTableSource as = CsvTableSource.builder()
         .path("aa.csv")
         .field("name", STRING)
         .field("value", INT)
         .build();
Table aa = tEnv.fromTableSource(as);
tEnv.toAppendStream(aa, P.class);
—————
public class P implements Serializable {
     private String name;
     private Integer value;
}
—————

With the above code, I got the error message below:
==========
Exception in thread "main" org.apache.flink.table.api.TableException: Arity [2] of result [ArrayBuffer(String, Integer)] does not match the number[1] of requested type [GenericType<aa.P>].
	at org.apache.flink.table.api.TableEnvironment.generateRowConverterFunction(TableEnvironment.scala:1165)
	at org.apache.flink.table.api.StreamTableEnvironment.getConversionMapper(StreamTableEnvironment.scala:423)
	at org.apache.flink.table.api.StreamTableEnvironment.translate(StreamTableEnvironment.scala:936)
	at org.apache.flink.table.api.StreamTableEnvironment.translate(StreamTableEnvironment.scala:866)
	at org.apache.flink.table.api.java.StreamTableEnvironment.toAppendStream(StreamTableEnvironment.scala:202)
	at org.apache.flink.table.api.java.StreamTableEnvironment.toAppendStream(StreamTableEnvironment.scala:156)
	at ...
==========

When the fields of class P are changed to public, it works well.
—————
public class P implements Serializable {
     public String name;
     public Integer value;
}
—————

Thanks,
skonmeme
