Re: Protobuf schema provider row functions break on CamelCased field names

2021-08-23 Thread Chris Hinds
8:15, Reuven Lax <re...@google.com> wrote: Definitely happy to look at a PR. On Tue, Aug 10, 2021 at 10:11 AM Chris Hinds <chris.hi...@bdi.ox.ac.uk> wrote: I created an issue for this: https://issues.apache.org/jira/browse/BEAM-12736 I also took a stab at a fix. Would y…

Re: Protobuf schema provider row functions break on CamelCased field names

2021-08-10 Thread Chris Hinds
I created an issue for this: https://issues.apache.org/jira/browse/BEAM-12736 I also took a stab at a fix. Would you accept a pull request? Or, I'd be happy to discuss. Cheers, Chris. On 9 Aug 2021, at 21:02, Chris Hinds <chris.hi...@bdi.ox.ac.uk> wrote: Haha, it probabl…

Re: Protobuf schema provider row functions break on CamelCased field names

2021-08-09 Thread Chris Hinds
Aug 9, 2021 at 10:57 AM Chris Hinds <chris.hi...@bdi.ox.ac.uk> wrote: Hi, I get an IllegalArgumentException when I call a row function against a proto instance. SerializableFunction myRowFunction = new ProtoMessageSchema().toRowFunction(new TypeDescriptor() {}); MyDataModel.ProtoPayl…

Protobuf schema provider row functions break on CamelCased field names

2021-08-09 Thread Chris Hinds
Hi, I get an IllegalArgumentException when I call a row function against a proto instance. SerializableFunction myRowFunction = new ProtoMessageSchema().toRowFunction(new TypeDescriptor() {}); MyDataModel.ProtoPayload payload = … Row row = (Row) myRowFunction.apply(payload); It looks like ther…
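The thread title suggests the failure comes from the naming convention mismatch between proto field names and the generated Java accessors: protobuf maps snake_case field names to lowerCamelCase getters, so a schema provider that derives field names from one convention and looks them up in the other can fail when the .proto file already uses CamelCased names. This is a minimal self-contained sketch of that mapping, not Beam's actual implementation; the class and method names are hypothetical.

```java
// Hypothetical illustration of the field-name mapping that can break on
// CamelCased proto fields. Protobuf converts snake_case field names to
// lowerCamelCase for Java accessors; reversing that mapping only round-trips
// cleanly when the original field name was snake_case.
public class FieldNameMapping {

    // Mirrors protobuf's snake_case -> lowerCamelCase rule
    // (e.g. "sample_id" -> "sampleId").
    static String toCamelCase(String snake) {
        StringBuilder sb = new StringBuilder();
        boolean upperNext = false;
        for (char c : snake.toCharArray()) {
            if (c == '_') { upperNext = true; continue; }
            sb.append(upperNext ? Character.toUpperCase(c) : c);
            upperNext = false;
        }
        return sb.toString();
    }

    // Reverse mapping ("sampleId" -> "sample_id").
    static String toSnakeCase(String camel) {
        StringBuilder sb = new StringBuilder();
        for (char c : camel.toCharArray()) {
            if (Character.isUpperCase(c)) {
                if (sb.length() > 0) sb.append('_');
                sb.append(Character.toLowerCase(c));
            } else {
                sb.append(c);
            }
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // A snake_case field name survives the round trip...
        System.out.println(toCamelCase(toSnakeCase(toCamelCase("sample_id"))));
        // ...but a CamelCased proto field name does not: "MyField" comes
        // back as "myField", so a lookup by the original name fails.
        System.out.println(toCamelCase(toSnakeCase("MyField")));
    }
}
```

The point of the sketch is only that the mapping is lossy for names that were not snake_case to begin with, which is consistent with the exception being raised for CamelCased fields but not for conventionally named ones.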

Re: Building a Schema from a file

2021-06-23 Thread Chris Hinds
Jun 2021, at 07:53, Chris Hinds <chris.hi...@bdi.ox.ac.uk> wrote: Hey Matthew, I got into a pickle a while back on something similar. I had a pre-existing Proto defn which I wanted to use as a Beam Schema. This worked like complete magic until the sync. When I tried to write resu…

Re: Building a Schema from a file

2021-06-22 Thread Chris Hinds
Hey Matthew, I got into a pickle a while back on something similar. I had a pre-existing Proto defn which I wanted to use as a Beam Schema. This worked like complete magic until the sync. When I tried to write results as Parquet or BigQuery I discovered that my proto schema provider was using t…

ParquetIO.sink fails on embedded Flink runner in Java classic

2020-11-03 Thread Chris Hinds
Hey, Flink is super useful for Beam development, but I’m having trouble writing data to Parquet. Everything works fine on DirectRunner, DataflowRunner, and FlinkRunner against a local cluster (1.10.2). However, when I use FlinkRunner in embedded mode, only a subset of my data arrive on the file…