This doesn't explain WHY it happened, but I was able to resolve it like this:

// Look the field up in the parent record's schema to recover its declared type
Optional<RecordField> field = fieldValue.getParentRecord().get()
        .getSchema()
        .getField(fieldValue.getField().getFieldName());
RecordField recordField = field.get();
value = DataTypeUtils.convertType(value, recordField.getDataType(),
        recordField.getFieldName());
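One caveat with the snippet above: it calls get() on both Optionals, which throws NoSuchElementException if the parent record or the field is absent. A minimal sketch of the safer pattern with plain java.util.Optional (the string value here is just a stand-in for the NiFi lookup, not real API):

```java
import java.util.Optional;

public class OptionalChainDemo {
    public static void main(String[] args) {
        // Stand-in for getSchema().getField(...), which may come back empty
        Optional<String> field = Optional.of("creationDateTime");

        // orElseThrow gives a descriptive error instead of a bare
        // NoSuchElementException from a raw get() call
        String name = field.orElseThrow(
                () -> new IllegalStateException("Field not found in schema"));

        System.out.println(name);
    }
}
```

In the NiFi code the same orElseThrow call on the Optional<RecordField> would make a missing-field failure much easier to diagnose than the raw get() chain.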

On Sun, Jul 28, 2019 at 10:19 AM Mike Thomsen <mikerthom...@gmail.com>
wrote:

> I have a simple avro schema in a test case that looks like this:
>
> {
>     "type": "record",
>     "name": "PersonRecord",
>     "fields": [
>         { "name": "firstName", "type": "string" },
>         { "name": "lastName", "type": "string" },
>         { "name": "creationDateTime", "type": [ "null", "type": "long",
> "logicalType": "timestamp-millis" }]
>     ]
> }
>
> Then I try something like this...
>
> RecordPath path = recordPathCache.getCompiled("/creationDateTime");
> RecordPathResult rp = path.evaluate(targetRecord);
> Optional<FieldValue> nodeField = rp.getSelectedFields().findFirst();
>
> if (!nodeField.isPresent()) {
>     throw new ProcessException("...");
> }
>
> FieldValue fieldValue = nodeField.get();
> //fieldValue.getField() is a Choice of String, Record
>
> Is there a way to get the correct field type here? I assume that
> Choice[String, Record] default here was done to facilitate schema inference.
>
> Thanks,
>
> Mike
>