Hi,

NiFi version 1.7

We have a dataflow that gets data from an Oracle database and loads it into Hive
tables.

The data flow is roughly:
GenerateTableFetch -> ExecuteSQL -> ConvertAvroToORC/ConvertAvroToJSON (we tried both) -> PutHDFS ->
ListHDFS -> ReplaceText (to build the LOAD DATA query from the file) -> PutHiveQL.
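For reference, a minimal sketch of the kind of LOAD DATA statement the ReplaceText step builds for PutHiveQL; the HDFS path and table name below are hypothetical placeholders, not the real ones from our flow:

```python
# Sketch of the HiveQL statement ReplaceText assembles for PutHiveQL.
# The path and table name are placeholders for illustration only.
hdfs_path = "/staging/oracle_extract/part-0000.orc"  # hypothetical path
table = "cpy_table"                                  # hypothetical table

hql = f"LOAD DATA INPATH '{hdfs_path}' INTO TABLE {table}"
print(hql)
```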

Data at the source (e.g., column "cpyKey" of Oracle type NUMBER), which is in number/int format, is being written with this Avro schema:
{"type":"record","name":"NiFi_ExecuteSQL_Record","namespace":"any.data","fields":[{"name":"cpyKey","type":["null",{"type":"bytes","logicalType":"decimal","precision":10,"scale":0}]}]}
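As we understand it, Avro's decimal logical type stores the unscaled value as big-endian two's-complement bytes, which is why the NUMBER column arrives as bytes rather than long. A rough illustration of that encoding (our own sketch, not NiFi code):

```python
def decimal_to_avro_bytes(unscaled: int) -> bytes:
    """Encode an unscaled decimal value roughly the way Avro's bytes/decimal
    logical type does: minimal-length big-endian two's-complement."""
    length = max(1, (unscaled.bit_length() + 8) // 8)
    return unscaled.to_bytes(length, byteorder="big", signed=True)

# The value 42 is stored as the single byte 0x2a, i.e. opaque "binary"
# to a reader that expects a plain long/bigint.
print(decimal_to_avro_bytes(42))  # b'*'
```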

When this is inserted into the Hive table, whether the data is loaded from the ORC
file (ConvertAvroToORC) or the JSON file (ConvertAvroToJSON), querying the data from
Hive throws a parsing exception due to incompatible data types.


Error: java.io.IOException: java.lang.RuntimeException: ORC split generation 
failed with exception: java.lang.IllegalArgumentException: ORC does not support 
type conversion from file type binary (1) to reader type bigint (1) 
(state=,code=0)
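The error seems consistent with the schema above: the ORC reader is asked to read a binary (bytes) column into a bigint. The bytes themselves decode cleanly back to the numeric value; the mismatch appears to be purely in the declared types. A sketch of that decoding, assuming Avro's documented big-endian two's-complement layout:

```python
from decimal import Decimal

def avro_bytes_to_decimal(raw: bytes, scale: int) -> Decimal:
    # Avro decimal: big-endian two's-complement unscaled value,
    # then shift by the declared scale.
    unscaled = int.from_bytes(raw, byteorder="big", signed=True)
    return Decimal(unscaled).scaleb(-scale)

print(avro_bytes_to_decimal(b"*", 0))         # 42
print(avro_bytes_to_decimal(b"\x04\xd2", 2))  # 12.34
```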

Appreciate any help on this.

Thanks,
Ravi Papisetti
