Hello Ravi,

When you wget this URL (note the quotes, so the shell does not treat each & as a command separator):

wget 'http://<api_server>:9091/schema?name=ed&store=parquet&isMutated=true&table=ed&secbypass=testing'

Do you get an avsc file back?
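
If the plain wget works, it is also worth checking the HTTP status code: the java.io.FileNotFoundException in your trace comes from HttpURLConnection, which typically raises it when the server answers with a 404 instead of the schema. A quick check (a sketch, using the same placeholder endpoint):

curl -sS -o ed.avsc -w '%{http_code}\n' \
  'http://<api_server>:9091/schema?name=ed&store=parquet&isMutated=true&table=ed&secbypass=testing'

If this prints 200 and ed.avsc contains the schema JSON, the endpoint itself is fine and the problem is on the Hive side.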

Regards,

Jagat Singh

On Sat, 27 Jun 2020, 7:01 am ravi kanth, <ravi....@gmail.com> wrote:

> Just want to follow up on the below email.
>
> Thanks,
> Ravi
>
>
> On Thu, Jun 25, 2020 at 5:37 PM ravi kanth <ravi....@gmail.com> wrote:
>
>> Hi Community,
>>
>> Hive Version: 3.1.2
>>
>> We are working on building a Hive Avro table on top of a few Avro files. I
>> can successfully create the table and query it when the Avro schema
>> definition (avsc) file is on HDFS.
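>>
>> For reference, the working variant looks like this, with the schema file
>> staged on HDFS (a sketch; the avsc path here is illustrative):
>>
>> CREATE EXTERNAL TABLE ed_avro_1
>> STORED AS AVRO
>> LOCATION '/tmp/sample/yyyymmdd=20200206'
>> TBLPROPERTIES ('avro.schema.url'='hdfs:///tmp/schemas/ed.avsc');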
>>
>> However, when trying to load the same schema from a REST API (as described
>> in
>> https://cwiki.apache.org/confluence/display/Hive/AvroSerDe#AvroSerDe-CreatingAvro-backedHivetables),
>> Hive throws an exception and fails to create the table.
>>
>> *Sample table:*
>> CREATE EXTERNAL TABLE ed_avro_1
>> STORED AS AVRO
>> LOCATION '/tmp/sample/yyyymmdd=20200206'
>> TBLPROPERTIES ('avro.schema.url'='http://<api_server>:9091/schema?name=ed&store=parquet&isMutated=true&table=ed&secbypass=testing');
>>
>> Launching Hive in INFO mode produces the trace below, which suggests that
>> Hive is interpreting the URL as a file name and throwing a
>> FileNotFoundException.
>>
>> I have also tried avro.schema.literal instead of avro.schema.url, but in
>> that case Hive interprets the URL as a plain string and throws a Jackson
>> parsing error.
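>>
>> As far as I understand, avro.schema.literal is expected to hold the Avro
>> schema JSON itself rather than a URL, which would explain the Jackson
>> error: Hive hands the property value straight to the Avro schema parser,
>> and a bare URL is not valid JSON. A minimal sketch (with a made-up
>> two-column schema):
>>
>> CREATE EXTERNAL TABLE ed_avro_literal
>> STORED AS AVRO
>> LOCATION '/tmp/sample/yyyymmdd=20200206'
>> TBLPROPERTIES ('avro.schema.literal'='{
>>   "type": "record",
>>   "name": "ed",
>>   "fields": [
>>     {"name": "id", "type": "long"},
>>     {"name": "name", "type": "string"}
>>   ]
>> }');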
>>
>> Can anyone help look into this? Is this a bug in Hive 3.1.2? Any details
>> would be of great help.
>>
>> Thanks,
>> Ravi
>>
>>
>> Stack trace:
>>
>>> 2020-06-26T00:06:03,283 INFO [main] org.apache.hadoop.hive.conf.HiveConf - Using the default value passed in for log id: 646da35b-84b0-43aa-9b68-5d668ebbfc36
>>> 2020-06-26T00:06:03,283 INFO [main] org.apache.hadoop.hive.ql.session.SessionState - Updating thread name to 646da35b-84b0-43aa-9b68-5d668ebbfc36 main
>>> 2020-06-26T00:06:03,286 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] ql.Driver: Compiling command(queryId=hdfs_20200626000603_0992e79f-6e1c-4383-be62-a6466c4c1cf2): CREATE EXTERNAL TABLE ed_avro_1
>>> STORED AS AVRO
>>> LOCATION '/tmp/event_detail/yyyymmdd=20200206'
>>> TBLPROPERTIES ('avro.schema.url'='http://<api_server>:9091/schema?name=ed&store=parquet&isMutated=true&table=ed&secbypass=testing')
>>> 2020-06-26T00:06:03,630 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] ql.Driver: Concurrency mode is disabled, not creating a lock manager
>>> 2020-06-26T00:06:03,638 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] parse.CalcitePlanner: Starting Semantic Analysis
>>> 2020-06-26T00:06:03,669 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] sqlstd.SQLStdHiveAccessController: Created SQLStdHiveAccessController for session context : HiveAuthzSessionContext [sessionString=646da35b-84b0-43aa-9b68-5d668ebbfc36, clientType=HIVECLI]
>>> 2020-06-26T00:06:03,673 WARN [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] org.apache.hadoop.hive.ql.session.SessionState - METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory.
>>> 2020-06-26T00:06:03,673 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] metastore.HiveMetaStoreClient: Mestastore configuration metastore.filter.hook changed from org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl to org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook
>>> 2020-06-26T00:06:03,675 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] metastore.HiveMetaStore: 0: Cleaning up thread local RawStore...
>>> 2020-06-26T00:06:03,675 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] HiveMetaStore.audit: ugi=hdfs ip=unknown-ip-addr cmd=Cleaning up thread local RawStore...
>>> 2020-06-26T00:06:03,676 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] metastore.HiveMetaStore: 0: Done cleaning up thread local RawStore
>>> 2020-06-26T00:06:03,676 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] HiveMetaStore.audit: ugi=hdfs ip=unknown-ip-addr cmd=Done cleaning up thread local RawStore
>>> 2020-06-26T00:06:03,680 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] metastore.HiveMetaStore: 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
>>> 2020-06-26T00:06:03,680 WARN [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] metastore.ObjectStore: datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
>>> 2020-06-26T00:06:03,681 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] metastore.ObjectStore: ObjectStore, initialize called
>>> 2020-06-26T00:06:03,691 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] metastore.MetaStoreDirectSql: Using direct SQL, underlying DB is MYSQL
>>> 2020-06-26T00:06:03,691 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] metastore.ObjectStore: Initialized ObjectStore
>>> 2020-06-26T00:06:03,692 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] metastore.RetryingMetaStoreClient: RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=hdfs (auth:SIMPLE) retries=1 delay=1 lifetime=0
>>> 2020-06-26T00:06:03,704 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] parse.CalcitePlanner: Creating table scratch.ed_avro_1 position=22
>>> 2020-06-26T00:06:03,719 WARN [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] metastore.ObjectStore: datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
>>> 2020-06-26T00:06:03,719 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] metastore.ObjectStore: ObjectStore, initialize called
>>> 2020-06-26T00:06:03,730 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] metastore.MetaStoreDirectSql: Using direct SQL, underlying DB is MYSQL
>>> 2020-06-26T00:06:03,730 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] metastore.ObjectStore: Initialized ObjectStore
>>> 2020-06-26T00:06:03,731 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] metastore.RetryingMetaStoreClient: RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=hdfs (auth:SIMPLE) retries=1 delay=1 lifetime=0
>>> 2020-06-26T00:06:03,731 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] metastore.HiveMetaStore: 0: get_database: @hive#scratch
>>> 2020-06-26T00:06:03,731 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] HiveMetaStore.audit: ugi=hdfs ip=unknown-ip-addr cmd=get_database: @hive#scratch
>>> 2020-06-26T00:06:03,754 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] ql.Driver: Semantic Analysis Completed (retrial = false)
>>> 2020-06-26T00:06:03,762 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] ql.Driver: Returning Hive schema: Schema(fieldSchemas:null, properties:null)
>>> 2020-06-26T00:06:03,763 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] ql.Driver: Completed compiling command(queryId=hdfs_20200626000603_0992e79f-6e1c-4383-be62-a6466c4c1cf2); Time taken: 0.477 seconds
>>> 2020-06-26T00:06:03,763 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] reexec.ReExecDriver: Execution #1 of query
>>> 2020-06-26T00:06:03,763 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] ql.Driver: Concurrency mode is disabled, not creating a lock manager
>>> 2020-06-26T00:06:03,763 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] ql.Driver: Executing command(queryId=hdfs_20200626000603_0992e79f-6e1c-4383-be62-a6466c4c1cf2): CREATE EXTERNAL TABLE ed_avro_1
>>> STORED AS AVRO
>>> LOCATION '/tmp/event_detail/yyyymmdd=20200206'
>>> TBLPROPERTIES ('avro.schema.url'='http://<api_server>:9091/schema?name=ed&store=parquet&isMutated=true&table=ed&secbypass=testing')
>>> 2020-06-26T00:06:03,765 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] ql.Driver: Starting task [Stage-0:DDL] in serial mode
>>> 2020-06-26T00:06:03,765 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] metastore.HiveMetaStoreClient: Mestastore configuration metastore.filter.hook changed from org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook to org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl
>>> 2020-06-26T00:06:03,765 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] metastore.HiveMetaStore: 0: Cleaning up thread local RawStore...
>>> 2020-06-26T00:06:03,765 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] HiveMetaStore.audit: ugi=hdfs ip=unknown-ip-addr cmd=Cleaning up thread local RawStore...
>>> 2020-06-26T00:06:03,765 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] metastore.HiveMetaStore: 0: Done cleaning up thread local RawStore
>>> 2020-06-26T00:06:03,766 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] HiveMetaStore.audit: ugi=hdfs ip=unknown-ip-addr cmd=Done cleaning up thread local RawStore
>>> 2020-06-26T00:06:03,790 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] avro.AvroSerDe: AvroSerde::initialize(): Preset value of avro.schema.literal == null
>>> 2020-06-26T00:06:03,809 WARN [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] avro.AvroSerDe: Encountered AvroSerdeException determining schema. Returning signal schema to indicate problem
>>> org.apache.hadoop.hive.serde2.avro.AvroSerdeException: Unable to read schema from given path: http://<api_server>:9091/schema?name=ed&store=parquet&isMutated=true&table=ed&secbypass=testing
>>>     at org.apache.hadoop.hive.serde2.avro.AvroSerdeUtils.determineSchemaOrThrowException(AvroSerdeUtils.java:146) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.serde2.avro.AvroSerDe.determineSchemaOrReturnErrorSchema(AvroSerDe.java:197) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.serde2.avro.AvroSerDe.initialize(AvroSerDe.java:111) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.serde2.avro.AvroSerDe.initialize(AvroSerDe.java:84) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.serde2.SerDeUtils.initializeSerDe(SerDeUtils.java:540) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.metastore.HiveMetaStoreUtils.getDeserializer(HiveMetaStoreUtils.java:90) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.metastore.HiveMetaStoreUtils.getDeserializer(HiveMetaStoreUtils.java:77) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:289) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:271) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.metadata.Table.getColsInternal(Table.java:663) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.metadata.Table.getCols(Table.java:646) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:898) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:937) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4954) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:428) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:205) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2664) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2335) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2011) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1709) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1703) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:218) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239) ~[hive-cli-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:188) ~[hive-cli-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:402) ~[hive-cli-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821) ~[hive-cli-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759) ~[hive-cli-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683) ~[hive-cli-3.1.2.jar:3.1.2]
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_252]
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_252]
>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_252]
>>>     at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_252]
>>>     at org.apache.hadoop.util.RunJar.run(RunJar.java:318) ~[hadoop-common-3.1.2.jar:?]
>>>     at org.apache.hadoop.util.RunJar.main(RunJar.java:232) ~[hadoop-common-3.1.2.jar:?]
>>> Caused by: java.io.FileNotFoundException: http://<api_server>:9091/schema?name=ed&store=parquet&isMutated=true&table=ed&secbypass=testing
>>>     at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1896) ~[?:1.8.0_252]
>>>     at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1498) ~[?:1.8.0_252]
>>>     at org.apache.hadoop.fs.http.AbstractHttpFileSystem.open(AbstractHttpFileSystem.java:61) ~[hadoop-common-3.1.2.jar:?]
>>>     at org.apache.hadoop.fs.http.HttpFileSystem.open(HttpFileSystem.java:23) ~[hadoop-common-3.1.2.jar:?]
>>>     at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:899) ~[hadoop-common-3.1.2.jar:?]
>>>     at org.apache.hadoop.hive.serde2.avro.AvroSerdeUtils.getSchemaFromFS(AvroSerdeUtils.java:169) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.serde2.avro.AvroSerdeUtils.determineSchemaOrThrowException(AvroSerdeUtils.java:139) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     ... 35 more
>>> 2020-06-26T00:06:03,871 ERROR [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] hive.log: error in initSerDe: org.apache.hadoop.hive.serde2.SerDeException Encountered AvroSerdeException determining schema. Returning signal schema to indicate problem: Unable to read schema from given path: http://<api_server>:9091/schema?name=ed&store=parquet&isMutated=true&table=ed&secbypass=testing
>>> org.apache.hadoop.hive.serde2.SerDeException: Encountered AvroSerdeException determining schema. Returning signal schema to indicate problem: Unable to read schema from given path: http://<api_server>:9091/schema?name=ed&store=parquet&isMutated=true&table=ed&secbypass=testing
>>>     at org.apache.hadoop.hive.serde2.SerDeUtils.initializeSerDe(SerDeUtils.java:543) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.metastore.HiveMetaStoreUtils.getDeserializer(HiveMetaStoreUtils.java:90) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.metastore.HiveMetaStoreUtils.getDeserializer(HiveMetaStoreUtils.java:77) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:289) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:271) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.metadata.Table.getColsInternal(Table.java:663) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.metadata.Table.getCols(Table.java:646) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:898) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:937) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4954) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:428) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:205) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2664) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2335) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2011) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1709) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1703) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:218) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239) ~[hive-cli-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:188) ~[hive-cli-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:402) ~[hive-cli-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821) ~[hive-cli-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759) ~[hive-cli-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683) ~[hive-cli-3.1.2.jar:3.1.2]
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_252]
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_252]
>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_252]
>>>     at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_252]
>>>     at org.apache.hadoop.util.RunJar.run(RunJar.java:318) ~[hadoop-common-3.1.2.jar:?]
>>>     at org.apache.hadoop.util.RunJar.main(RunJar.java:232) ~[hadoop-common-3.1.2.jar:?]
>>> 2020-06-26T00:06:03,874 ERROR [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] metadata.Table: Unable to get field from serde: org.apache.hadoop.hive.serde2.avro.AvroSerDe
>>> java.lang.RuntimeException: MetaException(message:org.apache.hadoop.hive.serde2.SerDeException Encountered AvroSerdeException determining schema. Returning signal schema to indicate problem: Unable to read schema from given path: http://<api_server>:9091/schema?name=ed&store=parquet&isMutated=true&table=ed&secbypass=testing)
>>>     at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:291) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:271) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.metadata.Table.getColsInternal(Table.java:663) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.metadata.Table.getCols(Table.java:646) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:898) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:937) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4954) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:428) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:205) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2664) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2335) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2011) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1709) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1703) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:218) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239) ~[hive-cli-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:188) ~[hive-cli-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:402) ~[hive-cli-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821) ~[hive-cli-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759) ~[hive-cli-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683) ~[hive-cli-3.1.2.jar:3.1.2]
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_252]
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_252]
>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_252]
>>>     at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_252]
>>>     at org.apache.hadoop.util.RunJar.run(RunJar.java:318) ~[hadoop-common-3.1.2.jar:?]
>>>     at org.apache.hadoop.util.RunJar.main(RunJar.java:232) ~[hadoop-common-3.1.2.jar:?]
>>> Caused by: org.apache.hadoop.hive.metastore.api.MetaException: org.apache.hadoop.hive.serde2.SerDeException Encountered AvroSerdeException determining schema. Returning signal schema to indicate problem: Unable to read schema from given path: http://<api_server>:9091/schema?name=ed&store=parquet&isMutated=true&table=ed&secbypass=testing
>>>     at org.apache.hadoop.hive.metastore.HiveMetaStoreUtils.getDeserializer(HiveMetaStoreUtils.java:98) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.metastore.HiveMetaStoreUtils.getDeserializer(HiveMetaStoreUtils.java:77) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:289) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     ... 28 more
>>> 2020-06-26T00:06:03,874 INFO [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] avro.AvroSerDe: AvroSerde::initialize(): Preset value of avro.schema.literal == null
>>> 2020-06-26T00:06:03,878 WARN [646da35b-84b0-43aa-9b68-5d668ebbfc36 main] avro.AvroSerDe: Encountered AvroSerdeException determining schema. Returning signal schema to indicate problem
>>> org.apache.hadoop.hive.serde2.avro.AvroSerdeException: Unable to read schema from given path: http://<api_server>:9091/schema?name=ed&store=parquet&isMutated=true&table=ed&secbypass=testing
>>>     at org.apache.hadoop.hive.serde2.avro.AvroSerdeUtils.determineSchemaOrThrowException(AvroSerdeUtils.java:146) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.serde2.avro.AvroSerDe.determineSchemaOrReturnErrorSchema(AvroSerDe.java:197) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.serde2.avro.AvroSerDe.initialize(AvroSerDe.java:111) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.serde2.avro.AvroSerDe.initialize(AvroSerDe.java:84) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.serde2.SerDeUtils.initializeSerDe(SerDeUtils.java:540) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.metastore.HiveMetaStoreUtils.getDeserializer(HiveMetaStoreUtils.java:90) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.metastore.HiveMetaStoreUtils.getDeserializer(HiveMetaStoreUtils.java:77) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:289) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:271) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:900) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:937) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4954) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:428) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:205) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2664) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2335) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2011) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1709) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1703) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:218) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239) ~[hive-cli-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:188) ~[hive-cli-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:402) ~[hive-cli-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821) ~[hive-cli-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759) ~[hive-cli-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683) ~[hive-cli-3.1.2.jar:3.1.2]
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_252]
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_252]
>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_252]
>>>     at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_252]
>>>     at org.apache.hadoop.util.RunJar.run(RunJar.java:318) ~[hadoop-common-3.1.2.jar:?]
>>>     at org.apache.hadoop.util.RunJar.main(RunJar.java:232) ~[hadoop-common-3.1.2.jar:?]
>>> Caused by: java.io.FileNotFoundException: http://<api_server>:9091/schema?name=ed&store=parquet&isMutated=true&table=ed&secbypass=testing
>>>     at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1896) ~[?:1.8.0_252]
>>>     at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1498) ~[?:1.8.0_252]
>>>     at org.apache.hadoop.fs.http.AbstractHttpFileSystem.open(AbstractHttpFileSystem.java:61) ~[hadoop-common-3.1.2.jar:?]
>>>     at org.apache.hadoop.fs.http.HttpFileSystem.open(HttpFileSystem.java:23) ~[hadoop-common-3.1.2.jar:?]
>>>     at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:899) ~[hadoop-common-3.1.2.jar:?]
>>>     at org.apache.hadoop.hive.serde2.avro.AvroSerdeUtils.getSchemaFromFS(AvroSerdeUtils.java:169) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     at org.apache.hadoop.hive.serde2.avro.AvroSerdeUtils.determineSchemaOrThrowException(AvroSerdeUtils.java:139) ~[hive-exec-3.1.2.jar:3.1.2]
>>>     ... 33 more
>>
>> Thanks,
>> Ravi
>>
>
