[ https://issues.apache.org/jira/browse/ATLAS-346?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hemanth Yamijala updated ATLAS-346:
-----------------------------------
    Attachment: hive_server2_import_bug_snip.txt

Attached relevant portion of failure in application.log.

It appears that, on startup, the hook consumer (which is responsible for reading
from Kafka) comes up and starts processing messages *before* the Atlas API
service is bound. Since the hook consumer uses that API to deliver messages to
Atlas, each delivery fails with a ConnectionRefused exception.
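The ordering fix this implies can be sketched as a bounded wait: the consumer probes the API endpoint and only begins draining Kafka once the probe succeeds. This is only an illustrative sketch; the names below (waitForApi, the health-check supplier) are hypothetical and not actual Atlas code.

```java
import java.util.function.BooleanSupplier;

public class HookStartup {

    // Polls the supplied health check until it succeeds or attempts run out.
    // A real implementation would probe the Atlas admin/version endpoint.
    static boolean waitForApi(BooleanSupplier isApiReachable,
                              int maxAttempts, long backoffMillis)
            throws InterruptedException {
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            if (isApiReachable.getAsBoolean()) {
                return true;             // API is bound; safe to start consuming
            }
            Thread.sleep(backoffMillis); // back off before the next probe
        }
        return false;                    // give up; caller decides how to fail
    }

    public static void main(String[] args) throws InterruptedException {
        // Simulate an API service that becomes reachable on the third probe.
        final int[] probes = {0};
        BooleanSupplier fakeApi = () -> ++probes[0] >= 3;

        boolean up = waitForApi(fakeApi, 10, 10);
        System.out.println("api up: " + up + " after " + probes[0] + " probes");
    }
}
```

With a gate like this in front of the consumer loop, a restart after an unclean shutdown would leave the pending ATLAS_HOOK messages in Kafka until the API is actually able to accept them.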

> Atlas server loses messages sent from Hive hook if restarted after unclean 
> shutdown
> -----------------------------------------------------------------------------------
>
>                 Key: ATLAS-346
>                 URL: https://issues.apache.org/jira/browse/ATLAS-346
>             Project: Atlas
>          Issue Type: Bug
>            Reporter: Hemanth Yamijala
>            Priority: Critical
>         Attachments: hive_server2_import_bug_snip.txt
>
>
> * Start Atlas server pointed to an external Kafka instance
> * Configure HiveServer2 with Atlas hook pointing to the same Kafka instance.
> * Run a hive DDL script like 
> {code}
> for i in `seq 1 25`; do
>   ./bin/beeline -u jdbc:hive2://localhost:10000 -n ${user} -p ${pass} \
>     -e "create table tbl${i} (col${i}1 int, col${i}2 string);"
> done
> {code}
> * While the script is executing, kill -9 the Atlas server
> * Let the script complete.
> * Verify that all the events are added to the ATLAS_HOOK topic.
> * Verify if the tables are added to Atlas.
> The observation is that the Kafka topic has all relevant messages, but the 
> tables aren't added to Atlas.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
