[ https://issues.apache.org/jira/browse/ATLAS-1121?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15422817#comment-15422817 ]
Ayub Khan edited comment on ATLAS-1121 at 8/16/16 2:41 PM: ----------------------------------------------------------- Submitting the patch to handle the NPE related to jackson-databind. [~madhan.neethiraj], [~shwethags], [~suma.shivaprasad], please review the patch. I have tested it: the topology submission is successful and the entities are created in Atlas. The NPE below is still logged, but the topology metadata submission to Atlas succeeds. {noformat} [root@vimal-erie-6-1 storm-topology]# storm jar hcube-storm-topology-0.0.1.jar org.hw.hcube.storm.LogStreamingToplogyV2 phoenix.properties Running: /usr/jdk64/jdk1.8.0_60/bin/java -server -Ddaemon.name= -Dstorm.options= -Dstorm.home=/grid/0/hdp/2.5.0.0-1181/storm -Dstorm.log.dir=/var/log/storm -Djava.library.path=/usr/local/lib:/opt/local/lib:/usr/lib -Dstorm.conf.file= -cp /grid/0/hdp/2.5.0.0-1181/storm/lib/log4j-over-slf4j-1.6.6.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/zookeeper.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/servlet-api-2.5.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/storm-rename-hack-1.0.1.2.5.0.0-1181.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/reflectasm-1.10.1.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/ambari-metrics-storm-sink.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/objenesis-2.1.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/clojure-1.7.0.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/log4j-slf4j-impl-2.1.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/disruptor-3.3.2.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/log4j-core-2.1.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/minlog-1.3.0.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/slf4j-api-1.7.7.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/kryo-3.0.3.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/asm-5.0.3.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/ring-cors-0.1.5.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/storm-core-1.0.1.2.5.0.0-1181.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/log4j-api-2.1.jar:/grid/0/hdp/2.5.0.0-1181/storm/extlib/storm-bridge-shim-0.7.0.2.5.0.0-1181.jar:/grid/0/hdp/2.5.0.0-1181/storm/extlib/atlas-plugin-
classloader-0.7.0.2.5.0.0-1181.jar org.apache.storm.daemon.ClientJarTransformerRunner org.apache.storm.hack.StormShadeTransformer hcube-storm-topology-0.0.1.jar /tmp/6a601f5663ba11e698cffa163ef6032d.jar Running: /usr/jdk64/jdk1.8.0_60/bin/java -client -Ddaemon.name= -Dstorm.options= -Dstorm.home=/grid/0/hdp/2.5.0.0-1181/storm -Dstorm.log.dir=/var/log/storm -Djava.library.path=/usr/local/lib:/opt/local/lib:/usr/lib:/usr/hdp/current/storm-client/lib -Dstorm.conf.file= -cp /grid/0/hdp/2.5.0.0-1181/storm/lib/log4j-over-slf4j-1.6.6.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/zookeeper.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/servlet-api-2.5.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/storm-rename-hack-1.0.1.2.5.0.0-1181.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/reflectasm-1.10.1.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/ambari-metrics-storm-sink.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/objenesis-2.1.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/clojure-1.7.0.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/log4j-slf4j-impl-2.1.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/disruptor-3.3.2.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/log4j-core-2.1.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/minlog-1.3.0.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/slf4j-api-1.7.7.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/kryo-3.0.3.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/asm-5.0.3.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/ring-cors-0.1.5.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/storm-core-1.0.1.2.5.0.0-1181.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/log4j-api-2.1.jar:/grid/0/hdp/2.5.0.0-1181/storm/extlib/storm-bridge-shim-0.7.0.2.5.0.0-1181.jar:/grid/0/hdp/2.5.0.0-1181/storm/extlib/atlas-plugin-classloader-0.7.0.2.5.0.0-1181.jar:/tmp/6a601f5663ba11e698cffa163ef6032d.jar:/usr/hdp/current/storm-supervisor/conf:/grid/0/hdp/2.5.0.0-1181/storm/bin -Dstorm.jar=/tmp/6a601f5663ba11e698cffa163ef6032d.jar org.hw.hcube.storm.LogStreamingToplogyV2 phoenix.properties 2339 [main] INFO o.a.s.StormSubmitter - Generated ZooKeeper secret payload for MD5-digest: 
-7910590019420719599:-5664763422779979058 2456 [main] INFO o.a.s.s.a.AuthUtils - Got AutoCreds [] 2543 [main] INFO o.a.s.StormSubmitter - Uploading topology jar /tmp/6a601f5663ba11e698cffa163ef6032d.jar to assigned location: /grid/0/hadoop/storm/nimbus/inbox/stormjar-36b9f10a-bd32-4368-bed8-75ae19104fd9.jar Start uploading file '/tmp/6a601f5663ba11e698cffa163ef6032d.jar' to '/grid/0/hadoop/storm/nimbus/inbox/stormjar-36b9f10a-bd32-4368-bed8-75ae19104fd9.jar' (161473843 bytes) [==================================================] 161473843 / 161473843 File '/tmp/6a601f5663ba11e698cffa163ef6032d.jar' uploaded to '/grid/0/hadoop/storm/nimbus/inbox/stormjar-36b9f10a-bd32-4368-bed8-75ae19104fd9.jar' (161473843 bytes) 4418 [main] INFO o.a.s.StormSubmitter - Successfully uploaded topology jar to assigned location: /grid/0/hadoop/storm/nimbus/inbox/stormjar-36b9f10a-bd32-4368-bed8-75ae19104fd9.jar 4419 [main] INFO o.a.s.StormSubmitter - Submitting topology hcube-log-streaming-phoenix in distributed mode with conf {"topology.workers":10,"storm.zookeeper.topology.auth.scheme":"digest","storm.zookeeper.topology.auth.payload":"-7910590019420719599:-5664763422779979058"} 8342 [main] INFO o.a.s.StormSubmitter - Finished submitting topology: hcube-log-streaming-phoenix 8349 [main] INFO o.a.s.StormSubmitter - Initializing the registered ISubmitterHook [org.apache.atlas.storm.hook.StormAtlasHook] 8924 [main] INFO o.a.a.ApplicationProperties - Looking for atlas-application.properties in classpath 8924 [main] INFO o.a.a.ApplicationProperties - Loading atlas-application.properties from file:/etc/storm/2.5.0.0-1181/0/atlas-application.properties log4j:WARN No appenders could be found for logger (org.apache.atlas.ApplicationProperties). log4j:WARN Please initialize the log4j system properly. log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info. 
10171 [main] INFO o.a.a.h.AtlasHook - Created Atlas Hook 10522 [main] INFO o.a.s.StormSubmitter - Invoking the registered ISubmitterHook [org.apache.atlas.storm.hook.StormAtlasHook] 10523 [main] INFO o.a.a.s.h.StormAtlasHook - Collecting metadata for a new storm topology: hcube-log-streaming-phoenix 10634 [main] ERROR o.a.a.s.h.StormAtlasHook - storm topology exception java.lang.NullPointerException at java.text.DateFormat.hashCode(DateFormat.java:739) ~[?:1.8.0_60] at java.util.HashMap.hash(HashMap.java:338) ~[?:1.8.0_60] at java.util.HashMap.put(HashMap.java:611) ~[?:1.8.0_60] at java.util.HashSet.add(HashSet.java:219) ~[?:1.8.0_60] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:141) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:198) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:198) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:198) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:198) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:198) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:198) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:198) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormAtlasHook.createDataSet(StormAtlasHook.java:187) 
[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormAtlasHook.addTopologyInputs(StormAtlasHook.java:149) [storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormAtlasHook.addTopologyDataSets(StormAtlasHook.java:132) [storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormAtlasHook.notify(StormAtlasHook.java:89) [storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormAtlasHook.notify(StormAtlasHook.java:57) [storm-bridge-shim-0.7.0.2.5.0.0-1181.jar:0.7.0.2.5.0.0-1181] at org.apache.storm.StormSubmitter.invokeSubmitterHook(StormSubmitter.java:285) [storm-core-1.0.1.2.5.0.0-1181.jar:1.0.1.2.5.0.0-1181] at org.apache.storm.StormSubmitter.submitTopologyAs(StormSubmitter.java:258) [storm-core-1.0.1.2.5.0.0-1181.jar:1.0.1.2.5.0.0-1181] at org.apache.storm.StormSubmitter.submitTopology(StormSubmitter.java:311) [storm-core-1.0.1.2.5.0.0-1181.jar:1.0.1.2.5.0.0-1181] at org.apache.storm.StormSubmitter.submitTopologyWithProgressBar(StormSubmitter.java:347) [storm-core-1.0.1.2.5.0.0-1181.jar:1.0.1.2.5.0.0-1181] at org.apache.storm.StormSubmitter.submitTopologyWithProgressBar(StormSubmitter.java:328) [storm-core-1.0.1.2.5.0.0-1181.jar:1.0.1.2.5.0.0-1181] at org.hw.hcube.storm.LogStreamingToplogyV2.main(LogStreamingToplogyV2.java:164) [6a601f5663ba11e698cffa163ef6032d.jar:?] 
10709 [main] ERROR o.a.a.s.h.StormAtlasHook - storm topology exception java.lang.NullPointerException at java.text.DateFormat.hashCode(DateFormat.java:739) ~[?:1.8.0_60] at java.util.HashMap.hash(HashMap.java:338) ~[?:1.8.0_60] at java.util.HashMap.put(HashMap.java:611) ~[?:1.8.0_60] at java.util.HashSet.add(HashSet.java:219) ~[?:1.8.0_60] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:141) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:198) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:198) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:198) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:198) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:198) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:198) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:198) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormAtlasHook.createSpoutInstance(StormAtlasHook.java:310) [storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormAtlasHook.addSpouts(StormAtlasHook.java:294) [storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormAtlasHook.createTopologyGraph(StormAtlasHook.java:280) 
[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormAtlasHook.notify(StormAtlasHook.java:95) [storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormAtlasHook.notify(StormAtlasHook.java:57) [storm-bridge-shim-0.7.0.2.5.0.0-1181.jar:0.7.0.2.5.0.0-1181] at org.apache.storm.StormSubmitter.invokeSubmitterHook(StormSubmitter.java:285) [storm-core-1.0.1.2.5.0.0-1181.jar:1.0.1.2.5.0.0-1181] at org.apache.storm.StormSubmitter.submitTopologyAs(StormSubmitter.java:258) [storm-core-1.0.1.2.5.0.0-1181.jar:1.0.1.2.5.0.0-1181] at org.apache.storm.StormSubmitter.submitTopology(StormSubmitter.java:311) [storm-core-1.0.1.2.5.0.0-1181.jar:1.0.1.2.5.0.0-1181] at org.apache.storm.StormSubmitter.submitTopologyWithProgressBar(StormSubmitter.java:347) [storm-core-1.0.1.2.5.0.0-1181.jar:1.0.1.2.5.0.0-1181] at org.apache.storm.StormSubmitter.submitTopologyWithProgressBar(StormSubmitter.java:328) [storm-core-1.0.1.2.5.0.0-1181.jar:1.0.1.2.5.0.0-1181] at org.hw.hcube.storm.LogStreamingToplogyV2.main(LogStreamingToplogyV2.java:164) [6a601f5663ba11e698cffa163ef6032d.jar:?] 
10718 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: org.apache.storm.jdbc.bolt.JdbcInsertBolt@7a5b769b 10718 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: org.apache.storm.jdbc.mapper.SimpleJdbcMapper@f4c0e4e 10718 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='host', val=null, sqlType=12} 10718 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='type', val=null, sqlType=12} 10718 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='seq_num', val=null, sqlType=-5} 10718 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='id', val=null, sqlType=12} 10719 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='cluster', val=null, sqlType=12} 10719 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='event_count', val=null, sqlType=-5} 10719 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='message_md5', val=null, sqlType=12} 10719 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='event_md5', val=null, sqlType=12} 10723 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='ip', val=null, sqlType=12} 10725 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='path', val=null, sqlType=12} 10725 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='file', val=null, sqlType=12} 10725 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='method', val=null, sqlType=12} 10725 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='logger_name', val=null, sqlType=12} 10725 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='logtime', val=null, sqlType=93} 10725 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: 
Column{columnName='line_number', val=null, sqlType=-5} 10732 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='logfile_line_number', val=null, sqlType=-5} 10732 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='level', val=null, sqlType=12} 10732 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='log_message', val=null, sqlType=12} 10732 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: org.apache.storm.jdbc.common.HikariCPConnectionProvider@24361cfc 10737 [main] INFO o.a.a.h.AtlasHook - Adding entity for type: storm_topology 23946 [main] INFO o.a.k.c.p.ProducerConfig - ProducerConfig values: metric.reporters = [] metadata.max.age.ms = 300000 reconnect.backoff.ms = 50 sasl.kerberos.ticket.renew.window.factor = 0.8 bootstrap.servers = [vimal-erie-6-1.openstacklocal:6667] ssl.keystore.type = JKS sasl.mechanism = GSSAPI max.block.ms = 60000 interceptor.classes = null ssl.truststore.password = null client.id = ssl.endpoint.identification.algorithm = null request.timeout.ms = 30000 acks = 1 receive.buffer.bytes = 32768 ssl.truststore.type = JKS retries = 0 ssl.truststore.location = null ssl.keystore.password = null send.buffer.bytes = 131072 compression.type = none metadata.fetch.timeout.ms = 60000 retry.backoff.ms = 100 sasl.kerberos.kinit.cmd = /usr/bin/kinit buffer.memory = 33554432 timeout.ms = 30000 key.serializer = class org.apache.kafka.common.serialization.StringSerializer sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 ssl.trustmanager.algorithm = PKIX block.on.buffer.full = false ssl.key.password = null sasl.kerberos.min.time.before.relogin = 60000 connections.max.idle.ms = 540000 max.in.flight.requests.per.connection = 5 metrics.num.samples = 2 ssl.protocol = TLS ssl.provider = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] batch.size = 16384 ssl.keystore.location = null ssl.cipher.suites = null security.protocol = 
PLAINTEXT max.request.size = 1048576 value.serializer = class org.apache.kafka.common.serialization.StringSerializer ssl.keymanager.algorithm = SunX509 metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner linger.ms = 0 24084 [main] INFO o.a.k.c.p.ProducerConfig - ProducerConfig values: metric.reporters = [] metadata.max.age.ms = 300000 reconnect.backoff.ms = 50 sasl.kerberos.ticket.renew.window.factor = 0.8 bootstrap.servers = [vimal-erie-6-1.openstacklocal:6667] ssl.keystore.type = JKS sasl.mechanism = GSSAPI max.block.ms = 60000 interceptor.classes = null ssl.truststore.password = null client.id = producer-1 ssl.endpoint.identification.algorithm = null request.timeout.ms = 30000 acks = 1 receive.buffer.bytes = 32768 ssl.truststore.type = JKS retries = 0 ssl.truststore.location = null ssl.keystore.password = null send.buffer.bytes = 131072 compression.type = none metadata.fetch.timeout.ms = 60000 retry.backoff.ms = 100 sasl.kerberos.kinit.cmd = /usr/bin/kinit buffer.memory = 33554432 timeout.ms = 30000 key.serializer = class org.apache.kafka.common.serialization.StringSerializer sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 ssl.trustmanager.algorithm = PKIX block.on.buffer.full = false ssl.key.password = null sasl.kerberos.min.time.before.relogin = 60000 connections.max.idle.ms = 540000 max.in.flight.requests.per.connection = 5 metrics.num.samples = 2 ssl.protocol = TLS ssl.provider = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] batch.size = 16384 ssl.keystore.location = null ssl.cipher.suites = null security.protocol = PLAINTEXT max.request.size = 1048576 value.serializer = class org.apache.kafka.common.serialization.StringSerializer ssl.keymanager.algorithm = SunX509 metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner linger.ms = 0 24089 [main] WARN o.a.k.c.p.ProducerConfig - The 
configuration key.deserializer = org.apache.kafka.common.serialization.StringDeserializer was supplied but isn't a known config. 24090 [main] WARN o.a.k.c.p.ProducerConfig - The configuration value.deserializer = org.apache.kafka.common.serialization.StringDeserializer was supplied but isn't a known config. 24090 [main] WARN o.a.k.c.p.ProducerConfig - The configuration hook.group.id = atlas was supplied but isn't a known config. 24090 [main] WARN o.a.k.c.p.ProducerConfig - The configuration partition.assignment.strategy = roundrobin was supplied but isn't a known config. 24090 [main] WARN o.a.k.c.p.ProducerConfig - The configuration zookeeper.connection.timeout.ms = 200 was supplied but isn't a known config. 24091 [main] WARN o.a.k.c.p.ProducerConfig - The configuration zookeeper.session.timeout.ms = 400 was supplied but isn't a known config. 24091 [main] WARN o.a.k.c.p.ProducerConfig - The configuration zookeeper.connect = vimal-erie-6-1.openstacklocal:2181 was supplied but isn't a known config. 24091 [main] WARN o.a.k.c.p.ProducerConfig - The configuration zookeeper.sync.time.ms = 20 was supplied but isn't a known config. 24091 [main] WARN o.a.k.c.p.ProducerConfig - The configuration auto.offset.reset = smallest was supplied but isn't a known config. 24093 [main] INFO o.a.k.c.u.AppInfoParser - Kafka version : 0.10.0.2.5.0.0-1181 24093 [main] INFO o.a.k.c.u.AppInfoParser - Kafka commitId : 022ed507ec080025 {noformat} Attaching Atlas UI snapshot showing the entities created as part of storm topology submission. was (Author: ayubkhan): Submitting the patch to handle NPE with respect to jackson-databind. [~madhan.neethiraj], [~shwethags], [~suma.shivaprasad] please review the patch I have tested the patch and the topology submission is successful and the entities are created over Atlas. You will still see the error but the topology metadata submission to Atlas succeeds.. 
{noformat} [root@vimal-erie-6-1 storm-topology]# storm jar hcube-storm-topology-0.0.1.jar org.hw.hcube.storm.LogStreamingToplogyV2 phoenix.properties Running: /usr/jdk64/jdk1.8.0_60/bin/java -server -Ddaemon.name= -Dstorm.options= -Dstorm.home=/grid/0/hdp/2.5.0.0-1181/storm -Dstorm.log.dir=/var/log/storm -Djava.library.path=/usr/local/lib:/opt/local/lib:/usr/lib -Dstorm.conf.file= -cp /grid/0/hdp/2.5.0.0-1181/storm/lib/log4j-over-slf4j-1.6.6.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/zookeeper.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/servlet-api-2.5.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/storm-rename-hack-1.0.1.2.5.0.0-1181.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/reflectasm-1.10.1.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/ambari-metrics-storm-sink.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/objenesis-2.1.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/clojure-1.7.0.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/log4j-slf4j-impl-2.1.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/disruptor-3.3.2.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/log4j-core-2.1.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/minlog-1.3.0.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/slf4j-api-1.7.7.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/kryo-3.0.3.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/asm-5.0.3.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/ring-cors-0.1.5.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/storm-core-1.0.1.2.5.0.0-1181.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/log4j-api-2.1.jar:/grid/0/hdp/2.5.0.0-1181/storm/extlib/storm-bridge-shim-0.7.0.2.5.0.0-1181.jar:/grid/0/hdp/2.5.0.0-1181/storm/extlib/atlas-plugin-classloader-0.7.0.2.5.0.0-1181.jar org.apache.storm.daemon.ClientJarTransformerRunner org.apache.storm.hack.StormShadeTransformer hcube-storm-topology-0.0.1.jar /tmp/6a601f5663ba11e698cffa163ef6032d.jar Running: /usr/jdk64/jdk1.8.0_60/bin/java -client -Ddaemon.name= -Dstorm.options= -Dstorm.home=/grid/0/hdp/2.5.0.0-1181/storm -Dstorm.log.dir=/var/log/storm -Djava.library.path=/usr/local/lib:/opt/local/lib:/usr/lib:/usr/hdp/current/storm-client/lib 
-Dstorm.conf.file= -cp /grid/0/hdp/2.5.0.0-1181/storm/lib/log4j-over-slf4j-1.6.6.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/zookeeper.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/servlet-api-2.5.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/storm-rename-hack-1.0.1.2.5.0.0-1181.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/reflectasm-1.10.1.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/ambari-metrics-storm-sink.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/objenesis-2.1.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/clojure-1.7.0.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/log4j-slf4j-impl-2.1.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/disruptor-3.3.2.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/log4j-core-2.1.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/minlog-1.3.0.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/slf4j-api-1.7.7.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/kryo-3.0.3.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/asm-5.0.3.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/ring-cors-0.1.5.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/storm-core-1.0.1.2.5.0.0-1181.jar:/grid/0/hdp/2.5.0.0-1181/storm/lib/log4j-api-2.1.jar:/grid/0/hdp/2.5.0.0-1181/storm/extlib/storm-bridge-shim-0.7.0.2.5.0.0-1181.jar:/grid/0/hdp/2.5.0.0-1181/storm/extlib/atlas-plugin-classloader-0.7.0.2.5.0.0-1181.jar:/tmp/6a601f5663ba11e698cffa163ef6032d.jar:/usr/hdp/current/storm-supervisor/conf:/grid/0/hdp/2.5.0.0-1181/storm/bin -Dstorm.jar=/tmp/6a601f5663ba11e698cffa163ef6032d.jar org.hw.hcube.storm.LogStreamingToplogyV2 phoenix.properties 2339 [main] INFO o.a.s.StormSubmitter - Generated ZooKeeper secret payload for MD5-digest: -7910590019420719599:-5664763422779979058 2456 [main] INFO o.a.s.s.a.AuthUtils - Got AutoCreds [] 2543 [main] INFO o.a.s.StormSubmitter - Uploading topology jar /tmp/6a601f5663ba11e698cffa163ef6032d.jar to assigned location: /grid/0/hadoop/storm/nimbus/inbox/stormjar-36b9f10a-bd32-4368-bed8-75ae19104fd9.jar Start uploading file '/tmp/6a601f5663ba11e698cffa163ef6032d.jar' to '/grid/0/hadoop/storm/nimbus/inbox/stormjar-36b9f10a-bd32-4368-bed8-75ae19104fd9.jar' 
(161473843 bytes) [==================================================] 161473843 / 161473843 File '/tmp/6a601f5663ba11e698cffa163ef6032d.jar' uploaded to '/grid/0/hadoop/storm/nimbus/inbox/stormjar-36b9f10a-bd32-4368-bed8-75ae19104fd9.jar' (161473843 bytes) 4418 [main] INFO o.a.s.StormSubmitter - Successfully uploaded topology jar to assigned location: /grid/0/hadoop/storm/nimbus/inbox/stormjar-36b9f10a-bd32-4368-bed8-75ae19104fd9.jar 4419 [main] INFO o.a.s.StormSubmitter - Submitting topology hcube-log-streaming-phoenix in distributed mode with conf {"topology.workers":10,"storm.zookeeper.topology.auth.scheme":"digest","storm.zookeeper.topology.auth.payload":"-7910590019420719599:-5664763422779979058"} 8342 [main] INFO o.a.s.StormSubmitter - Finished submitting topology: hcube-log-streaming-phoenix 8349 [main] INFO o.a.s.StormSubmitter - Initializing the registered ISubmitterHook [org.apache.atlas.storm.hook.StormAtlasHook] 8924 [main] INFO o.a.a.ApplicationProperties - Looking for atlas-application.properties in classpath 8924 [main] INFO o.a.a.ApplicationProperties - Loading atlas-application.properties from file:/etc/storm/2.5.0.0-1181/0/atlas-application.properties log4j:WARN No appenders could be found for logger (org.apache.atlas.ApplicationProperties). log4j:WARN Please initialize the log4j system properly. log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info. 
10171 [main] INFO o.a.a.h.AtlasHook - Created Atlas Hook 10522 [main] INFO o.a.s.StormSubmitter - Invoking the registered ISubmitterHook [org.apache.atlas.storm.hook.StormAtlasHook] 10523 [main] INFO o.a.a.s.h.StormAtlasHook - Collecting metadata for a new storm topology: hcube-log-streaming-phoenix 10634 [main] ERROR o.a.a.s.h.StormAtlasHook - storm topology exception java.lang.NullPointerException at java.text.DateFormat.hashCode(DateFormat.java:739) ~[?:1.8.0_60] at java.util.HashMap.hash(HashMap.java:338) ~[?:1.8.0_60] at java.util.HashMap.put(HashMap.java:611) ~[?:1.8.0_60] at java.util.HashSet.add(HashSet.java:219) ~[?:1.8.0_60] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:141) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:198) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:198) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:198) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:198) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:198) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:198) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:198) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormAtlasHook.createDataSet(StormAtlasHook.java:187) 
[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormAtlasHook.addTopologyInputs(StormAtlasHook.java:149) [storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormAtlasHook.addTopologyDataSets(StormAtlasHook.java:132) [storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormAtlasHook.notify(StormAtlasHook.java:89) [storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormAtlasHook.notify(StormAtlasHook.java:57) [storm-bridge-shim-0.7.0.2.5.0.0-1181.jar:0.7.0.2.5.0.0-1181] at org.apache.storm.StormSubmitter.invokeSubmitterHook(StormSubmitter.java:285) [storm-core-1.0.1.2.5.0.0-1181.jar:1.0.1.2.5.0.0-1181] at org.apache.storm.StormSubmitter.submitTopologyAs(StormSubmitter.java:258) [storm-core-1.0.1.2.5.0.0-1181.jar:1.0.1.2.5.0.0-1181] at org.apache.storm.StormSubmitter.submitTopology(StormSubmitter.java:311) [storm-core-1.0.1.2.5.0.0-1181.jar:1.0.1.2.5.0.0-1181] at org.apache.storm.StormSubmitter.submitTopologyWithProgressBar(StormSubmitter.java:347) [storm-core-1.0.1.2.5.0.0-1181.jar:1.0.1.2.5.0.0-1181] at org.apache.storm.StormSubmitter.submitTopologyWithProgressBar(StormSubmitter.java:328) [storm-core-1.0.1.2.5.0.0-1181.jar:1.0.1.2.5.0.0-1181] at org.hw.hcube.storm.LogStreamingToplogyV2.main(LogStreamingToplogyV2.java:164) [6a601f5663ba11e698cffa163ef6032d.jar:?] 
10709 [main] ERROR o.a.a.s.h.StormAtlasHook - storm topology exception java.lang.NullPointerException at java.text.DateFormat.hashCode(DateFormat.java:739) ~[?:1.8.0_60] at java.util.HashMap.hash(HashMap.java:338) ~[?:1.8.0_60] at java.util.HashMap.put(HashMap.java:611) ~[?:1.8.0_60] at java.util.HashSet.add(HashSet.java:219) ~[?:1.8.0_60] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:141) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:198) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:198) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:198) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:198) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:198) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:198) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:198) ~[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormAtlasHook.createSpoutInstance(StormAtlasHook.java:310) [storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormAtlasHook.addSpouts(StormAtlasHook.java:294) [storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormAtlasHook.createTopologyGraph(StormAtlasHook.java:280) 
[storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormAtlasHook.notify(StormAtlasHook.java:95) [storm-bridge-0.7.0.2.5.0.0-1181.jar:0.8-incubating-SNAPSHOT] at org.apache.atlas.storm.hook.StormAtlasHook.notify(StormAtlasHook.java:57) [storm-bridge-shim-0.7.0.2.5.0.0-1181.jar:0.7.0.2.5.0.0-1181] at org.apache.storm.StormSubmitter.invokeSubmitterHook(StormSubmitter.java:285) [storm-core-1.0.1.2.5.0.0-1181.jar:1.0.1.2.5.0.0-1181] at org.apache.storm.StormSubmitter.submitTopologyAs(StormSubmitter.java:258) [storm-core-1.0.1.2.5.0.0-1181.jar:1.0.1.2.5.0.0-1181] at org.apache.storm.StormSubmitter.submitTopology(StormSubmitter.java:311) [storm-core-1.0.1.2.5.0.0-1181.jar:1.0.1.2.5.0.0-1181] at org.apache.storm.StormSubmitter.submitTopologyWithProgressBar(StormSubmitter.java:347) [storm-core-1.0.1.2.5.0.0-1181.jar:1.0.1.2.5.0.0-1181] at org.apache.storm.StormSubmitter.submitTopologyWithProgressBar(StormSubmitter.java:328) [storm-core-1.0.1.2.5.0.0-1181.jar:1.0.1.2.5.0.0-1181] at org.hw.hcube.storm.LogStreamingToplogyV2.main(LogStreamingToplogyV2.java:164) [6a601f5663ba11e698cffa163ef6032d.jar:?] 
10718 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: org.apache.storm.jdbc.bolt.JdbcInsertBolt@7a5b769b
10718 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: org.apache.storm.jdbc.mapper.SimpleJdbcMapper@f4c0e4e
10718 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='host', val=null, sqlType=12}
10718 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='type', val=null, sqlType=12}
10718 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='seq_num', val=null, sqlType=-5}
10718 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='id', val=null, sqlType=12}
10719 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='cluster', val=null, sqlType=12}
10719 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='event_count', val=null, sqlType=-5}
10719 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='message_md5', val=null, sqlType=12}
10719 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='event_md5', val=null, sqlType=12}
10723 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='ip', val=null, sqlType=12}
10725 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='path', val=null, sqlType=12}
10725 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='file', val=null, sqlType=12}
10725 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='method', val=null, sqlType=12}
10725 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='logger_name', val=null, sqlType=12}
10725 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='logtime', val=null, sqlType=93}
10725 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='line_number', val=null, sqlType=-5}
10732 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='logfile_line_number', val=null, sqlType=-5}
10732 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='level', val=null, sqlType=12}
10732 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: Column{columnName='log_message', val=null, sqlType=12}
10732 [main] INFO o.a.a.s.h.StormTopologyUtil - Processing instance: org.apache.storm.jdbc.common.HikariCPConnectionProvider@24361cfc
10737 [main] INFO o.a.a.h.AtlasHook - Adding entity for type: storm_topology
23946 [main] INFO o.a.k.c.p.ProducerConfig - ProducerConfig values:
	metric.reporters = []
	metadata.max.age.ms = 300000
	reconnect.backoff.ms = 50
	sasl.kerberos.ticket.renew.window.factor = 0.8
	bootstrap.servers = [vimal-erie-6-1.openstacklocal:6667]
	ssl.keystore.type = JKS
	sasl.mechanism = GSSAPI
	max.block.ms = 60000
	interceptor.classes = null
	ssl.truststore.password = null
	client.id =
	ssl.endpoint.identification.algorithm = null
	request.timeout.ms = 30000
	acks = 1
	receive.buffer.bytes = 32768
	ssl.truststore.type = JKS
	retries = 0
	ssl.truststore.location = null
	ssl.keystore.password = null
	send.buffer.bytes = 131072
	compression.type = none
	metadata.fetch.timeout.ms = 60000
	retry.backoff.ms = 100
	sasl.kerberos.kinit.cmd = /usr/bin/kinit
	buffer.memory = 33554432
	timeout.ms = 30000
	key.serializer = class org.apache.kafka.common.serialization.StringSerializer
	sasl.kerberos.service.name = null
	sasl.kerberos.ticket.renew.jitter = 0.05
	ssl.trustmanager.algorithm = PKIX
	block.on.buffer.full = false
	ssl.key.password = null
	sasl.kerberos.min.time.before.relogin = 60000
	connections.max.idle.ms = 540000
	max.in.flight.requests.per.connection = 5
	metrics.num.samples = 2
	ssl.protocol = TLS
	ssl.provider = null
	ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
	batch.size = 16384
	ssl.keystore.location = null
	ssl.cipher.suites = null
	security.protocol = PLAINTEXT
	max.request.size = 1048576
	value.serializer = class org.apache.kafka.common.serialization.StringSerializer
	ssl.keymanager.algorithm = SunX509
	metrics.sample.window.ms = 30000
	partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
	linger.ms = 0
24084 [main] INFO o.a.k.c.p.ProducerConfig - ProducerConfig values:
	metric.reporters = []
	metadata.max.age.ms = 300000
	reconnect.backoff.ms = 50
	sasl.kerberos.ticket.renew.window.factor = 0.8
	bootstrap.servers = [vimal-erie-6-1.openstacklocal:6667]
	ssl.keystore.type = JKS
	sasl.mechanism = GSSAPI
	max.block.ms = 60000
	interceptor.classes = null
	ssl.truststore.password = null
	client.id = producer-1
	ssl.endpoint.identification.algorithm = null
	request.timeout.ms = 30000
	acks = 1
	receive.buffer.bytes = 32768
	ssl.truststore.type = JKS
	retries = 0
	ssl.truststore.location = null
	ssl.keystore.password = null
	send.buffer.bytes = 131072
	compression.type = none
	metadata.fetch.timeout.ms = 60000
	retry.backoff.ms = 100
	sasl.kerberos.kinit.cmd = /usr/bin/kinit
	buffer.memory = 33554432
	timeout.ms = 30000
	key.serializer = class org.apache.kafka.common.serialization.StringSerializer
	sasl.kerberos.service.name = null
	sasl.kerberos.ticket.renew.jitter = 0.05
	ssl.trustmanager.algorithm = PKIX
	block.on.buffer.full = false
	ssl.key.password = null
	sasl.kerberos.min.time.before.relogin = 60000
	connections.max.idle.ms = 540000
	max.in.flight.requests.per.connection = 5
	metrics.num.samples = 2
	ssl.protocol = TLS
	ssl.provider = null
	ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
	batch.size = 16384
	ssl.keystore.location = null
	ssl.cipher.suites = null
	security.protocol = PLAINTEXT
	max.request.size = 1048576
	value.serializer = class org.apache.kafka.common.serialization.StringSerializer
	ssl.keymanager.algorithm = SunX509
	metrics.sample.window.ms = 30000
	partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
	linger.ms = 0
24089 [main] WARN o.a.k.c.p.ProducerConfig - The configuration key.deserializer = org.apache.kafka.common.serialization.StringDeserializer was supplied but isn't a known config.
24090 [main] WARN o.a.k.c.p.ProducerConfig - The configuration value.deserializer = org.apache.kafka.common.serialization.StringDeserializer was supplied but isn't a known config.
24090 [main] WARN o.a.k.c.p.ProducerConfig - The configuration hook.group.id = atlas was supplied but isn't a known config.
24090 [main] WARN o.a.k.c.p.ProducerConfig - The configuration partition.assignment.strategy = roundrobin was supplied but isn't a known config.
24090 [main] WARN o.a.k.c.p.ProducerConfig - The configuration zookeeper.connection.timeout.ms = 200 was supplied but isn't a known config.
24091 [main] WARN o.a.k.c.p.ProducerConfig - The configuration zookeeper.session.timeout.ms = 400 was supplied but isn't a known config.
24091 [main] WARN o.a.k.c.p.ProducerConfig - The configuration zookeeper.connect = vimal-erie-6-1.openstacklocal:2181 was supplied but isn't a known config.
24091 [main] WARN o.a.k.c.p.ProducerConfig - The configuration zookeeper.sync.time.ms = 20 was supplied but isn't a known config.
24091 [main] WARN o.a.k.c.p.ProducerConfig - The configuration auto.offset.reset = smallest was supplied but isn't a known config.
24093 [main] INFO o.a.k.c.u.AppInfoParser - Kafka version : 0.10.0.2.5.0.0-1181
24093 [main] INFO o.a.k.c.u.AppInfoParser - Kafka commitId : 022ed507ec080025
{noformat}
Attaching an Atlas UI snapshot showing the entities created as part of the storm topology submission. !Atlas 2016-08-16 19-48-59.png|thumbnail!
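For reference, the root cause visible in the trace ({{java.text.DateFormat.hashCode()}} dereferencing its internal {{numberFormat}} field while {{StormTopologyUtil.getFieldValues}} adds the object to its visited {{HashSet}}) can be reproduced in isolation. This is only an illustrative sketch, not the patch itself: the {{BareDateFormat}} class below is hypothetical and just mimics a reflectively instantiated {{DateFormat}} (as a serialization library such as jackson-databind can produce) whose {{numberFormat}} was never set.

```java
import java.text.DateFormat;
import java.text.FieldPosition;
import java.text.ParsePosition;
import java.util.Date;
import java.util.HashSet;
import java.util.Set;

public class DateFormatNpeDemo {

    // Hypothetical stand-in for a DateFormat created reflectively by a
    // serialization library: the protected numberFormat field stays null,
    // and DateFormat.hashCode() simply returns numberFormat.hashCode().
    static class BareDateFormat extends DateFormat {
        @Override
        public StringBuffer format(Date date, StringBuffer toAppendTo, FieldPosition pos) {
            return toAppendTo;
        }

        @Override
        public Date parse(String source, ParsePosition pos) {
            return null;
        }
    }

    public static void main(String[] args) {
        // getFieldValues tracks visited objects in a HashSet; HashSet.add ->
        // HashMap.put -> hashCode(), matching the frames in the trace above.
        Set<Object> visited = new HashSet<>();
        try {
            visited.add(new BareDateFormat());
            System.out.println("no NPE");
        } catch (NullPointerException e) {
            // Guarding this bookkeeping (catching/skipping values that fail to
            // hash) lets the hook continue so the metadata still reaches Atlas.
            System.out.println("NPE from DateFormat.hashCode(), as in the trace");
        }
    }
}
```

This is why the error can be logged yet the topology metadata submission still succeed: the exception comes from hashing one field value, not from the Atlas notification itself.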
> NPE while submitting topology in StormHook
> ------------------------------------------
>
>                 Key: ATLAS-1121
>                 URL: https://issues.apache.org/jira/browse/ATLAS-1121
>             Project: Atlas
>          Issue Type: Bug
>    Affects Versions: trunk
>            Reporter: Ayub Khan
>            Assignee: Ayub Khan
>
> NPE while submitting topology in StormHook
> {code}
> [storm@ctr-e25-1471039652053-0001-01-000009 erie]$ storm jar hcube-storm-topology-0.0.1.jar org.hw.hcube.storm.LogStreamingToplogyV2 phoenix.properties
> Running: /usr/jdk64/jdk1.8.0_60/bin/java -server -Ddaemon.name= -Dstorm.options= -Dstorm.home=/usr/hdp/2.5.0.0-1201/storm -Dstorm.log.dir=/var/log/storm -Djava.library.path=/usr/local/lib:/opt/local/lib:/usr/lib -Dstorm.conf.file= -cp /usr/hdp/2.5.0.0-1201/storm/lib/log4j-slf4j-impl-2.1.jar:/usr/hdp/2.5.0.0-1201/storm/lib/clojure-1.7.0.jar:/usr/hdp/2.5.0.0-1201/storm/lib/servlet-api-2.5.jar:/usr/hdp/2.5.0.0-1201/storm/lib/log4j-api-2.1.jar:/usr/hdp/2.5.0.0-1201/storm/lib/objenesis-2.1.jar:/usr/hdp/2.5.0.0-1201/storm/lib/minlog-1.3.0.jar:/usr/hdp/2.5.0.0-1201/storm/lib/storm-core-1.0.1.2.5.0.0-1201.jar:/usr/hdp/2.5.0.0-1201/storm/lib/log4j-over-slf4j-1.6.6.jar:/usr/hdp/2.5.0.0-1201/storm/lib/disruptor-3.3.2.jar:/usr/hdp/2.5.0.0-1201/storm/lib/zookeeper.jar:/usr/hdp/2.5.0.0-1201/storm/lib/slf4j-api-1.7.7.jar:/usr/hdp/2.5.0.0-1201/storm/lib/asm-5.0.3.jar:/usr/hdp/2.5.0.0-1201/storm/lib/reflectasm-1.10.1.jar:/usr/hdp/2.5.0.0-1201/storm/lib/log4j-core-2.1.jar:/usr/hdp/2.5.0.0-1201/storm/lib/ring-cors-0.1.5.jar:/usr/hdp/2.5.0.0-1201/storm/lib/kryo-3.0.3.jar:/usr/hdp/2.5.0.0-1201/storm/lib/storm-rename-hack-1.0.1.2.5.0.0-1201.jar:/usr/hdp/2.5.0.0-1201/storm/lib/ambari-metrics-storm-sink.jar:/usr/hdp/2.5.0.0-1201/storm/extlib/storm-bridge-shim-0.7.0.2.5.0.0-1201.jar:/usr/hdp/2.5.0.0-1201/storm/extlib/atlas-plugin-classloader-0.7.0.2.5.0.0-1201.jar org.apache.storm.daemon.ClientJarTransformerRunner org.apache.storm.hack.StormShadeTransformer hcube-storm-topology-0.0.1.jar /tmp/9e206b70636a11e685530242ac1b1bc0.jar
> Running: /usr/jdk64/jdk1.8.0_60/bin/java -client -Ddaemon.name= -Dstorm.options= -Dstorm.home=/usr/hdp/2.5.0.0-1201/storm -Dstorm.log.dir=/var/log/storm -Djava.library.path=/usr/local/lib:/opt/local/lib:/usr/lib:/usr/hdp/current/storm-client/lib -Dstorm.conf.file= -cp /usr/hdp/2.5.0.0-1201/storm/lib/log4j-slf4j-impl-2.1.jar:/usr/hdp/2.5.0.0-1201/storm/lib/clojure-1.7.0.jar:/usr/hdp/2.5.0.0-1201/storm/lib/servlet-api-2.5.jar:/usr/hdp/2.5.0.0-1201/storm/lib/log4j-api-2.1.jar:/usr/hdp/2.5.0.0-1201/storm/lib/objenesis-2.1.jar:/usr/hdp/2.5.0.0-1201/storm/lib/minlog-1.3.0.jar:/usr/hdp/2.5.0.0-1201/storm/lib/storm-core-1.0.1.2.5.0.0-1201.jar:/usr/hdp/2.5.0.0-1201/storm/lib/log4j-over-slf4j-1.6.6.jar:/usr/hdp/2.5.0.0-1201/storm/lib/disruptor-3.3.2.jar:/usr/hdp/2.5.0.0-1201/storm/lib/zookeeper.jar:/usr/hdp/2.5.0.0-1201/storm/lib/slf4j-api-1.7.7.jar:/usr/hdp/2.5.0.0-1201/storm/lib/asm-5.0.3.jar:/usr/hdp/2.5.0.0-1201/storm/lib/reflectasm-1.10.1.jar:/usr/hdp/2.5.0.0-1201/storm/lib/log4j-core-2.1.jar:/usr/hdp/2.5.0.0-1201/storm/lib/ring-cors-0.1.5.jar:/usr/hdp/2.5.0.0-1201/storm/lib/kryo-3.0.3.jar:/usr/hdp/2.5.0.0-1201/storm/lib/storm-rename-hack-1.0.1.2.5.0.0-1201.jar:/usr/hdp/2.5.0.0-1201/storm/lib/ambari-metrics-storm-sink.jar:/usr/hdp/2.5.0.0-1201/storm/extlib/storm-bridge-shim-0.7.0.2.5.0.0-1201.jar:/usr/hdp/2.5.0.0-1201/storm/extlib/atlas-plugin-classloader-0.7.0.2.5.0.0-1201.jar:/tmp/9e206b70636a11e685530242ac1b1bc0.jar:/usr/hdp/current/storm-supervisor/conf:/usr/hdp/2.5.0.0-1201/storm/bin -Dstorm.jar=/tmp/9e206b70636a11e685530242ac1b1bc0.jar org.hw.hcube.storm.LogStreamingToplogyV2 phoenix.properties
> 1269 [main] INFO o.a.s.StormSubmitter - Generated ZooKeeper secret payload for MD5-digest: -5580857394466738431:-5449170806113196196
> 1320 [main] INFO o.a.s.s.a.AuthUtils - Got AutoCreds []
> 1533 [main] INFO o.a.s.m.n.Login - successfully logged in.
> 1682 [main] INFO o.a.s.m.n.Login - successfully logged in.
> 1710 [main] INFO o.a.s.m.n.Login - successfully logged in.
> 1738 [main] INFO o.a.s.m.n.Login - successfully logged in.
> 1765 [main] INFO o.a.s.StormSubmitter - Uploading topology jar /tmp/9e206b70636a11e685530242ac1b1bc0.jar to assigned location: /hadoop/storm/nimbus/inbox/stormjar-2d8edcfa-000b-41d3-a170-9982b45867c0.jar
> Start uploading file '/tmp/9e206b70636a11e685530242ac1b1bc0.jar' to '/hadoop/storm/nimbus/inbox/stormjar-2d8edcfa-000b-41d3-a170-9982b45867c0.jar' (161437864 bytes)
> [==================================================] 161437864 / 161437864
> File '/tmp/9e206b70636a11e685530242ac1b1bc0.jar' uploaded to '/hadoop/storm/nimbus/inbox/stormjar-2d8edcfa-000b-41d3-a170-9982b45867c0.jar' (161437864 bytes)
> 5154 [main] INFO o.a.s.StormSubmitter - Successfully uploaded topology jar to assigned location: /hadoop/storm/nimbus/inbox/stormjar-2d8edcfa-000b-41d3-a170-9982b45867c0.jar
> 5155 [main] INFO o.a.s.StormSubmitter - Submitting topology hcube-log-streaming-phoenix in distributed mode with conf {"topology.workers":10,"storm.zookeeper.topology.auth.scheme":"digest","storm.zookeeper.topology.auth.payload":"-5580857394466738431:-5449170806113196196"}
> 5164 [main] INFO o.a.s.m.n.Login - successfully logged in.
> 5194 [main] INFO o.a.s.m.n.Login - successfully logged in.
> 6269 [main] INFO o.a.s.StormSubmitter - Finished submitting topology: hcube-log-streaming-phoenix
> 6274 [main] INFO o.a.s.StormSubmitter - Initializing the registered ISubmitterHook [org.apache.atlas.storm.hook.StormAtlasHook]
> 6366 [main] INFO o.a.a.ApplicationProperties - Looking for atlas-application.properties in classpath
> 6366 [main] INFO o.a.a.ApplicationProperties - Loading atlas-application.properties from file:/etc/storm/2.5.0.0-1201/0/atlas-application.properties
> log4j:WARN No appenders could be found for logger (org.apache.atlas.ApplicationProperties).
> log4j:WARN Please initialize the log4j system properly.
> log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> 6736 [main] INFO o.a.a.h.AtlasHook - Created Atlas Hook
> 6741 [main] INFO o.a.s.m.n.Login - successfully logged in.
> 6771 [main] INFO o.a.s.m.n.Login - successfully logged in.
> 6828 [main] INFO o.a.s.StormSubmitter - Invoking the registered ISubmitterHook [org.apache.atlas.storm.hook.StormAtlasHook]
> 6829 [main] INFO o.a.a.s.h.StormAtlasHook - Collecting metadata for a new storm topology: hcube-log-streaming-phoenix
> 6859 [main] WARN o.a.s.StormSubmitter - Error occurred in invoking submitter hook:[org.apache.atlas.storm.hook.StormAtlasHook]
> java.lang.RuntimeException: Atlas hook is unable to process the topology.
> 	at org.apache.atlas.storm.hook.StormAtlasHook.notify(StormAtlasHook.java:105) ~[storm-bridge-shim-0.7.0.2.5.0.0-1201.jar:0.7.0.2.5.0.0-1201]
> 	at org.apache.atlas.storm.hook.StormAtlasHook.notify(StormAtlasHook.java:57) ~[storm-bridge-shim-0.7.0.2.5.0.0-1201.jar:0.7.0.2.5.0.0-1201]
> 	at org.apache.storm.StormSubmitter.invokeSubmitterHook(StormSubmitter.java:285) [storm-core-1.0.1.2.5.0.0-1201.jar:1.0.1.2.5.0.0-1201]
> 	at org.apache.storm.StormSubmitter.submitTopologyAs(StormSubmitter.java:258) [storm-core-1.0.1.2.5.0.0-1201.jar:1.0.1.2.5.0.0-1201]
> 	at org.apache.storm.StormSubmitter.submitTopology(StormSubmitter.java:311) [storm-core-1.0.1.2.5.0.0-1201.jar:1.0.1.2.5.0.0-1201]
> 	at org.apache.storm.StormSubmitter.submitTopologyWithProgressBar(StormSubmitter.java:347) [storm-core-1.0.1.2.5.0.0-1201.jar:1.0.1.2.5.0.0-1201]
> 	at org.apache.storm.StormSubmitter.submitTopologyWithProgressBar(StormSubmitter.java:328) [storm-core-1.0.1.2.5.0.0-1201.jar:1.0.1.2.5.0.0-1201]
> 	at org.hw.hcube.storm.LogStreamingToplogyV2.main(LogStreamingToplogyV2.java:164) [9e206b70636a11e685530242ac1b1bc0.jar:?]
> Caused by: java.lang.NullPointerException
> 	at java.text.DateFormat.hashCode(DateFormat.java:739) ~[?:1.8.0_60]
> 	at java.util.HashMap.hash(HashMap.java:338) ~[?:1.8.0_60]
> 	at java.util.HashMap.put(HashMap.java:611) ~[?:1.8.0_60]
> 	at java.util.HashSet.add(HashSet.java:219) ~[?:1.8.0_60]
> 	at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:139) ~[?:?]
> 	at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:197) ~[?:?]
> 	at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:197) ~[?:?]
> 	at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:197) ~[?:?]
> 	at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:197) ~[?:?]
> 	at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:197) ~[?:?]
> 	at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:197) ~[?:?]
> 	at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:197) ~[?:?]
> 	at org.apache.atlas.storm.hook.StormAtlasHook.createDataSet(StormAtlasHook.java:182) ~[storm-bridge-shim-0.7.0.2.5.0.0-1201.jar:0.7.0.2.5.0.0-1201]
> 	at org.apache.atlas.storm.hook.StormAtlasHook.addTopologyInputs(StormAtlasHook.java:149) ~[storm-bridge-shim-0.7.0.2.5.0.0-1201.jar:0.7.0.2.5.0.0-1201]
> 	at org.apache.atlas.storm.hook.StormAtlasHook.addTopologyDataSets(StormAtlasHook.java:132) ~[storm-bridge-shim-0.7.0.2.5.0.0-1201.jar:0.7.0.2.5.0.0-1201]
> 	at org.apache.atlas.storm.hook.StormAtlasHook.notify(StormAtlasHook.java:89) ~[storm-bridge-shim-0.7.0.2.5.0.0-1201.jar:0.7.0.2.5.0.0-1201]
> 	... 7 more
> Exception in thread "main" org.apache.storm.hooks.SubmitterHookException: java.lang.RuntimeException: Atlas hook is unable to process the topology.
> 	at org.apache.storm.StormSubmitter.invokeSubmitterHook(StormSubmitter.java:289)
> 	at org.apache.storm.StormSubmitter.submitTopologyAs(StormSubmitter.java:258)
> 	at org.apache.storm.StormSubmitter.submitTopology(StormSubmitter.java:311)
> 	at org.apache.storm.StormSubmitter.submitTopologyWithProgressBar(StormSubmitter.java:347)
> 	at org.apache.storm.StormSubmitter.submitTopologyWithProgressBar(StormSubmitter.java:328)
> 	at org.hw.hcube.storm.LogStreamingToplogyV2.main(LogStreamingToplogyV2.java:164)
> Caused by: java.lang.RuntimeException: Atlas hook is unable to process the topology.
> 	at org.apache.atlas.storm.hook.StormAtlasHook.notify(StormAtlasHook.java:105)
> 	at org.apache.atlas.storm.hook.StormAtlasHook.notify(StormAtlasHook.java:57)
> 	at org.apache.storm.StormSubmitter.invokeSubmitterHook(StormSubmitter.java:285)
> 	... 5 more
> Caused by: java.lang.NullPointerException
> 	at java.text.DateFormat.hashCode(DateFormat.java:739)
> 	at java.util.HashMap.hash(HashMap.java:338)
> 	at java.util.HashMap.put(HashMap.java:611)
> 	at java.util.HashSet.add(HashSet.java:219)
> 	at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:139)
> 	at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:197)
> 	at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:197)
> 	at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:197)
> 	at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:197)
> 	at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:197)
> 	at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:197)
> 	at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:197)
> 	at org.apache.atlas.storm.hook.StormAtlasHook.createDataSet(StormAtlasHook.java:182)
> 	at org.apache.atlas.storm.hook.StormAtlasHook.addTopologyInputs(StormAtlasHook.java:149)
> 	at org.apache.atlas.storm.hook.StormAtlasHook.addTopologyDataSets(StormAtlasHook.java:132)
> 	at org.apache.atlas.storm.hook.StormAtlasHook.notify(StormAtlasHook.java:89)
> 	... 7 more
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)