[ https://issues.apache.org/jira/browse/ATLAS-1121?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Ayub Khan updated ATLAS-1121:
-----------------------------
    Attachment: ATLAS-1121-1.patch

> NPE while submitting topology in StormHook
> ------------------------------------------
>
>                 Key: ATLAS-1121
>                 URL: https://issues.apache.org/jira/browse/ATLAS-1121
>             Project: Atlas
>          Issue Type: Bug
>    Affects Versions: trunk
>            Reporter: Ayub Khan
>            Assignee: Ayub Khan
>         Attachments: ATLAS-1121-1.patch, ATLAS-1121.patch, Atlas 2016-08-16 19-48-59.png
>
>
> NPE while submitting topology in StormHook
> {code}
> [storm@ctr-e25-1471039652053-0001-01-000009 erie]$ storm jar hcube-storm-topology-0.0.1.jar org.hw.hcube.storm.LogStreamingToplogyV2 phoenix.properties
> Running: /usr/jdk64/jdk1.8.0_60/bin/java -server -Ddaemon.name= -Dstorm.options= -Dstorm.home=/usr/hdp/2.5.0.0-1201/storm -Dstorm.log.dir=/var/log/storm -Djava.library.path=/usr/local/lib:/opt/local/lib:/usr/lib -Dstorm.conf.file= -cp /usr/hdp/2.5.0.0-1201/storm/lib/log4j-slf4j-impl-2.1.jar:/usr/hdp/2.5.0.0-1201/storm/lib/clojure-1.7.0.jar:/usr/hdp/2.5.0.0-1201/storm/lib/servlet-api-2.5.jar:/usr/hdp/2.5.0.0-1201/storm/lib/log4j-api-2.1.jar:/usr/hdp/2.5.0.0-1201/storm/lib/objenesis-2.1.jar:/usr/hdp/2.5.0.0-1201/storm/lib/minlog-1.3.0.jar:/usr/hdp/2.5.0.0-1201/storm/lib/storm-core-1.0.1.2.5.0.0-1201.jar:/usr/hdp/2.5.0.0-1201/storm/lib/log4j-over-slf4j-1.6.6.jar:/usr/hdp/2.5.0.0-1201/storm/lib/disruptor-3.3.2.jar:/usr/hdp/2.5.0.0-1201/storm/lib/zookeeper.jar:/usr/hdp/2.5.0.0-1201/storm/lib/slf4j-api-1.7.7.jar:/usr/hdp/2.5.0.0-1201/storm/lib/asm-5.0.3.jar:/usr/hdp/2.5.0.0-1201/storm/lib/reflectasm-1.10.1.jar:/usr/hdp/2.5.0.0-1201/storm/lib/log4j-core-2.1.jar:/usr/hdp/2.5.0.0-1201/storm/lib/ring-cors-0.1.5.jar:/usr/hdp/2.5.0.0-1201/storm/lib/kryo-3.0.3.jar:/usr/hdp/2.5.0.0-1201/storm/lib/storm-rename-hack-1.0.1.2.5.0.0-1201.jar:/usr/hdp/2.5.0.0-1201/storm/lib/ambari-metrics-storm-sink.jar:/usr/hdp/2.5.0.0-1201/storm/extlib/storm-bridge-shim-0.7.0.2.5.0.0-1201.jar:/usr/hdp/2.5.0.0-1201/storm/extlib/atlas-plugin-classloader-0.7.0.2.5.0.0-1201.jar org.apache.storm.daemon.ClientJarTransformerRunner org.apache.storm.hack.StormShadeTransformer hcube-storm-topology-0.0.1.jar /tmp/9e206b70636a11e685530242ac1b1bc0.jar
> Running: /usr/jdk64/jdk1.8.0_60/bin/java -client -Ddaemon.name= -Dstorm.options= -Dstorm.home=/usr/hdp/2.5.0.0-1201/storm -Dstorm.log.dir=/var/log/storm -Djava.library.path=/usr/local/lib:/opt/local/lib:/usr/lib:/usr/hdp/current/storm-client/lib -Dstorm.conf.file= -cp /usr/hdp/2.5.0.0-1201/storm/lib/log4j-slf4j-impl-2.1.jar:/usr/hdp/2.5.0.0-1201/storm/lib/clojure-1.7.0.jar:/usr/hdp/2.5.0.0-1201/storm/lib/servlet-api-2.5.jar:/usr/hdp/2.5.0.0-1201/storm/lib/log4j-api-2.1.jar:/usr/hdp/2.5.0.0-1201/storm/lib/objenesis-2.1.jar:/usr/hdp/2.5.0.0-1201/storm/lib/minlog-1.3.0.jar:/usr/hdp/2.5.0.0-1201/storm/lib/storm-core-1.0.1.2.5.0.0-1201.jar:/usr/hdp/2.5.0.0-1201/storm/lib/log4j-over-slf4j-1.6.6.jar:/usr/hdp/2.5.0.0-1201/storm/lib/disruptor-3.3.2.jar:/usr/hdp/2.5.0.0-1201/storm/lib/zookeeper.jar:/usr/hdp/2.5.0.0-1201/storm/lib/slf4j-api-1.7.7.jar:/usr/hdp/2.5.0.0-1201/storm/lib/asm-5.0.3.jar:/usr/hdp/2.5.0.0-1201/storm/lib/reflectasm-1.10.1.jar:/usr/hdp/2.5.0.0-1201/storm/lib/log4j-core-2.1.jar:/usr/hdp/2.5.0.0-1201/storm/lib/ring-cors-0.1.5.jar:/usr/hdp/2.5.0.0-1201/storm/lib/kryo-3.0.3.jar:/usr/hdp/2.5.0.0-1201/storm/lib/storm-rename-hack-1.0.1.2.5.0.0-1201.jar:/usr/hdp/2.5.0.0-1201/storm/lib/ambari-metrics-storm-sink.jar:/usr/hdp/2.5.0.0-1201/storm/extlib/storm-bridge-shim-0.7.0.2.5.0.0-1201.jar:/usr/hdp/2.5.0.0-1201/storm/extlib/atlas-plugin-classloader-0.7.0.2.5.0.0-1201.jar:/tmp/9e206b70636a11e685530242ac1b1bc0.jar:/usr/hdp/current/storm-supervisor/conf:/usr/hdp/2.5.0.0-1201/storm/bin -Dstorm.jar=/tmp/9e206b70636a11e685530242ac1b1bc0.jar org.hw.hcube.storm.LogStreamingToplogyV2 phoenix.properties
> 1269 [main] INFO o.a.s.StormSubmitter - Generated ZooKeeper secret payload for MD5-digest: -5580857394466738431:-5449170806113196196
> 1320 [main] INFO o.a.s.s.a.AuthUtils - Got AutoCreds []
> 1533 [main] INFO o.a.s.m.n.Login - successfully logged in.
> 1682 [main] INFO o.a.s.m.n.Login - successfully logged in.
> 1710 [main] INFO o.a.s.m.n.Login - successfully logged in.
> 1738 [main] INFO o.a.s.m.n.Login - successfully logged in.
> 1765 [main] INFO o.a.s.StormSubmitter - Uploading topology jar /tmp/9e206b70636a11e685530242ac1b1bc0.jar to assigned location: /hadoop/storm/nimbus/inbox/stormjar-2d8edcfa-000b-41d3-a170-9982b45867c0.jar
> Start uploading file '/tmp/9e206b70636a11e685530242ac1b1bc0.jar' to '/hadoop/storm/nimbus/inbox/stormjar-2d8edcfa-000b-41d3-a170-9982b45867c0.jar' (161437864 bytes)
> [==================================================] 161437864 / 161437864
> File '/tmp/9e206b70636a11e685530242ac1b1bc0.jar' uploaded to '/hadoop/storm/nimbus/inbox/stormjar-2d8edcfa-000b-41d3-a170-9982b45867c0.jar' (161437864 bytes)
> 5154 [main] INFO o.a.s.StormSubmitter - Successfully uploaded topology jar to assigned location: /hadoop/storm/nimbus/inbox/stormjar-2d8edcfa-000b-41d3-a170-9982b45867c0.jar
> 5155 [main] INFO o.a.s.StormSubmitter - Submitting topology hcube-log-streaming-phoenix in distributed mode with conf {"topology.workers":10,"storm.zookeeper.topology.auth.scheme":"digest","storm.zookeeper.topology.auth.payload":"-5580857394466738431:-5449170806113196196"}
> 5164 [main] INFO o.a.s.m.n.Login - successfully logged in.
> 5194 [main] INFO o.a.s.m.n.Login - successfully logged in.
> 6269 [main] INFO o.a.s.StormSubmitter - Finished submitting topology: hcube-log-streaming-phoenix
> 6274 [main] INFO o.a.s.StormSubmitter - Initializing the registered ISubmitterHook [org.apache.atlas.storm.hook.StormAtlasHook]
> 6366 [main] INFO o.a.a.ApplicationProperties - Looking for atlas-application.properties in classpath
> 6366 [main] INFO o.a.a.ApplicationProperties - Loading atlas-application.properties from file:/etc/storm/2.5.0.0-1201/0/atlas-application.properties
> log4j:WARN No appenders could be found for logger (org.apache.atlas.ApplicationProperties).
> log4j:WARN Please initialize the log4j system properly.
> log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> 6736 [main] INFO o.a.a.h.AtlasHook - Created Atlas Hook
> 6741 [main] INFO o.a.s.m.n.Login - successfully logged in.
> 6771 [main] INFO o.a.s.m.n.Login - successfully logged in.
> 6828 [main] INFO o.a.s.StormSubmitter - Invoking the registered ISubmitterHook [org.apache.atlas.storm.hook.StormAtlasHook]
> 6829 [main] INFO o.a.a.s.h.StormAtlasHook - Collecting metadata for a new storm topology: hcube-log-streaming-phoenix
> 6859 [main] WARN o.a.s.StormSubmitter - Error occurred in invoking submitter hook:[org.apache.atlas.storm.hook.StormAtlasHook]
> java.lang.RuntimeException: Atlas hook is unable to process the topology.
>         at org.apache.atlas.storm.hook.StormAtlasHook.notify(StormAtlasHook.java:105) ~[storm-bridge-shim-0.7.0.2.5.0.0-1201.jar:0.7.0.2.5.0.0-1201]
>         at org.apache.atlas.storm.hook.StormAtlasHook.notify(StormAtlasHook.java:57) ~[storm-bridge-shim-0.7.0.2.5.0.0-1201.jar:0.7.0.2.5.0.0-1201]
>         at org.apache.storm.StormSubmitter.invokeSubmitterHook(StormSubmitter.java:285) [storm-core-1.0.1.2.5.0.0-1201.jar:1.0.1.2.5.0.0-1201]
>         at org.apache.storm.StormSubmitter.submitTopologyAs(StormSubmitter.java:258) [storm-core-1.0.1.2.5.0.0-1201.jar:1.0.1.2.5.0.0-1201]
>         at org.apache.storm.StormSubmitter.submitTopology(StormSubmitter.java:311) [storm-core-1.0.1.2.5.0.0-1201.jar:1.0.1.2.5.0.0-1201]
>         at org.apache.storm.StormSubmitter.submitTopologyWithProgressBar(StormSubmitter.java:347) [storm-core-1.0.1.2.5.0.0-1201.jar:1.0.1.2.5.0.0-1201]
>         at org.apache.storm.StormSubmitter.submitTopologyWithProgressBar(StormSubmitter.java:328) [storm-core-1.0.1.2.5.0.0-1201.jar:1.0.1.2.5.0.0-1201]
>         at org.hw.hcube.storm.LogStreamingToplogyV2.main(LogStreamingToplogyV2.java:164) [9e206b70636a11e685530242ac1b1bc0.jar:?]
> Caused by: java.lang.NullPointerException
>         at java.text.DateFormat.hashCode(DateFormat.java:739) ~[?:1.8.0_60]
>         at java.util.HashMap.hash(HashMap.java:338) ~[?:1.8.0_60]
>         at java.util.HashMap.put(HashMap.java:611) ~[?:1.8.0_60]
>         at java.util.HashSet.add(HashSet.java:219) ~[?:1.8.0_60]
>         at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:139) ~[?:?]
>         at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:197) ~[?:?]
>         at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:197) ~[?:?]
>         at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:197) ~[?:?]
>         at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:197) ~[?:?]
>         at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:197) ~[?:?]
>         at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:197) ~[?:?]
>         at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:197) ~[?:?]
>         at org.apache.atlas.storm.hook.StormAtlasHook.createDataSet(StormAtlasHook.java:182) ~[storm-bridge-shim-0.7.0.2.5.0.0-1201.jar:0.7.0.2.5.0.0-1201]
>         at org.apache.atlas.storm.hook.StormAtlasHook.addTopologyInputs(StormAtlasHook.java:149) ~[storm-bridge-shim-0.7.0.2.5.0.0-1201.jar:0.7.0.2.5.0.0-1201]
>         at org.apache.atlas.storm.hook.StormAtlasHook.addTopologyDataSets(StormAtlasHook.java:132) ~[storm-bridge-shim-0.7.0.2.5.0.0-1201.jar:0.7.0.2.5.0.0-1201]
>         at org.apache.atlas.storm.hook.StormAtlasHook.notify(StormAtlasHook.java:89) ~[storm-bridge-shim-0.7.0.2.5.0.0-1201.jar:0.7.0.2.5.0.0-1201]
>         ... 7 more
> Exception in thread "main" org.apache.storm.hooks.SubmitterHookException: java.lang.RuntimeException: Atlas hook is unable to process the topology.
>         at org.apache.storm.StormSubmitter.invokeSubmitterHook(StormSubmitter.java:289)
>         at org.apache.storm.StormSubmitter.submitTopologyAs(StormSubmitter.java:258)
>         at org.apache.storm.StormSubmitter.submitTopology(StormSubmitter.java:311)
>         at org.apache.storm.StormSubmitter.submitTopologyWithProgressBar(StormSubmitter.java:347)
>         at org.apache.storm.StormSubmitter.submitTopologyWithProgressBar(StormSubmitter.java:328)
>         at org.hw.hcube.storm.LogStreamingToplogyV2.main(LogStreamingToplogyV2.java:164)
> Caused by: java.lang.RuntimeException: Atlas hook is unable to process the topology.
>         at org.apache.atlas.storm.hook.StormAtlasHook.notify(StormAtlasHook.java:105)
>         at org.apache.atlas.storm.hook.StormAtlasHook.notify(StormAtlasHook.java:57)
>         at org.apache.storm.StormSubmitter.invokeSubmitterHook(StormSubmitter.java:285)
>         ... 5 more
> Caused by: java.lang.NullPointerException
>         at java.text.DateFormat.hashCode(DateFormat.java:739)
>         at java.util.HashMap.hash(HashMap.java:338)
>         at java.util.HashMap.put(HashMap.java:611)
>         at java.util.HashSet.add(HashSet.java:219)
>         at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:139)
>         at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:197)
>         at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:197)
>         at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:197)
>         at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:197)
>         at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:197)
>         at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:197)
>         at org.apache.atlas.storm.hook.StormTopologyUtil.getFieldValues(StormTopologyUtil.java:197)
>         at org.apache.atlas.storm.hook.StormAtlasHook.createDataSet(StormAtlasHook.java:182)
>         at org.apache.atlas.storm.hook.StormAtlasHook.addTopologyInputs(StormAtlasHook.java:149)
>         at org.apache.atlas.storm.hook.StormAtlasHook.addTopologyDataSets(StormAtlasHook.java:132)
>         at org.apache.atlas.storm.hook.StormAtlasHook.notify(StormAtlasHook.java:89)
>         ... 7 more
> {code}
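
The innermost frames show the root cause: StormTopologyUtil.getFieldValues() walks the topology object graph reflectively and records visited values in a plain HashSet, and java.text.DateFormat.hashCode() delegates to the instance's internal NumberFormat, so a DateFormat reached during the walk whose NumberFormat field is null throws the NPE and fails the whole submitter hook. A minimal sketch of one defensive option is below: track visited objects by reference identity so a misbehaving hashCode() cannot break the walk. This is only an illustration, the helper name is made up, and the attached ATLAS-1121-1.patch may take a different approach.

{code}
// Illustrative sketch only (not the attached patch): an identity-based visited set
// for the reflective walk, so adding a value with a broken hashCode() (such as a
// DateFormat whose internal NumberFormat is null) cannot throw.
import java.util.Collections;
import java.util.IdentityHashMap;
import java.util.Set;

public final class IdentityVisitedSet {   // hypothetical helper, for illustration

    private IdentityVisitedSet() { }

    /**
     * Returns a Set whose membership test uses reference identity
     * (System.identityHashCode / ==) rather than the element's own
     * hashCode()/equals(), so add() is safe for any object.
     */
    public static Set<Object> create() {
        return Collections.newSetFromMap(new IdentityHashMap<Object, Boolean>());
    }
}
{code}

If the set used to skip already-visited objects in StormTopologyUtil were created this way (or the add were wrapped in a try/catch that logs and skips the offending value), the hook would degrade gracefully instead of failing the topology submission.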