Hello, we are writing topologies that work against non-HA HDFS, but when we run
them on a Hadoop cluster with HA HDFS, we get the following error:
java.lang.IllegalArgumentException: java.net.UnknownHostException: sorm3-dev
    at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:411)
    at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:311)
    at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:176)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:688)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:629)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:159)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2761)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2795)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2777)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:386)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
    at org.apache.hadoop.hive.ql.io.orc.OrcRecordUpdater.<init>(OrcRecordUpdater.java:234)
    at org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat.getRecordUpdater(OrcOutputFormat.java:289)
    at org.apache.hive.hcatalog.streaming.AbstractRecordWriter.createRecordUpdater(AbstractRecordWriter.java:253)
    at org.apache.hive.hcatalog.streaming.AbstractRecordWriter.createRecordUpdaters(AbstractRecordWriter.java:245)
    at org.apache.hive.hcatalog.streaming.AbstractRecordWriter.newBatch(AbstractRecordWriter.java:189)
    at org.apache.hive.hcatalog.streaming.StrictJsonWriter.newBatch(StrictJsonWriter.java:41)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint$TransactionBatchImpl.<init>(HiveEndPoint.java:607)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint$TransactionBatchImpl.<init>(HiveEndPoint.java:555)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.fetchTransactionBatchImpl(HiveEndPoint.java:441)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.fetchTransactionBatch(HiveEndPoint.java:421)
    at ru.mts.sorm.storm.hive.common.HiveWriter.lambda$nextTxnBatch$5(HiveWriter.java:250)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.UnknownHostException: sorm3-dev
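
As far as we understand, the failure happens because sorm3-dev is the logical
HA nameservice name, which the HDFS client can only resolve when the
Configuration carries the HA properties from hdfs-site.xml. A rough sketch of
what those properties look like (the hostnames and ports below are
placeholders, not our real values):

    import org.apache.hadoop.conf.Configuration;

    public class HaConfSketch {
        public static Configuration haConf() {
            Configuration conf = new Configuration();
            // Logical nameservice and its NameNodes (hostnames are placeholders)
            conf.set("dfs.nameservices", "sorm3-dev");
            conf.set("dfs.ha.namenodes.sorm3-dev", "nn1,nn2");
            conf.set("dfs.namenode.rpc-address.sorm3-dev.nn1", "nn1.example.com:8020");
            conf.set("dfs.namenode.rpc-address.sorm3-dev.nn2", "nn2.example.com:8020");
            // Proxy provider that lets the client fail over between the NameNodes
            conf.set("dfs.client.failover.proxy.provider.sorm3-dev",
                "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider");
            return conf;
        }
    }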

But if we put hdfs-site.xml and core-site.xml into the topology jar, it works
fine!
How can we read hdfs-site.xml and core-site.xml from the filesystem instead of
bundling them in the jar? Something like the sketch below is what we have in
mind.
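
A minimal sketch of what we are trying to do (assuming the config files live
under /etc/hadoop/conf on the worker nodes; adjust the path to wherever they
actually are):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class FsConfLoader {
        public static FileSystem openHaFs() throws java.io.IOException {
            Configuration conf = new Configuration();
            // Pick up the cluster configs from the local filesystem
            // instead of the topology jar's classpath
            conf.addResource(new Path("file:///etc/hadoop/conf/core-site.xml"));
            conf.addResource(new Path("file:///etc/hadoop/conf/hdfs-site.xml"));
            // A FileSystem built from this Configuration should be able
            // to resolve the HA nameservice
            return FileSystem.get(conf);
        }
    }

Is this the right approach, or is there a better way to get these files onto
the topology's configuration path?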

Storm version: 1.0.1

Best regards,
I. Shataev
