Hi wangning,

Your problem may be similar to this issue: https://issues.apache.org/jira/browse/KYLIN-4291.
Until the issue is fixed, you can work around it by reloading the metadata and then resuming the failed job (a rough sketch of scripting these two steps is included after the quoted trace below).

> On Jan 7, 2020, at 11:08, Wang Ning <[email protected]> wrote:
>
> I am using version 2.6.3. When I concurrently refresh multiple days of data for the same cube, the following exception frequently occurs during the Build Dimension Dictionary step. How can I resolve this problem?
>
> org.apache.kylin.engine.mr.exception.HadoopShellException:
> org.apache.kylin.common.persistence.WriteConflictException: Overwriting conflict
> /dict/DM_TRADE.CARGO_MATCH_CARGO_TRANS_DI/DISTANCE_TYPE/1b73e1ff-0ec2-6062-937f-9cbd216dbd9d.dict,
> expect old TS 1578364599349, but it is 1578364632561
> at org.apache.kylin.storage.hbase.HBaseResourceStore.updateTimestampImpl(HBaseResourceStore.java:372)
> at org.apache.kylin.common.persistence.ResourceStore.lambda$updateTimestampWithRetry$4(ResourceStore.java:443)
> at org.apache.kylin.common.persistence.ExponentialBackoffRetry.doWithRetry(ExponentialBackoffRetry.java:52)
> at org.apache.kylin.common.persistence.ResourceStore.updateTimestampWithRetry(ResourceStore.java:442)
> at org.apache.kylin.common.persistence.ResourceStore.updateTimestampCheckPoint(ResourceStore.java:437)
> at org.apache.kylin.common.persistence.ResourceStore.updateTimestamp(ResourceStore.java:432)
> at org.apache.kylin.dict.DictionaryManager.updateExistingDictLastModifiedTime(DictionaryManager.java:197)
> at org.apache.kylin.dict.DictionaryManager.trySaveNewDict(DictionaryManager.java:157)
> at org.apache.kylin.dict.DictionaryManager.saveDictionary(DictionaryManager.java:339)
> at org.apache.kylin.cube.CubeManager$DictionaryAssist.saveDictionary(CubeManager.java:1145)
> at org.apache.kylin.cube.CubeManager.saveDictionary(CubeManager.java:1107)
> at org.apache.kylin.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGeneratorCLI.java:100)
> at org.apache.kylin.cube.cli.DictionaryGeneratorCLI.processSegment(DictionaryGeneratorCLI.java:69)
> at org.apache.kylin.engine.mr.steps.CreateDictionaryJob.run(CreateDictionaryJob.java:73)
> at org.apache.kylin.engine.mr.MRUtil.runMRJob(MRUtil.java:93)
> at org.apache.kylin.engine.mr.common.HadoopShellExecutable.doWork(HadoopShellExecutable.java:63)
> at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:167)
> at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:71)
> at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:167)
> at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:114)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:748)
>
> result code:2
> at org.apache.kylin.engine.mr.common.HadoopShellExecutable.doWork(HadoopShellExecutable.java:73)
> at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:167)
> at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:71)
> at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:167)
> at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:114)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:748)
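In case it helps, here is a minimal sketch (Python, using the requests library) of how the workaround could be scripted against the Kylin REST API. The host, port, credentials and the cache-update path are assumptions about a typical 2.x deployment; the documented manual route is the "Reload Metadata" button on the System page, and failed jobs can also be resumed from the Monitor page.

# Hedged sketch: reload metadata, then resume the failed job, via Kylin's REST API.
# Assumes a Kylin 2.x instance at localhost:7070 with the default ADMIN/KYLIN account.
# The job-resume endpoint (PUT /kylin/api/jobs/{jobId}/resume) is part of the public
# REST API; the cache-update path below is an assumption meant to mirror the Web UI's
# "Reload Metadata" button and may differ in your version -- please verify it first.
import requests

KYLIN = "http://localhost:7070/kylin/api"   # assumption: adjust host/port as needed
AUTH = ("ADMIN", "KYLIN")                    # assumption: default credentials

def reload_metadata():
    # Ask the Kylin server to refresh its metadata cache (equivalent to the
    # "Reload Metadata" button on the System page). Path is an assumption.
    resp = requests.put(f"{KYLIN}/cache/all/all/update", auth=AUTH)
    resp.raise_for_status()

def resume_job(job_id):
    # Resume the job that errored out in the Build Dimension Dictionary step.
    resp = requests.put(f"{KYLIN}/jobs/{job_id}/resume", auth=AUTH)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    reload_metadata()
    resume_job("your-error-job-uuid")        # placeholder: the failed job's UUID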
