[ https://issues.apache.org/jira/browse/KYLIN-3894?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yanwen Lin updated KYLIN-3894:
------------------------------
    Description: 
Hi all,
 I am currently running the integration tests, but I hit the error below. Could you please share some suggestions?
 Both {{mvn install}} (with tests skipped) and {{mvn test}} pass.
  
 *1. Command*:
 mvn verify -fae -Dhdp.version=3.0.1.0-187 -P sandbox

 
 *2. Error message from Yarn Container Attempt:*

{noformat}
2019-03-18 16:43:25,583 INFO [main] org.apache.kylin.engine.mr.KylinMapper: Accepting Mapper Key with ordinal: 1
2019-03-18 16:43:25,583 INFO [main] org.apache.kylin.engine.mr.KylinMapper: Do map, available memory: 322m
2019-03-18 16:43:25,596 INFO [main] org.apache.kylin.common.KylinConfig: Creating new manager instance of class org.apache.kylin.cube.cuboid.CuboidManager
2019-03-18 16:43:25,599 INFO [main] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: File Output Committer Algorithm version is 1
2019-03-18 16:43:25,599 INFO [main] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
2019-03-18 16:43:25,795 ERROR [main] org.apache.kylin.engine.mr.KylinMapper: java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
    at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
    at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
    at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:136)
    at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:150)
    at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:168)
    at org.apache.hadoop.io.SequenceFile$Writer.init(SequenceFile.java:1304)
    at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:1192)
    at org.apache.hadoop.io.SequenceFile$BlockCompressWriter.<init>(SequenceFile.java:1552)
    at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:289)
    at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:542)
    at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat.getSequenceWriter(SequenceFileOutputFormat.java:64)
    at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat.getRecordWriter(SequenceFileOutputFormat.java:75)
    at org.apache.hadoop.mapreduce.lib.output.LazyOutputFormat$LazyRecordWriter.write(LazyOutputFormat.java:113)
    at org.apache.hadoop.mapreduce.lib.output.MultipleOutputs.write(MultipleOutputs.java:468)
    at org.apache.kylin.engine.mr.steps.FilterRecommendCuboidDataMapper.doMap(FilterRecommendCuboidDataMapper.java:85)
    at org.apache.kylin.engine.mr.steps.FilterRecommendCuboidDataMapper.doMap(FilterRecommendCuboidDataMapper.java:44)
    at org.apache.kylin.engine.mr.KylinMapper.map(KylinMapper.java:77)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:799)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)
2019-03-18 16:43:25,797 INFO [main] org.apache.kylin.engine.mr.KylinMapper: Do cleanup, available memory: 318m
2019-03-18 16:43:25,813 INFO [main] org.apache.kylin.engine.mr.KylinMapper: Total rows: 1
2019-03-18 16:43:25,813 ERROR [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
    [... identical stack trace as above ...]
2019-03-18 16:43:25,926 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Stopping MapTask metrics system...
2019-03-18 16:43:25,927 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system stopped.
2019-03-18 16:43:25,927 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system shutdown complete.
{noformat}


 *3. What I have tried (but did not work):*
 I have made sure the following files contain the properties below:
 3.1 core-site.xml
 File: \{kylin_root}/examples/test_case_data/sandbox/core-site.xml
 File: HDP HDFS core-site.xml (via the Ambari Web UI)
{code:xml}
<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.SnappyCodec</value>
</property>{code}
3.2 mapred-site.xml
 File: \{kylin_root}/examples/test_case_data/sandbox/mapred-site.xml
 File: HDP MapReduce2 mapred-site.xml (via the Ambari Web UI)
{code:xml}
<property>
  <name>mapreduce.map.output.compress</name>
  <value>true</value>
</property>
<property>
  <name>mapred.map.output.compress.codec</name>
  <value>org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
<property>
  <name>mapreduce.admin.user.env</name>
  <value>LD_LIBRARY_PATH=/usr/hdp/3.0.1.0-187/hadoop/lib/native</value>
</property>{code}
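The {{mapreduce.admin.user.env}} entry above applies to the task containers. One more thing I plan to try (purely an assumption on my part, not yet verified for this sandbox) is the stock Hadoop property {{yarn.app.mapreduce.am.admin.user.env}}, which exposes the same path to the MapReduce ApplicationMaster:

```xml
<!-- Sketch (assumption): mirror mapreduce.admin.user.env for the MR
     ApplicationMaster so it can also load the native libraries. -->
<property>
  <name>yarn.app.mapreduce.am.admin.user.env</name>
  <value>LD_LIBRARY_PATH=/usr/hdp/3.0.1.0-187/hadoop/lib/native</value>
</property>
```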
3.3 libsnappy.so
 I have checked that libsnappy.so is located at
{noformat}
/usr/hdp/3.0.1.0-187/hadoop/lib/native{noformat}
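For reference, this is a minimal shell sketch of that check (the directory is assumed from the HDP 3.0.1 layout above; adjust per install):

```shell
# check_native_libs DIR: report which of the native libraries the Snappy
# codec path depends on are present in DIR.
check_native_libs() {
  dir="$1"
  for lib in libhadoop.so libsnappy.so; do
    if [ -e "$dir/$lib" ]; then
      echo "found: $lib"
    else
      echo "missing: $lib"
    fi
  done
}

# Typical HDP 3.0.1 location (adjust per install):
check_native_libs /usr/hdp/3.0.1.0-187/hadoop/lib/native
# Note: even when the files exist, `hadoop checknative -a` should report
# "snappy: true"; if it reports false, the JVM running the task still
# cannot load them (e.g. LD_LIBRARY_PATH not propagated to the container).
```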
Thanks!



> Build buildSupportsSnappy Error When Doing Integration Testing
> --------------------------------------------------------------
>
>                 Key: KYLIN-3894
>                 URL: https://issues.apache.org/jira/browse/KYLIN-3894
>             Project: Kylin
>          Issue Type: Test
>          Components: Tools, Build and Test
>    Affects Versions: v2.6.0
>         Environment: Hortonworks HDP 3.0.1.0-187 Docker container.
>            Reporter: Yanwen Lin
>            Priority: Major
>              Labels: test
>



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
