Hello Kylin Developers,

I am trying to run the sample cube provided by Kylin (http://kylin.apache.org/docs15/tutorial/kylin_sample.html). I am stuck at the step that builds the cube: the build fails at the very first stage, "#1 Step Name: Create Intermediate Flat Hive Table".

Below is the output I am getting in the log file:
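The root cause further down in the log is that the native Snappy library cannot be loaded, while the generated command SETs Snappy as the compression codec. As a workaround I am considering overriding those codec settings with a pure-Java codec. This is only a sketch: I am assuming these properties can be overridden in Kylin's conf/kylin_hive_conf.xml (the file name is my guess from my install's layout); the property names themselves are copied from the failing command below.

```xml
<!-- Sketch of a possible override (assumed location: $KYLIN_HOME/conf/kylin_hive_conf.xml).
     Replaces the native-only SnappyCodec with DefaultCodec (zlib), which has a
     pure-Java implementation and does not need the native Hadoop libraries. -->
<property>
  <name>mapreduce.map.output.compress.codec</name>
  <value>org.apache.hadoop.io.compress.DefaultCodec</value>
</property>
<property>
  <name>mapreduce.output.fileoutputformat.compress.codec</name>
  <value>org.apache.hadoop.io.compress.DefaultCodec</value>
</property>
```

Would that be a sensible workaround, or is fixing the native library installation the recommended route?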
OS command error exit with 2 -- hive -e "USE default;
DROP TABLE IF EXISTS kylin_intermediate_kylin_sales_cube_desc_20120101000000_20160403000000;
CREATE EXTERNAL TABLE IF NOT EXISTS kylin_intermediate_kylin_sales_cube_desc_20120101000000_20160403000000
(
DEFAULT_KYLIN_SALES_PART_DT date
,DEFAULT_KYLIN_SALES_LEAF_CATEG_ID bigint
,DEFAULT_KYLIN_SALES_LSTG_SITE_ID int
,DEFAULT_KYLIN_CATEGORY_GROUPINGS_META_CATEG_NAME string
,DEFAULT_KYLIN_CATEGORY_GROUPINGS_CATEG_LVL2_NAME string
,DEFAULT_KYLIN_CATEGORY_GROUPINGS_CATEG_LVL3_NAME string
,DEFAULT_KYLIN_SALES_LSTG_FORMAT_NAME string
,DEFAULT_KYLIN_SALES_PRICE decimal(19,4)
,DEFAULT_KYLIN_SALES_SELLER_ID bigint
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\177'
STORED AS SEQUENCEFILE
LOCATION '/kylin/kylin_metadata/kylin-2f78b10c-cff6-4d2c-bef4-089b3831c2d2/kylin_intermediate_kylin_sales_cube_desc_20120101000000_20160403000000';
SET dfs.replication=2;
SET dfs.block.size=32000000;
SET hive.exec.compress.output=true;
SET hive.auto.convert.join.noconditionaltask=true;
SET hive.auto.convert.join.noconditionaltask.size=300000000;
SET mapreduce.map.output.compress.codec=org.apache.hadoop.io.compress.SnappyCodec;
SET mapreduce.output.fileoutputformat.compress.codec=org.apache.hadoop.io.compress.SnappyCodec;
SET hive.merge.mapfiles=true;
SET hive.merge.mapredfiles=true;
SET mapred.output.compression.type=BLOCK;
SET hive.merge.size.per.task=256000000;
SET hive.support.concurrency=false;
SET mapreduce.job.split.metainfo.maxsize=-1;
INSERT OVERWRITE TABLE kylin_intermediate_kylin_sales_cube_desc_20120101000000_20160403000000
SELECT
KYLIN_SALES.PART_DT
,KYLIN_SALES.LEAF_CATEG_ID
,KYLIN_SALES.LSTG_SITE_ID
,KYLIN_CATEGORY_GROUPINGS.META_CATEG_NAME
,KYLIN_CATEGORY_GROUPINGS.CATEG_LVL2_NAME
,KYLIN_CATEGORY_GROUPINGS.CATEG_LVL3_NAME
,KYLIN_SALES.LSTG_FORMAT_NAME
,KYLIN_SALES.PRICE
,KYLIN_SALES.SELLER_ID
FROM DEFAULT.KYLIN_SALES as KYLIN_SALES
INNER JOIN DEFAULT.KYLIN_CAL_DT as KYLIN_CAL_DT
ON KYLIN_SALES.PART_DT = KYLIN_CAL_DT.CAL_DT
INNER JOIN DEFAULT.KYLIN_CATEGORY_GROUPINGS as KYLIN_CATEGORY_GROUPINGS
ON KYLIN_SALES.LEAF_CATEG_ID = KYLIN_CATEGORY_GROUPINGS.LEAF_CATEG_ID
AND KYLIN_SALES.LSTG_SITE_ID = KYLIN_CATEGORY_GROUPINGS.SITE_ID
WHERE (KYLIN_SALES.PART_DT >= '2012-01-01' AND KYLIN_SALES.PART_DT < '2016-04-03');
"
Logging initialized using configuration in jar:file:/usr/lib/hive/apache-hive-1.2.1-bin/lib/hive-common-1.2.1.jar!/hive-log4j.properties
OK
Time taken: 1.183 seconds
OK
Time taken: 0.169 seconds
OK
Time taken: 0.773 seconds
Query ID = root_20160405115931_e7622f84-9f30-4a1e-acad-91e45effb2f5
Total jobs = 3
Execution log at: /tmp/root/root_20160405115931_e7622f84-9f30-4a1e-acad-91e45effb2f5.log
2016-04-05 11:59:36 Starting to launch local task to process map join; maximum memory = 477626368
2016-04-05 11:59:38 Dump the side-table for tag: 1 with group count: 144 into file: file:/home/hduser/iotmp/5518ec6b-ebf5-4aa3-8984-dd6269959b30/hive_2016-04-05_11-59-31_741_4864253870930847560-1/-local-10004/HashTable-Stage-11/MapJoin-mapfile01--.hashtable
2016-04-05 11:59:39 Uploaded 1 File to: file:/home/hduser/iotmp/5518ec6b-ebf5-4aa3-8984-dd6269959b30/hive_2016-04-05_11-59-31_741_4864253870930847560-1/-local-10004/HashTable-Stage-11/MapJoin-mapfile01--.hashtable (10893 bytes)
2016-04-05 11:59:39 Dump the side-table for tag: 0 with group count: 731 into file: file:/home/hduser/iotmp/5518ec6b-ebf5-4aa3-8984-dd6269959b30/hive_2016-04-05_11-59-31_741_4864253870930847560-1/-local-10004/HashTable-Stage-11/MapJoin-mapfile10--.hashtable
2016-04-05 11:59:39 Uploaded 1 File to: file:/home/hduser/iotmp/5518ec6b-ebf5-4aa3-8984-dd6269959b30/hive_2016-04-05_11-59-31_741_4864253870930847560-1/-local-10004/HashTable-Stage-11/MapJoin-mapfile10--.hashtable (271350 bytes)
2016-04-05 11:59:39 End of local task; Time Taken: 2.208 sec.
Execution completed successfully
MapredLocal task succeeded
Launching Job 1 out of 3
Number of reduce tasks is set to 0 since there's no reduce operator
Job running in-process (local Hadoop)
org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: native snappy library not available: SnappyCompressor has not been loaded.
    at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:249)
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketForFileIdx(FileSinkOperator.java:622)
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketFiles(FileSinkOperator.java:566)
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:675)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:837)
    at org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:88)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:837)
    at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.internalForward(CommonJoinOperator.java:644)
    at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genAllOneUniqueJoinObject(CommonJoinOperator.java:676)
    at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.checkAndGenObject(CommonJoinOperator.java:754)
    at org.apache.hadoop.hive.ql.exec.MapJoinOperator.process(MapJoinOperator.java:414)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:837)
    at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.internalForward(CommonJoinOperator.java:644)
    at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genUniqueJoinObject(CommonJoinOperator.java:657)
    at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genUniqueJoinObject(CommonJoinOperator.java:660)
    at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.checkAndGenObject(CommonJoinOperator.java:756)
    at org.apache.hadoop.hive.ql.exec.MapJoinOperator.process(MapJoinOperator.java:414)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:837)
    at org.apache.hadoop.hive.ql.exec.FilterOperator.process(FilterOperator.java:122)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:837)
    at org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:97)
    at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:162)
    at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:508)
    at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:163)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:450)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: native snappy library not available: SnappyCompressor has not been loaded.
    at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:69)
    at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:133)
    at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
    at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
    at org.apache.hadoop.io.SequenceFile$Writer.init(SequenceFile.java:1261)
    at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:1154)
    at org.apache.hadoop.io.SequenceFile$BlockCompressWriter.<init>(SequenceFile.java:1509)
    at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:275)
    at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:528)
    at org.apache.hadoop.hive.ql.exec.Utilities.createSequenceWriter(Utilities.java:1508)
    at org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat.getHiveRecordWriter(HiveSequenceFileOutputFormat.java:64)
    at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getRecordWriter(HiveFileFormatUtils.java:261)
    at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:246)
    ... 32 more
org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: native snappy library not available: SnappyCompressor has not been loaded.
    at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:249)
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketForFileIdx(FileSinkOperator.java:622)
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketFiles(FileSinkOperator.java:566)
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.closeOp(FileSinkOperator.java:1010)
    at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:616)
    at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:630)
    at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:630)
    at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:630)
    at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:630)
    at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.close(ExecMapper.java:199)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:450)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: native snappy library not available: SnappyCompressor has not been loaded.
    at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:69)
    at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:133)
    at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
    at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
    at org.apache.hadoop.io.SequenceFile$Writer.init(SequenceFile.java:1261)
    at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:1154)
    at org.apache.hadoop.io.SequenceFile$BlockCompressWriter.<init>(SequenceFile.java:1509)
    at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:275)
    at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:528)
    at org.apache.hadoop.hive.ql.exec.Utilities.createSequenceWriter(Utilities.java:1508)
    at org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat.getHiveRecordWriter(HiveSequenceFileOutputFormat.java:64)
    at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getRecordWriter(HiveFileFormatUtils.java:261)
    at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:246)
    ... 18 more
org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: native snappy library not available: SnappyCompressor has not been loaded.
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketFiles(FileSinkOperator.java:577)
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.closeOp(FileSinkOperator.java:1010)
    at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:616)
    at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:630)
    at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:630)
    at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:630)
    at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:630)
    at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.close(ExecMapper.java:199)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:450)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: native snappy library not available: SnappyCompressor has not been loaded.
    at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:249)
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketForFileIdx(FileSinkOperator.java:622)
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketFiles(FileSinkOperator.java:566)
    ... 16 more
Caused by: java.lang.RuntimeException: native snappy library not available: SnappyCompressor has not been loaded.
    at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:69)
    at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:133)
    at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
    at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
    at org.apache.hadoop.io.SequenceFile$Writer.init(SequenceFile.java:1261)
    at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:1154)
    at org.apache.hadoop.io.SequenceFile$BlockCompressWriter.<init>(SequenceFile.java:1509)
    at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:275)
    at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:528)
    at org.apache.hadoop.hive.ql.exec.Utilities.createSequenceWriter(Utilities.java:1508)
    at org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat.getHiveRecordWriter(HiveSequenceFileOutputFormat.java:64)
    at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getRecordWriter(HiveFileFormatUtils.java:261)
    at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:246)
    ... 18 more
[the identical stack trace is printed three more times]
2016-04-05 11:59:41,721 Stage-11 map = 0%, reduce = 0%
Ended Job = job_local1683457853_0001 with errors
Error during job, obtaining debugging information...
Job Tracking URL: http://localhost:8080/
FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
MapReduce Jobs Launched:
Stage-Stage-11: HDFS Read: 0 HDFS Write: 0 FAIL
Total MapReduce CPU Time Spent: 0 msec
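Since the error is "native snappy library not available: SnappyCompressor has not been loaded", I suspect the Hadoop native libraries (including Snappy) are simply not installed or not visible on this machine. If it helps diagnose, I can run the checks below (commands assume a typical Hadoop-on-Linux setup; output will vary per machine, so none is shown):

```shell
# Ask Hadoop which native codecs it can load; the "snappy" line should read "true".
hadoop checknative -a

# Look for the snappy shared library on the system directly.
ldconfig -p | grep -i snappy
```

and report back with the output.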
--
Regards,
Yagyank Chadha
Undergraduate Student, Computer Science Engineering
Thapar University, Patiala