[GitHub] carbondata issue #1061: [CARBONDATA-1193] ViewFS Support - improvement

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1061
  
Build Failed with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4332/



---


[GitHub] carbondata issue #2058: [CARBONDATA-2249] Fixed bug for querying data throug...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2058
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3112/



---


[GitHub] carbondata issue #1010: [CARBONDATA-1110] put if clause out of the for claus...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1010
  
Build Failed with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4333/



---


[GitHub] carbondata issue #807: [CARBONDATA-942] off heap sort chunk size should be v...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/807
  
Build Failed with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4335/



---


[GitHub] carbondata issue #296: [WIP-CARBONDATA-382]Like Filter Query Optimization fo...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/296
  
Build Failed with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4336/



---


[GitHub] carbondata issue #2056: [CARBONDATA-2238][DataLoad] Merge and spill in-memor...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2056
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3113/



---


[GitHub] carbondata issue #71: [CARBONDATA-155] Code refactor to avoid the Type Casti...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/71
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4337/



---


[GitHub] carbondata issue #2055: [CARBONDATA-2224][File Level Reader Support] Externa...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2055
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3114/



---


[GitHub] carbondata issue #2052: [CARBONDATA-2246][DataLoad] Fix exhausted memory pro...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2052
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3115/



---


[GitHub] carbondata issue #985: [CARBONDATA-1090] added integration test cases for al...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/985
  
Build Failed with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4334/



---


[GitHub] carbondata issue #2066: [CARBONDATA-2257] Added SDV test cases for Partition...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2066
  
Build Failed with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4338/



---


[GitHub] carbondata issue #2050: [CARBONDATA-2244]fix creating pre-aggregate table bu...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2050
  
Build Failed with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3116/



---


[GitHub] carbondata issue #1857: [CARBONDATA-2073][CARBONDATA-1516][Tests] Add test c...

2018-03-14 Thread xubo245
Github user xubo245 commented on the issue:

https://github.com/apache/carbondata/pull/1857
  
retest  this please


---


[GitHub] carbondata issue #1713: [CARBONDATA-1899] Optimize CarbonData concurrency te...

2018-03-14 Thread xubo245
Github user xubo245 commented on the issue:

https://github.com/apache/carbondata/pull/1713
  
@jackylk Please review it.


---


[GitHub] carbondata issue #1856: [CARBONDATA-2073][CARBONDATA-1516][Tests] Add test c...

2018-03-14 Thread xubo245
Github user xubo245 commented on the issue:

https://github.com/apache/carbondata/pull/1856
  
retest sdv please


---


[GitHub] carbondata issue #2045: [CARBONDATA-2230]Add a path into table path to store...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2045
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3117/



---


[GitHub] carbondata issue #2064: [CARBONDATA-2255] Rename the streaming examples

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2064
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4340/



---


[GitHub] carbondata issue #2042: [CARBONDATA-2236]added sdv test cases for standard p...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2042
  
Build Failed with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3118/



---


[GitHub] carbondata issue #2065: [CARBONDATA-2256] Adding sdv Testcases for SET_Param...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2065
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4339/



---


[GitHub] carbondata issue #1856: [CARBONDATA-2073][CARBONDATA-1516][Tests] Add test c...

2018-03-14 Thread ravipesala
Github user ravipesala commented on the issue:

https://github.com/apache/carbondata/pull/1856
  
SDV Build Success, Please check CI 
http://144.76.159.231:8080/job/ApacheSDVTests/3887/



---


[GitHub] carbondata issue #2031: [CARBONDATA-2223] Remove unused listeners

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2031
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3119/



---


[GitHub] carbondata issue #2063: [CARBONDATA-2251] Refactored sdv testcase

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2063
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4341/



---


[GitHub] carbondata issue #2021: [CARBONDATA-2221] Throw exception when dropTable fai...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2021
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3120/



---


[GitHub] carbondata issue #1856: [CARBONDATA-2073][CARBONDATA-1516][Tests] Add test c...

2018-03-14 Thread xubo245
Github user xubo245 commented on the issue:

https://github.com/apache/carbondata/pull/1856
  
@kunal642 Please check


---


[GitHub] carbondata issue #2062: [CARBONDATA-2254] Optimize CarbonData documentation

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2062
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4342/



---


[GitHub] carbondata issue #2020: [CARBONDATA-2220] Reduce unnecessary audit log

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2020
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3121/



---


[GitHub] carbondata issue #2045: [CARBONDATA-2230]Add a path into table path to store...

2018-03-14 Thread zzcclp
Github user zzcclp commented on the issue:

https://github.com/apache/carbondata/pull/2045
  
retest sdv please


---


[GitHub] carbondata issue #2050: [CARBONDATA-2244]fix creating pre-aggregate table bu...

2018-03-14 Thread zzcclp
Github user zzcclp commented on the issue:

https://github.com/apache/carbondata/pull/2050
  
retest this please


---


[GitHub] carbondata issue #2045: [CARBONDATA-2230]Add a path into table path to store...

2018-03-14 Thread zzcclp
Github user zzcclp commented on the issue:

https://github.com/apache/carbondata/pull/2045
  
retest this please


---


[GitHub] carbondata issue #2050: [CARBONDATA-2244]fix creating pre-aggregate table bu...

2018-03-14 Thread zzcclp
Github user zzcclp commented on the issue:

https://github.com/apache/carbondata/pull/2050
  
retest sdv please


---


[GitHub] carbondata issue #2061: [CARBONDATA-2253][SDK] Support write JSON/Avro data ...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2061
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4343/



---


[GitHub] carbondata issue #1990: [CARBONDATA-2195] Add new test case for partition fe...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1990
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3122/



---


[GitHub] carbondata issue #1990: [CARBONDATA-2195] Add new test case for partition fe...

2018-03-14 Thread xubo245
Github user xubo245 commented on the issue:

https://github.com/apache/carbondata/pull/1990
  
@jackylk CI passed, please check


---


[GitHub] carbondata pull request #2055: [CARBONDATA-2224][File Level Reader Support] ...

2018-03-14 Thread ajantha-bhat
Github user ajantha-bhat commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2055#discussion_r174667420
  
--- Diff: hadoop/src/main/java/org/apache/carbondata/hadoop/api/CarbonFileInputFormat.java ---
@@ -0,0 +1,678 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.carbondata.hadoop.api;
+
+import java.io.ByteArrayInputStream;
+import java.io.DataInputStream;
+import java.io.IOException;
+import java.io.Serializable;
+import java.lang.reflect.Constructor;
+import java.util.ArrayList;
+import java.util.BitSet;
+import java.util.LinkedList;
+import java.util.List;
+import java.util.Map;
+
+import org.apache.carbondata.core.constants.CarbonCommonConstants;
+import org.apache.carbondata.core.datamap.DataMapChooser;
+import org.apache.carbondata.core.datamap.DataMapLevel;
+import org.apache.carbondata.core.datamap.Segment;
+import org.apache.carbondata.core.datamap.dev.expr.DataMapExprWrapper;
+import org.apache.carbondata.core.datastore.impl.FileFactory;
+import org.apache.carbondata.core.exception.InvalidConfigurationException;
+import org.apache.carbondata.core.indexstore.ExtendedBlocklet;
+import org.apache.carbondata.core.indexstore.PartitionSpec;
+import org.apache.carbondata.core.indexstore.blockletindex.BlockletDataMapFactory;
+import org.apache.carbondata.core.indexstore.blockletindex.SegmentIndexFileStore;
+import org.apache.carbondata.core.metadata.AbsoluteTableIdentifier;
+import org.apache.carbondata.core.metadata.ColumnarFormatVersion;
+import org.apache.carbondata.core.metadata.schema.PartitionInfo;
+import org.apache.carbondata.core.metadata.schema.partition.PartitionType;
+import org.apache.carbondata.core.metadata.schema.table.CarbonTable;
+import org.apache.carbondata.core.metadata.schema.table.TableInfo;
+import org.apache.carbondata.core.mutate.UpdateVO;
+import org.apache.carbondata.core.scan.expression.Expression;
+import org.apache.carbondata.core.scan.filter.SingleTableProvider;
+import org.apache.carbondata.core.scan.filter.TableProvider;
+import org.apache.carbondata.core.scan.filter.resolver.FilterResolverIntf;
+import org.apache.carbondata.core.scan.model.QueryModel;
+import org.apache.carbondata.core.stats.QueryStatistic;
+import org.apache.carbondata.core.stats.QueryStatisticsConstants;
+import org.apache.carbondata.core.stats.QueryStatisticsRecorder;
+import org.apache.carbondata.core.statusmanager.SegmentUpdateStatusManager;
+import org.apache.carbondata.core.util.CarbonProperties;
+import org.apache.carbondata.core.util.CarbonTimeStatisticsFactory;
+import org.apache.carbondata.core.util.CarbonUtil;
+import org.apache.carbondata.core.util.DataTypeConverter;
+import org.apache.carbondata.core.util.DataTypeConverterImpl;
+import org.apache.carbondata.core.util.path.CarbonTablePath;
+import org.apache.carbondata.hadoop.CarbonInputSplit;
+import org.apache.carbondata.hadoop.CarbonMultiBlockSplit;
+import org.apache.carbondata.hadoop.CarbonProjection;
+import org.apache.carbondata.hadoop.CarbonRecordReader;
+import org.apache.carbondata.hadoop.readsupport.CarbonReadSupport;
+import org.apache.carbondata.hadoop.readsupport.impl.DictionaryDecodeReadSupport;
+import org.apache.carbondata.hadoop.util.CarbonInputFormatUtil;
+import org.apache.carbondata.hadoop.util.ObjectSerializationUtil;
+import org.apache.carbondata.hadoop.util.SchemaReader;
+
+import org.apache.commons.logging.Log;
+import org.apache.commons.logging.LogFactory;
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.fs.FileSystem;
+import org.apache.hadoop.fs.LocalFileSystem;
+import org.apache.hadoop.fs.Path;
+import org.apache.hadoop.mapred.JobConf;
+import org.apache.hadoop.mapred.Reporter;
+import org.apache.hadoop.mapreduce.InputSplit;
+import org.apache.hadoop.mapreduce.JobContext;
+import org.apache.hadoop.mapreduce.RecordReader;
+impo

[GitHub] carbondata issue #2055: [CARBONDATA-2224][File Level Reader Support] Externa...

2018-03-14 Thread ajantha-bhat
Github user ajantha-bhat commented on the issue:

https://github.com/apache/carbondata/pull/2055
  
@jackylk : All the review comments have been addressed and the branch has been rebased with master again. Please check the commit below.

https://github.com/apache/carbondata/pull/2055/commits/b530adf2071113a75bc8e982ea3bd934e971b650

For InferSchema, only the renaming was done; it cannot be moved to the table-path level now because it depends on the identifier.


---


[GitHub] carbondata issue #2055: [CARBONDATA-2224][File Level Reader Support] Externa...

2018-03-14 Thread ajantha-bhat
Github user ajantha-bhat commented on the issue:

https://github.com/apache/carbondata/pull/2055
  
retest this please... cannot see spark 2.1 report.


---


[GitHub] carbondata issue #1930: [CARBONDATA-2130] Find some spelling error in Carbon...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1930
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3123/



---


[GitHub] carbondata issue #2050: [CARBONDATA-2244]fix creating pre-aggregate table bu...

2018-03-14 Thread ravipesala
Github user ravipesala commented on the issue:

https://github.com/apache/carbondata/pull/2050
  
SDV Build Success, Please check CI 
http://144.76.159.231:8080/job/ApacheSDVTests/3888/



---


[GitHub] carbondata issue #2059: [CARBONDATA-2250][DataLoad] Reduce massive object ge...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2059
  
Build Failed with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4345/



---


[GitHub] carbondata issue #1929: [CARBONDATA-2129][CARBONDATA-2094][CARBONDATA-1516] ...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1929
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3124/



---


[GitHub] carbondata issue #2060: [CARBONDATA-2252] Query performance slows down as th...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2060
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4344/



---


[GitHub] carbondata issue #1920: [CARBONDATA-2110] Remove tempCsv option in test case...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1920
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3125/



---


[GitHub] carbondata issue #2058: [CARBONDATA-2249] Fixed bug for querying data throug...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2058
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4346/



---


[GitHub] carbondata issue #2045: [CARBONDATA-2230]Add a path into table path to store...

2018-03-14 Thread ravipesala
Github user ravipesala commented on the issue:

https://github.com/apache/carbondata/pull/2045
  
SDV Build Success, Please check CI 
http://144.76.159.231:8080/job/ApacheSDVTests/3889/



---


[GitHub] carbondata issue #1857: [CARBONDATA-2073][CARBONDATA-1516][Tests] Add test c...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1857
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3126/



---


[GitHub] carbondata issue #2031: [CARBONDATA-2223] Remove unused listeners

2018-03-14 Thread dhatchayani
Github user dhatchayani commented on the issue:

https://github.com/apache/carbondata/pull/2031
  
retest this please


---


[GitHub] carbondata issue #2056: [CARBONDATA-2238][DataLoad] Merge and spill in-memor...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2056
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4347/



---


[GitHub] carbondata issue #1856: [CARBONDATA-2073][CARBONDATA-1516][Tests] Add test c...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1856
  
Build Failed with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3127/



---


[GitHub] carbondata issue #2055: [CARBONDATA-2224][File Level Reader Support] Externa...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2055
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4348/



---


[GitHub] carbondata issue #2042: [CARBONDATA-2236]added sdv test cases for standard p...

2018-03-14 Thread praveenmeenakshi56
Github user praveenmeenakshi56 commented on the issue:

https://github.com/apache/carbondata/pull/2042
  
retest this please


---


[GitHub] carbondata issue #1990: [CARBONDATA-2195] Add new test case for partition fe...

2018-03-14 Thread ravipesala
Github user ravipesala commented on the issue:

https://github.com/apache/carbondata/pull/1990
  
SDV Build Success, Please check CI 
http://144.76.159.231:8080/job/ApacheSDVTests/3890/



---


[GitHub] carbondata issue #1713: [CARBONDATA-1899] Optimize CarbonData concurrency te...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1713
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3128/



---


[GitHub] carbondata issue #2052: [CARBONDATA-2246][DataLoad] Fix exhausted memory pro...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2052
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4349/



---


[GitHub] carbondata issue #2064: [CARBONDATA-2255] Rename the streaming examples

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2064
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3129/



---


[GitHub] carbondata issue #2050: [CARBONDATA-2244]fix creating pre-aggregate table bu...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2050
  
Build Failed with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4350/



---


[GitHub] carbondata issue #2064: [CARBONDATA-2255] Rename the streaming examples

2018-03-14 Thread ravipesala
Github user ravipesala commented on the issue:

https://github.com/apache/carbondata/pull/2064
  
SDV Build Success, Please check CI 
http://144.76.159.231:8080/job/ApacheSDVTests/3891/



---


[GitHub] carbondata issue #2050: [CARBONDATA-2244]fix creating pre-aggregate table bu...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2050
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3130/



---


[GitHub] carbondata pull request #2057: [CARBONDATA-2248]Fixed Memory leak in parser/...

2018-03-14 Thread kumarvishal09
Github user kumarvishal09 commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2057#discussion_r174682778
  
--- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/parser/CarbonSparkSqlParser.scala ---
@@ -52,9 +52,12 @@ class CarbonSparkSqlParser(conf: SQLConf, sparkSession: SparkSession) extends Ab
   override def parsePlan(sqlText: String): LogicalPlan = {
     CarbonSession.updateSessionInfoToCurrentThread(sparkSession)
     try {
-      super.parsePlan(sqlText)
+      val parsedPlan = super.parsePlan(sqlText)
+      CarbonScalaUtil.cleanParserThreadLocals
--- End diff --

In case of an exception, it calls CarbonSpark2SqlParser.parse, which already handles clearing the thread-local object, so we cannot move the cleanup into a finally block.
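
For readers following the thread, a minimal, illustrative sketch of the cleanup pattern being discussed (not the PR's actual code; the fallback parser instance, here called parser and assumed to be a CarbonSpark2SqlParser, and the exact exception handling are assumptions based on the diff and the comment above):

    override def parsePlan(sqlText: String): LogicalPlan = {
      CarbonSession.updateSessionInfoToCurrentThread(sparkSession)
      try {
        val parsedPlan = super.parsePlan(sqlText)
        // Success path: the Spark parser accepted the statement, so the parser
        // thread-locals have to be cleared explicitly here.
        CarbonScalaUtil.cleanParserThreadLocals
        parsedPlan
      } catch {
        case _: Throwable =>
          // Failure path: delegate to the Carbon parser, which clears the
          // thread-locals itself; clearing them again in a finally block would
          // be redundant or happen at the wrong time, hence no finally here.
          parser.parse(sqlText)
      }
    }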


---


[GitHub] carbondata issue #2045: [CARBONDATA-2230]Add a path into table path to store...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2045
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3131/



---


[GitHub] carbondata issue #2042: [CARBONDATA-2236]added sdv test cases for standard p...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2042
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4352/



---


[GitHub] carbondata issue #2042: [CARBONDATA-2236]added sdv test cases for standard p...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2042
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3132/



---


[GitHub] carbondata issue #2061: [CARBONDATA-2253][SDK] Support write JSON/Avro data ...

2018-03-14 Thread QiangCai
Github user QiangCai commented on the issue:

https://github.com/apache/carbondata/pull/2061
  
@jackylk 
the test cases of the store module were not executed


---


[GitHub] carbondata issue #2057: [CARBONDATA-2248]Fixed Memory leak in parser/CarbonS...

2018-03-14 Thread ravipesala
Github user ravipesala commented on the issue:

https://github.com/apache/carbondata/pull/2057
  
SDV Build Success, Please check CI 
http://144.76.159.231:8080/job/ApacheSDVTests/3892/



---


[GitHub] carbondata issue #2058: [CARBONDATA-2249] Fixed bug for querying data throug...

2018-03-14 Thread jackylk
Github user jackylk commented on the issue:

https://github.com/apache/carbondata/pull/2058
  
@geetikagupta16 please rebase to master and drop the unwanted commit


---


[GitHub] carbondata issue #2042: [CARBONDATA-2236]added sdv test cases for standard p...

2018-03-14 Thread praveenmeenakshi56
Github user praveenmeenakshi56 commented on the issue:

https://github.com/apache/carbondata/pull/2042
  
retest SDV please


---


[GitHub] carbondata issue #2031: [CARBONDATA-2223] Remove unused listeners

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2031
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3133/



---


[GitHub] carbondata issue #2045: [CARBONDATA-2230]Add a path into table path to store...

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2045
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4351/



---


[GitHub] carbondata pull request #2055: [CARBONDATA-2224][File Level Reader Support] ...

2018-03-14 Thread jackylk
Github user jackylk commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2055#discussion_r174689415
  
--- Diff: integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/createTable/TestCarbonFileInputFormatWithExternalCarbonTable.scala ---
@@ -0,0 +1,240 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.carbondata.spark.testsuite.createTable
+
+import java.io.File
+
+import org.apache.commons.io.FileUtils
+import org.apache.spark.sql.test.util.QueryTest
+import org.scalatest.BeforeAndAfterAll
+
+import org.apache.carbondata.common.exceptions.sql.MalformedCarbonCommandException
+import org.apache.carbondata.core.constants.CarbonCommonConstants
+import org.apache.carbondata.core.datastore.filesystem.CarbonFile
+import org.apache.carbondata.core.datastore.impl.FileFactory
+import org.apache.carbondata.core.util.CarbonUtil
+import org.apache.carbondata.sdk.file.{CarbonWriter, Schema}
+
+
+class TestCarbonFileInputFormatWithExternalCarbonTable extends QueryTest with BeforeAndAfterAll {
+
+  var writerPath = new File(this.getClass.getResource("/").getPath
+                            +
+                            "../." +
+                            "./src/test/resources/SparkCarbonFileFormat/WriterOutput/")
+    .getCanonicalPath
+  //getCanonicalPath gives path with \, so code expects /. Need to handle in code ?
+  writerPath = writerPath.replace("\\", "/");
+
+
+  def buildTestData(persistSchema:Boolean) = {
+
+    FileUtils.deleteDirectory(new File(writerPath))
+
+    val schema = new StringBuilder()
+      .append("[ \n")
+      .append("   {\"name\":\"string\"},\n")
+      .append("   {\"age\":\"int\"},\n")
+      .append("   {\"height\":\"double\"}\n")
+      .append("]")
+      .toString()
+
+    try {
+      val builder = CarbonWriter.builder()
+      val writer =
+        if (persistSchema) {
+          builder.persistSchemaFile(true)
+          builder.withSchema(Schema.parseJson(schema)).outputPath(writerPath).buildWriterForCSVInput()
+        } else {
+          builder.withSchema(Schema.parseJson(schema)).outputPath(writerPath).buildWriterForCSVInput()
+        }
+
+      var i = 0
+      while (i < 100) {
+        writer.write(Array[String]("robot" + i, String.valueOf(i), String.valueOf(i.toDouble / 2)))
+        i += 1
+      }
+      writer.close()
+    } catch {
+      case ex: Exception => None
+      case _ => None
+    }
+  }
+
+  def cleanTestData() = {
+    FileUtils.deleteDirectory(new File(writerPath))
+  }
+
+  def deleteIndexFile(path: String, extension: String) : Unit = {
+    val file: CarbonFile = FileFactory
+      .getCarbonFile(path, FileFactory.getFileType(path))
+
+    for (eachDir <- file.listFiles) {
+      if (!eachDir.isDirectory) {
+        if (eachDir.getName.endsWith(extension)) {
+          CarbonUtil.deleteFoldersAndFilesSilent(eachDir)
+        }
+      } else {
+        deleteIndexFile(eachDir.getPath, extension)
+      }
+    }
+  }
+
+  override def beforeAll(): Unit = {
+    sql("DROP TABLE IF EXISTS sdkOutputTable")
+    // create carbon table and insert data
+  }
+
+  override def afterAll(): Unit = {
+    sql("DROP TABLE IF EXISTS sdkOutputTable")
+  }
+
+  //TO DO, need to remove segment dependency and tableIdentifier Dependency
+  test("read carbondata files (sdk Writer Output) using the Carbonfile ") {
+    buildTestData(false)
+    assert(new File(writerPath).exists())
+    sql("DROP TABLE IF EXISTS sdkOutputTable")
+
+    //new provider Carbonfile
+    sql(
+      s"""CREATE EXTERNAL TABLE sdkOutputTable STORED BY 'Carbonfile' LOCATION
+         |'$writerPath' """.stripMargin)
+
+    sql("Describe formatted sdkOutputTable").show(false)
--- End 

[GitHub] carbondata issue #2031: [CARBONDATA-2223] Remove unused listeners

2018-03-14 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2031
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4353/



---


[GitHub] carbondata pull request #2055: [CARBONDATA-2224][File Level Reader Support] ...

2018-03-14 Thread jackylk
Github user jackylk commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2055#discussion_r174690041
  
--- Diff: integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/createTable/TestCarbonFileInputFormatWithExternalCarbonTable.scala (same new test file as quoted in the earlier review comment) ---

[GitHub] carbondata pull request #2055: [CARBONDATA-2224][File Level Reader Support] ...

2018-03-14 Thread jackylk
Github user jackylk commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2055#discussion_r174690144
  
--- Diff: integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/createTable/TestCarbonFileInputFormatWithExternalCarbonTable.scala (same new test file as quoted in the earlier review comment) ---

[GitHub] carbondata pull request #2055: [CARBONDATA-2224][File Level Reader Support] ...

2018-03-14 Thread jackylk
Github user jackylk commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2055#discussion_r174690840
  
--- Diff: integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/createTable/TestCarbonFileInputFormatWithExternalCarbonTable.scala (same new test file as quoted in the earlier review comment) ---
