[GitHub] carbondata pull request #2332: [CARBONDATA-2514] Added condition to check fo...
Github user manishgupta88 commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2332#discussion_r190128821 --- Diff: core/src/main/java/org/apache/carbondata/core/util/CarbonUtil.java --- @@ -2382,6 +2382,7 @@ static DataType thriftDataTyopeToWrapperDataType( CarbonIndexFileReader indexFileReader = new CarbonIndexFileReader(); indexFileReader.openThriftReader(indexFilePath); org.apache.carbondata.format.IndexHeader readIndexHeader = indexFileReader.readIndexHeader(); +indexFileReader.closeThriftReader(); --- End diff -- Add a try-finally block and move the closing of the thrift reader into the finally block ---
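The pattern the reviewer is asking for can be sketched as follows. This is a minimal stand-in, not the real CarbonUtil code: the nested Reader class here only mimics the method names seen in the diff (openThriftReader, readIndexHeader, closeThriftReader), so the reader is guaranteed to be closed even if reading the header throws.

```java
// Sketch of the suggested fix: wrap readIndexHeader() in try/finally so
// the thrift reader is always closed, even on exception. Reader is a
// hypothetical stand-in for CarbonIndexFileReader.
class IndexHeaderDemo {
    static class Reader {
        boolean open = false;
        void openThriftReader(String indexFilePath) { open = true; }
        String readIndexHeader() { return "indexHeader"; }
        void closeThriftReader() { open = false; }
    }

    static String readHeader(Reader indexFileReader, String indexFilePath) {
        indexFileReader.openThriftReader(indexFilePath);
        try {
            return indexFileReader.readIndexHeader();
        } finally {
            // per the review comment: closing moved into the finally block
            indexFileReader.closeThriftReader();
        }
    }
}
```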
[GitHub] carbondata issue #2321: [WIP]clean and close datamap writers on any task fai...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2321 SDV Build Fail , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5062/ ---
[GitHub] carbondata issue #2334: [CARBONDATA-2515][CARBONDATA-2516] fixed Timestamp g...
Github user sv71294 commented on the issue: https://github.com/apache/carbondata/pull/2334 Please review it now ---
[GitHub] carbondata issue #2321: [WIP]clean and close datamap writers on any task fai...
Github user akashrn5 commented on the issue: https://github.com/apache/carbondata/pull/2321 retest sdv please ---
[GitHub] carbondata issue #2321: [WIP]clean and close datamap writers on any task fai...
Github user akashrn5 commented on the issue: https://github.com/apache/carbondata/pull/2321 retest this please ---
[jira] [Created] (CARBONDATA-2517) Improve the coverage of carbon test case
xubo245 created CARBONDATA-2517: --- Summary: Improve the coverage of carbon test case Key: CARBONDATA-2517 URL: https://issues.apache.org/jira/browse/CARBONDATA-2517 Project: CarbonData Issue Type: Improvement Reporter: xubo245 Assignee: xubo245 -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[GitHub] carbondata issue #2308: [WIP]Adding SDV Testcases for SDKwriter
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2308 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6057/ ---
[GitHub] carbondata issue #2308: [WIP]Adding SDV Testcases for SDKwriter
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2308 SDV Build Fail , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5061/ ---
[GitHub] carbondata issue #2318: [CARBONDATA-2491] Fix the error when reader read twi...
Github user xubo245 commented on the issue: https://github.com/apache/carbondata/pull/2318 @sounakr Done, and CI passed ---
[GitHub] carbondata issue #2318: [CARBONDATA-2491] Fix the error when reader read twi...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2318 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4897/ ---
[GitHub] carbondata issue #2318: [CARBONDATA-2491] Fix the error when reader read twi...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2318 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6056/ ---
[GitHub] carbondata issue #2318: [CARBONDATA-2491] Fix the error when reader read twi...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2318 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5060/ ---
[GitHub] carbondata issue #2252: [CARBONDATA-2420] Support string longer than 32000 c...
Github user xuchuanyin commented on the issue: https://github.com/apache/carbondata/pull/2252 @kumarvishal09 You suggested adding a new TEXT encoder and using that encoder while writing. But currently in CarbonData, all dimensions are considered as strings; there is no dedicated encoder for them. In the previous description, ``` For DimensionStoreType, I changed VARIABLELENGTH to VARIABLE_INT_LENGTH and VARIABLE_SHORT_LENGTH, they are used for encoding/decoding dimensions ``` We can consider the DimensionStoreType as an encoder. It had two values: FIXEDLENGTH and VARIABLELENGTH, and I extended it to three: FIXED_LENGTH, VARIABLE_SHORT_LENGTH, VARIABLE_INT_LENGTH. Does this meet your suggestion? ---
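The extension described above can be sketched as an enum. The three value names come from the comment itself; the per-value comments about prefix widths are assumptions for illustration, inferred from the short/int naming and the 32000-character limit in the PR title.

```java
// Sketch of the extended DimensionStoreType: the original two values,
// with VARIABLELENGTH split into short- and int-prefixed variants.
// The width comments are illustrative assumptions, not confirmed details.
enum DimensionStoreType {
    FIXED_LENGTH,           // fixed-width dimension values
    VARIABLE_SHORT_LENGTH,  // length prefix fits in a 2-byte short (strings <= 32000 chars)
    VARIABLE_INT_LENGTH     // length prefix stored as a 4-byte int (longer strings)
}
```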
[GitHub] carbondata issue #2308: [WIP]Adding SDV Testcases for SDKwriter
Github user Indhumathi27 commented on the issue: https://github.com/apache/carbondata/pull/2308 retest sdv please ---
[GitHub] carbondata pull request #2252: [CARBONDATA-2420] Support string longer than ...
Github user xuchuanyin commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2252#discussion_r190104281 --- Diff: core/src/main/java/org/apache/carbondata/core/datastore/chunk/reader/dimension/v2/CompressedDimensionChunkFileBasedReaderV2.java --- @@ -121,6 +121,7 @@ public DimensionColumnPage decodeColumnPage( int[] invertedIndexesReverse = new int[0]; int[] rlePage = null; DataChunk2 dimensionColumnChunk = null; +boolean isLongStringColumn = dimensionRawColumnChunk.isLongStringColumn(); --- End diff -- The same as above ---
[GitHub] carbondata issue #2334: [CARBONDATA-2515][CARBONDATA-2516] fixed Timestamp g...
Github user xubo245 commented on the issue: https://github.com/apache/carbondata/pull/2334 Please optimize the code style of this PR ---
[GitHub] carbondata pull request #2252: [CARBONDATA-2420] Support string longer than ...
Github user xuchuanyin commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2252#discussion_r190104257 --- Diff: core/src/main/java/org/apache/carbondata/core/datastore/chunk/reader/dimension/v1/CompressedDimensionChunkFileBasedReaderV1.java --- @@ -99,6 +99,7 @@ public CompressedDimensionChunkFileBasedReaderV1(final BlockletInfo blockletInfo @Override public DimensionColumnPage decodeColumnPage( DimensionRawColumnChunk dimensionRawColumnChunk, int pageNumber) throws IOException { +boolean isLongStringColumn = dimensionRawColumnChunk.isLongStringColumn(); --- End diff -- I know, you mean just leave it false? ---
[GitHub] carbondata pull request #2334: [CARBONDATA-2515][CARBONDATA-2516] fixed Time...
Github user xubo245 commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2334#discussion_r190103982 --- Diff: integration/presto/src/main/java/org/apache/carbondata/presto/PrestoFilterUtil.java --- @@ -92,14 +69,15 @@ private static DataType Spi2CarbondataTypeMapper(CarbondataColumnHandle carbonda else if (colType == DateType.DATE) return DataTypes.DATE; else if (colType == TimestampType.TIMESTAMP) return DataTypes.TIMESTAMP; else if (colType.equals(DecimalType.createDecimalType(carbondataColumnHandle.getPrecision(), -carbondataColumnHandle.getScale()))) return DataTypes -.createDecimalType(carbondataColumnHandle.getPrecision(), -carbondataColumnHandle.getScale()); +carbondataColumnHandle.getScale()))) return DataTypes --- End diff -- Why change this one? ---
[GitHub] carbondata pull request #2334: [CARBONDATA-2515][CARBONDATA-2516] fixed Time...
Github user xubo245 commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2334#discussion_r190104064 --- Diff: integration/presto/src/main/java/org/apache/carbondata/presto/PrestoFilterUtil.java --- @@ -110,28 +88,30 @@ else if (colType.equals(DecimalType.createDecimalType(carbondataColumnHandle.get for (ColumnHandle columnHandle : originalConstraint.getDomains().get().keySet()) { CarbondataColumnHandle carbondataColumnHandle = (CarbondataColumnHandle) columnHandle; List partitionedColumnSchema = columnSchemas.stream().filter( - columnSchema -> carbondataColumnHandle.getColumnName().equals(columnSchema.getColumnName())).collect(toList()); - if(partitionedColumnSchema.size() != 0) { + columnSchema -> carbondataColumnHandle.getColumnName().equals(columnSchema.getColumnName())).collect(toList()); + if (partitionedColumnSchema.size() != 0) { filter.addAll(createPartitionFilters(originalConstraint, carbondataColumnHandle)); } } return filter; } - /** Returns list of partition key and values using domain constraints + /** + * Returns list of partition key and values using domain constraints + * * @param originalConstraint * @param carbonDataColumnHandle */ private static List createPartitionFilters(TupleDomain originalConstraint, - CarbondataColumnHandle carbonDataColumnHandle) { + CarbondataColumnHandle carbonDataColumnHandle) { --- End diff -- Why change this one? ---
[GitHub] carbondata pull request #2334: [CARBONDATA-2515][CARBONDATA-2516] fixed Time...
Github user xubo245 commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2334#discussion_r190104019 --- Diff: integration/presto/src/main/java/org/apache/carbondata/presto/PrestoFilterUtil.java --- @@ -110,28 +88,30 @@ else if (colType.equals(DecimalType.createDecimalType(carbondataColumnHandle.get for (ColumnHandle columnHandle : originalConstraint.getDomains().get().keySet()) { CarbondataColumnHandle carbondataColumnHandle = (CarbondataColumnHandle) columnHandle; List partitionedColumnSchema = columnSchemas.stream().filter( - columnSchema -> carbondataColumnHandle.getColumnName().equals(columnSchema.getColumnName())).collect(toList()); - if(partitionedColumnSchema.size() != 0) { + columnSchema -> carbondataColumnHandle.getColumnName().equals(columnSchema.getColumnName())).collect(toList()); --- End diff -- Why change this one? ---
[GitHub] carbondata pull request #2334: [CARBONDATA-2515][CARBONDATA-2516] fixed Time...
Github user xubo245 commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2334#discussion_r190103900 --- Diff: integration/presto/src/main/java/org/apache/carbondata/presto/PrestoFilterUtil.java --- @@ -17,51 +17,28 @@ package org.apache.carbondata.presto; -import java.math.BigDecimal; -import java.math.BigInteger; -import java.sql.Timestamp; -import java.util.ArrayList; -import java.util.Calendar; -import java.util.Date; -import java.util.HashMap; -import java.util.List; -import java.util.Map; - +import com.facebook.presto.spi.ColumnHandle; +import com.facebook.presto.spi.PrestoException; +import com.facebook.presto.spi.predicate.Domain; +import com.facebook.presto.spi.predicate.Range; +import com.facebook.presto.spi.predicate.TupleDomain; +import com.facebook.presto.spi.type.*; +import io.airlift.slice.Slice; import org.apache.carbondata.core.metadata.datatype.DataType; import org.apache.carbondata.core.metadata.datatype.DataTypes; import org.apache.carbondata.core.metadata.schema.table.CarbonTable; import org.apache.carbondata.core.metadata.schema.table.column.ColumnSchema; import org.apache.carbondata.core.scan.expression.ColumnExpression; import org.apache.carbondata.core.scan.expression.Expression; import org.apache.carbondata.core.scan.expression.LiteralExpression; -import org.apache.carbondata.core.scan.expression.conditional.EqualToExpression; -import org.apache.carbondata.core.scan.expression.conditional.GreaterThanEqualToExpression; -import org.apache.carbondata.core.scan.expression.conditional.GreaterThanExpression; -import org.apache.carbondata.core.scan.expression.conditional.InExpression; -import org.apache.carbondata.core.scan.expression.conditional.LessThanEqualToExpression; -import org.apache.carbondata.core.scan.expression.conditional.LessThanExpression; -import org.apache.carbondata.core.scan.expression.conditional.ListExpression; +import org.apache.carbondata.core.scan.expression.conditional.*; import 
org.apache.carbondata.core.scan.expression.logical.AndExpression; import org.apache.carbondata.core.scan.expression.logical.OrExpression; -import com.facebook.presto.spi.ColumnHandle; -import com.facebook.presto.spi.PrestoException; -import com.facebook.presto.spi.predicate.Domain; -import com.facebook.presto.spi.predicate.Range; -import com.facebook.presto.spi.predicate.TupleDomain; -import com.facebook.presto.spi.type.BigintType; -import com.facebook.presto.spi.type.BooleanType; -import com.facebook.presto.spi.type.DateType; -import com.facebook.presto.spi.type.DecimalType; -import com.facebook.presto.spi.type.Decimals; -import com.facebook.presto.spi.type.DoubleType; -import com.facebook.presto.spi.type.IntegerType; -import com.facebook.presto.spi.type.SmallintType; -import com.facebook.presto.spi.type.TimestampType; -import com.facebook.presto.spi.type.Type; -import com.facebook.presto.spi.type.VarcharType; -import com.google.common.collect.ImmutableList; -import io.airlift.slice.Slice; +import java.math.BigDecimal; --- End diff -- please optimize the import order and keep it consistent with the other classes ---
[GitHub] carbondata pull request #2252: [CARBONDATA-2420] Support string longer than ...
Github user xuchuanyin commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2252#discussion_r190103945 --- Diff: core/src/main/java/org/apache/carbondata/core/datastore/chunk/reader/dimension/v2/CompressedDimensionChunkFileBasedReaderV2.java --- @@ -121,6 +121,7 @@ public DimensionColumnPage decodeColumnPage( int[] invertedIndexesReverse = new int[0]; int[] rlePage = null; DataChunk2 dimensionColumnChunk = null; +boolean isLongStringColumn = dimensionRawColumnChunk.isLongStringColumn(); --- End diff -- The same as above. V2 does not support longStringColumn, so it is fine that the `isLongStringColumn` will always be false. ---
[GitHub] carbondata pull request #2252: [CARBONDATA-2420] Support string longer than ...
Github user xuchuanyin commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2252#discussion_r190103851 --- Diff: core/src/main/java/org/apache/carbondata/core/datastore/chunk/reader/dimension/v1/CompressedDimensionChunkFileBasedReaderV1.java --- @@ -99,6 +99,7 @@ public CompressedDimensionChunkFileBasedReaderV1(final BlockletInfo blockletInfo @Override public DimensionColumnPage decodeColumnPage( DimensionRawColumnChunk dimensionRawColumnChunk, int pageNumber) throws IOException { +boolean isLongStringColumn = dimensionRawColumnChunk.isLongStringColumn(); --- End diff -- Should we support longStringColumn in V1? CarbonData now only supports writing the V1 format. ---
[GitHub] carbondata pull request #2334: [CARBONDATA-2515][CARBONDATA-2516] fixed Time...
Github user xubo245 commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2334#discussion_r190103764 --- Diff: integration/presto/src/main/java/org/apache/carbondata/presto/PrestoFilterUtil.java --- @@ -17,51 +17,28 @@ package org.apache.carbondata.presto; -import java.math.BigDecimal; -import java.math.BigInteger; -import java.sql.Timestamp; -import java.util.ArrayList; -import java.util.Calendar; -import java.util.Date; -import java.util.HashMap; -import java.util.List; -import java.util.Map; - +import com.facebook.presto.spi.ColumnHandle; +import com.facebook.presto.spi.PrestoException; +import com.facebook.presto.spi.predicate.Domain; +import com.facebook.presto.spi.predicate.Range; +import com.facebook.presto.spi.predicate.TupleDomain; +import com.facebook.presto.spi.type.*; +import io.airlift.slice.Slice; import org.apache.carbondata.core.metadata.datatype.DataType; import org.apache.carbondata.core.metadata.datatype.DataTypes; import org.apache.carbondata.core.metadata.schema.table.CarbonTable; import org.apache.carbondata.core.metadata.schema.table.column.ColumnSchema; import org.apache.carbondata.core.scan.expression.ColumnExpression; import org.apache.carbondata.core.scan.expression.Expression; import org.apache.carbondata.core.scan.expression.LiteralExpression; -import org.apache.carbondata.core.scan.expression.conditional.EqualToExpression; -import org.apache.carbondata.core.scan.expression.conditional.GreaterThanEqualToExpression; -import org.apache.carbondata.core.scan.expression.conditional.GreaterThanExpression; -import org.apache.carbondata.core.scan.expression.conditional.InExpression; -import org.apache.carbondata.core.scan.expression.conditional.LessThanEqualToExpression; -import org.apache.carbondata.core.scan.expression.conditional.LessThanExpression; -import org.apache.carbondata.core.scan.expression.conditional.ListExpression; +import org.apache.carbondata.core.scan.expression.conditional.*; import 
org.apache.carbondata.core.scan.expression.logical.AndExpression; import org.apache.carbondata.core.scan.expression.logical.OrExpression; -import com.facebook.presto.spi.ColumnHandle; -import com.facebook.presto.spi.PrestoException; -import com.facebook.presto.spi.predicate.Domain; -import com.facebook.presto.spi.predicate.Range; -import com.facebook.presto.spi.predicate.TupleDomain; -import com.facebook.presto.spi.type.BigintType; -import com.facebook.presto.spi.type.BooleanType; -import com.facebook.presto.spi.type.DateType; -import com.facebook.presto.spi.type.DecimalType; -import com.facebook.presto.spi.type.Decimals; -import com.facebook.presto.spi.type.DoubleType; -import com.facebook.presto.spi.type.IntegerType; -import com.facebook.presto.spi.type.SmallintType; -import com.facebook.presto.spi.type.TimestampType; -import com.facebook.presto.spi.type.Type; -import com.facebook.presto.spi.type.VarcharType; -import com.google.common.collect.ImmutableList; -import io.airlift.slice.Slice; +import java.math.BigDecimal; +import java.math.BigInteger; +import java.sql.Timestamp; +import java.util.*; --- End diff -- Please keep the explicit class imports instead of using `*` ---
[GitHub] carbondata pull request #2252: [CARBONDATA-2420] Support string longer than ...
Github user xuchuanyin commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2252#discussion_r190103691 --- Diff: core/src/main/java/org/apache/carbondata/core/datastore/chunk/impl/DimensionRawColumnChunk.java --- @@ -92,8 +93,10 @@ public DimensionColumnPage decodeColumnPage(int pageNumber) { * @param index * @return */ - public DimensionColumnPage convertToDimColDataChunkWithOutCache(int index) { + public DimensionColumnPage convertToDimColDataChunkWithOutCache(int index, + boolean isLongStringColumn) { --- End diff -- What does `reader` mean? ---
[GitHub] carbondata pull request #2318: [CARBONDATA-2491] Fix the error when reader r...
Github user xubo245 commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2318#discussion_r190103023 --- Diff: store/sdk/src/test/java/org/apache/carbondata/sdk/file/CarbonReaderTest.java --- @@ -77,6 +85,24 @@ public void testWriteAndReadFiles() throws IOException, InterruptedException { Assert.assertEquals(i, 100); reader.close(); + +// Read again +CarbonReader reader2 = CarbonReader +.builder(path, "_temp") +.projection(new String[]{"name", "age"}) --- End diff -- ok, done ---
[GitHub] carbondata pull request #2318: [CARBONDATA-2491] Fix the error when reader r...
Github user xubo245 commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2318#discussion_r190102945 --- Diff: store/sdk/src/test/java/org/apache/carbondata/sdk/file/CarbonReaderTest.java --- @@ -77,6 +85,24 @@ public void testWriteAndReadFiles() throws IOException, InterruptedException { Assert.assertEquals(i, 100); reader.close(); + --- End diff -- OK, I will add it. Also, search mode already uses CarbonRecordReader, and there are some test cases that run it concurrently in org.apache.carbondata.examples.SearchModeExample. ---
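The read-twice pattern under discussion in CarbonReaderTest can be sketched with a minimal stand-in reader. The Reader class below is a hypothetical mock, not the SDK API; the real test builds a second CarbonReader via its builder with a projection, as shown in the diff, and expects the same row count both times.

```java
// Minimal stand-in illustrating the read-twice test pattern: after fully
// consuming and closing one reader, a second, freshly built reader over
// the same data should yield the same row count (the behavior fixed by
// CARBONDATA-2491). Reader here is a mock, not the real CarbonReader.
class ReadTwiceDemo {
    static class Reader {
        private final int totalRows;
        private int cursor = 0;
        Reader(int totalRows) { this.totalRows = totalRows; }
        boolean hasNext() { return cursor < totalRows; }
        int readNextRow() { return cursor++; }
        void close() { cursor = totalRows; } // release resources
    }

    // Consume all rows, close the reader, and return the number of rows read.
    static int readAll(Reader reader) {
        int i = 0;
        while (reader.hasNext()) {
            reader.readNextRow();
            i++;
        }
        reader.close();
        return i;
    }
}
```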
[GitHub] carbondata issue #2308: [WIP]Adding SDV Testcases for SDKwriter
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2308 SDV Build Fail , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5059/ ---
[GitHub] carbondata issue #2308: [WIP]Adding SDV Testcases for SDKwriter
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2308 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4896/ ---
[GitHub] carbondata issue #2308: [WIP]Adding SDV Testcases for SDKwriter
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2308 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6055/ ---
[GitHub] carbondata issue #2308: [WIP]Adding SDV Testcases for SDKwriter
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2308 SDV Build Fail , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5058/ ---
[jira] [Updated] (CARBONDATA-1890) Projections of struct fields push down to carbon
[ https://issues.apache.org/jira/browse/CARBONDATA-1890?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ravindra Pesala updated CARBONDATA-1890: Fix Version/s: (was: 1.4.0) > Projections of struct fields push down to carbon > - > > Key: CARBONDATA-1890 > URL: https://issues.apache.org/jira/browse/CARBONDATA-1890 > Project: CarbonData > Issue Type: Sub-task > Components: core, sql >Reporter: Ashwini K >Assignee: Ashwini K >Priority: Major > Time Spent: 5h > Remaining Estimate: 0h >
[jira] [Updated] (CARBONDATA-1875) Refactor and annotate load options
[ https://issues.apache.org/jira/browse/CARBONDATA-1875?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ravindra Pesala updated CARBONDATA-1875: Fix Version/s: (was: 1.4.0) > Refactor and annotate load options > -- > > Key: CARBONDATA-1875 > URL: https://issues.apache.org/jira/browse/CARBONDATA-1875 > Project: CarbonData > Issue Type: Sub-task >Reporter: Jacky Li >Priority: Major >
[jira] [Updated] (CARBONDATA-1874) Refactor and annotate table property
[ https://issues.apache.org/jira/browse/CARBONDATA-1874?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ravindra Pesala updated CARBONDATA-1874: Fix Version/s: (was: 1.4.0) > Refactor and annotate table property > > > Key: CARBONDATA-1874 > URL: https://issues.apache.org/jira/browse/CARBONDATA-1874 > Project: CarbonData > Issue Type: Sub-task >Reporter: Jacky Li >Priority: Major >
[jira] [Updated] (CARBONDATA-1823) Add documentation for Register table DDL
[ https://issues.apache.org/jira/browse/CARBONDATA-1823?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ravindra Pesala updated CARBONDATA-1823: Fix Version/s: (was: 1.4.0) > Add documentation for Register table DDL > > > Key: CARBONDATA-1823 > URL: https://issues.apache.org/jira/browse/CARBONDATA-1823 > Project: CarbonData > Issue Type: Sub-task > Components: spark-integration >Affects Versions: 1.3.0 >Reporter: Mohammad Shahid Khan >Assignee: Indhumathi Muthumurugesh >Priority: Major > > Please refer to CARBONDATA-1822 and mailing discussion for more details. > http://apache-carbondata-dev-mailing-list-archive.1130556.n5.nabble.com/DDL-for-CarbonData-table-backup-and-recovery-new-feature-td27854.html
[jira] [Updated] (CARBONDATA-1873) Refactor and annotate carbon property
[ https://issues.apache.org/jira/browse/CARBONDATA-1873?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ravindra Pesala updated CARBONDATA-1873: Fix Version/s: (was: 1.4.0) > Refactor and annotate carbon property > - > > Key: CARBONDATA-1873 > URL: https://issues.apache.org/jira/browse/CARBONDATA-1873 > Project: CarbonData > Issue Type: Sub-task >Reporter: Jacky Li >Priority: Major >
[jira] [Updated] (CARBONDATA-1640) Implement Hash-Range partitioner
[ https://issues.apache.org/jira/browse/CARBONDATA-1640?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ravindra Pesala updated CARBONDATA-1640: Fix Version/s: (was: 1.4.0) > Implement Hash-Range partitioner > > > Key: CARBONDATA-1640 > URL: https://issues.apache.org/jira/browse/CARBONDATA-1640 > Project: CarbonData > Issue Type: Sub-task > Components: core, spark-integration, sql >Reporter: Cao, Lionel >Priority: Major >
[jira] [Updated] (CARBONDATA-1644) Change query process to support two level partitions
[ https://issues.apache.org/jira/browse/CARBONDATA-1644?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ravindra Pesala updated CARBONDATA-1644: Fix Version/s: (was: 1.4.0) > Change query process to support two level partitions > > > Key: CARBONDATA-1644 > URL: https://issues.apache.org/jira/browse/CARBONDATA-1644 > Project: CarbonData > Issue Type: Sub-task > Components: core, spark-integration, sql >Reporter: Cao, Lionel >Priority: Major >
[jira] [Updated] (CARBONDATA-1645) Change alter table add/split partition to support two level partitions
[ https://issues.apache.org/jira/browse/CARBONDATA-1645?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ravindra Pesala updated CARBONDATA-1645: Fix Version/s: (was: 1.4.0) > Change alter table add/split partition to support two level partitions > -- > > Key: CARBONDATA-1645 > URL: https://issues.apache.org/jira/browse/CARBONDATA-1645 > Project: CarbonData > Issue Type: Sub-task > Components: core, spark-integration, sql >Reporter: Cao, Lionel >Priority: Major >
[jira] [Updated] (CARBONDATA-1648) Change alter table drop partition to support two level partitions
[ https://issues.apache.org/jira/browse/CARBONDATA-1648?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ravindra Pesala updated CARBONDATA-1648: Fix Version/s: (was: 1.4.0) > Change alter table drop partition to support two level partitions > - > > Key: CARBONDATA-1648 > URL: https://issues.apache.org/jira/browse/CARBONDATA-1648 > Project: CarbonData > Issue Type: Sub-task > Components: core, spark-integration, sql >Reporter: Cao, Lionel >Priority: Major >
[jira] [Updated] (CARBONDATA-1641) Implement Hash-List partitioner
[ https://issues.apache.org/jira/browse/CARBONDATA-1641?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ravindra Pesala updated CARBONDATA-1641: Fix Version/s: (was: 1.4.0) > Implement Hash-List partitioner > --- > > Key: CARBONDATA-1641 > URL: https://issues.apache.org/jira/browse/CARBONDATA-1641 > Project: CarbonData > Issue Type: Sub-task > Components: core, spark-integration, sql >Reporter: Cao, Lionel >Priority: Major >
[jira] [Updated] (CARBONDATA-1642) Implement Hash-Hash partitioner
[ https://issues.apache.org/jira/browse/CARBONDATA-1642?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ravindra Pesala updated CARBONDATA-1642: Fix Version/s: (was: 1.4.0) > Implement Hash-Hash partitioner > --- > > Key: CARBONDATA-1642 > URL: https://issues.apache.org/jira/browse/CARBONDATA-1642 > Project: CarbonData > Issue Type: Sub-task > Components: core, spark-integration, sql >Reporter: Cao, Lionel >Priority: Major >
[jira] [Updated] (CARBONDATA-1647) Change show partition to support two level partitions
[ https://issues.apache.org/jira/browse/CARBONDATA-1647?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ravindra Pesala updated CARBONDATA-1647: Fix Version/s: (was: 1.4.0) > Change show partition to support two level partitions > - > > Key: CARBONDATA-1647 > URL: https://issues.apache.org/jira/browse/CARBONDATA-1647 > Project: CarbonData > Issue Type: Sub-task > Components: core, spark-integration, sql >Reporter: Cao, Lionel >Priority: Major >
[jira] [Updated] (CARBONDATA-1646) Concurrent performance testing of partition tables
[ https://issues.apache.org/jira/browse/CARBONDATA-1646?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ravindra Pesala updated CARBONDATA-1646: Fix Version/s: (was: 1.4.0) > Concurrent performance testing of partition tables > -- > > Key: CARBONDATA-1646 > URL: https://issues.apache.org/jira/browse/CARBONDATA-1646 > Project: CarbonData > Issue Type: Sub-task > Components: core, spark-integration, sql >Reporter: Cao, Lionel >Priority: Major >
[jira] [Updated] (CARBONDATA-1643) Change load process to support two level partitions
[ https://issues.apache.org/jira/browse/CARBONDATA-1643?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ravindra Pesala updated CARBONDATA-1643: Fix Version/s: (was: 1.4.0) > Change load process to support two level partitions > --- > > Key: CARBONDATA-1643 > URL: https://issues.apache.org/jira/browse/CARBONDATA-1643 > Project: CarbonData > Issue Type: Sub-task > Components: core, spark-integration, sql >Reporter: Cao, Lionel >Priority: Major >
[jira] [Updated] (CARBONDATA-1634) Implement Range-Range partitioner
[ https://issues.apache.org/jira/browse/CARBONDATA-1634?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ravindra Pesala updated CARBONDATA-1634: Fix Version/s: (was: 1.4.0) > Implement Range-Range partitioner > - > > Key: CARBONDATA-1634 > URL: https://issues.apache.org/jira/browse/CARBONDATA-1634 > Project: CarbonData > Issue Type: Sub-task > Components: core, spark-integration, sql >Reporter: Cao, Lionel >Priority: Major >
[jira] [Updated] (CARBONDATA-1635) Implement Range-List partitioner
[ https://issues.apache.org/jira/browse/CARBONDATA-1635?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ravindra Pesala updated CARBONDATA-1635: Fix Version/s: (was: 1.4.0) > Implement Range-List partitioner > > > Key: CARBONDATA-1635 > URL: https://issues.apache.org/jira/browse/CARBONDATA-1635 > Project: CarbonData > Issue Type: Sub-task > Components: core, spark-integration, sql >Reporter: Cao, Lionel >Priority: Major >
[jira] [Updated] (CARBONDATA-1639) Implement List-Hash partitioner
[ https://issues.apache.org/jira/browse/CARBONDATA-1639?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ravindra Pesala updated CARBONDATA-1639: Fix Version/s: (was: 1.4.0) > Implement List-Hash partitioner > --- > > Key: CARBONDATA-1639 > URL: https://issues.apache.org/jira/browse/CARBONDATA-1639 > Project: CarbonData > Issue Type: Sub-task > Components: core, spark-integration, sql >Reporter: Cao, Lionel >Priority: Major >
[jira] [Updated] (CARBONDATA-1636) Implement Range-Hash partitioner
[ https://issues.apache.org/jira/browse/CARBONDATA-1636?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ravindra Pesala updated CARBONDATA-1636: Fix Version/s: (was: 1.4.0) > Implement Range-Hash partitioner > > > Key: CARBONDATA-1636 > URL: https://issues.apache.org/jira/browse/CARBONDATA-1636 > Project: CarbonData > Issue Type: Sub-task > Components: core, spark-integration, sql >Reporter: Cao, Lionel >Priority: Major >
[jira] [Updated] (CARBONDATA-1638) Implement List-List partitioner
[ https://issues.apache.org/jira/browse/CARBONDATA-1638?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ravindra Pesala updated CARBONDATA-1638: Fix Version/s: (was: 1.4.0) > Implement List-List partitioner > --- > > Key: CARBONDATA-1638 > URL: https://issues.apache.org/jira/browse/CARBONDATA-1638 > Project: CarbonData > Issue Type: Sub-task > Components: core, spark-integration, sql >Reporter: Cao, Lionel >Priority: Major >
[jira] [Updated] (CARBONDATA-1637) Implement List-Range partitioner
[ https://issues.apache.org/jira/browse/CARBONDATA-1637?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ravindra Pesala updated CARBONDATA-1637:
    Fix Version/s: (was: 1.4.0)

> Implement List-Range partitioner
>
>                 Key: CARBONDATA-1637
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-1637
>             Project: CarbonData
>          Issue Type: Sub-task
>          Components: core, spark-integration, sql
>            Reporter: Cao, Lionel
>            Priority: Major
[jira] [Updated] (CARBONDATA-1632) Change PartitionInfo and related model to support two level partitions
[ https://issues.apache.org/jira/browse/CARBONDATA-1632?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ravindra Pesala updated CARBONDATA-1632:
    Fix Version/s: (was: 1.4.0)

> Change PartitionInfo and related model to support two level partitions
>
>                 Key: CARBONDATA-1632
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-1632
>             Project: CarbonData
>          Issue Type: Sub-task
>          Components: core, spark-integration, sql
>            Reporter: Cao, Lionel
>            Priority: Major
[jira] [Updated] (CARBONDATA-1633) Change parser to support two level partitions
[ https://issues.apache.org/jira/browse/CARBONDATA-1633?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ravindra Pesala updated CARBONDATA-1633:
    Fix Version/s: (was: 1.4.0)

> Change parser to support two level partitions
>
>                 Key: CARBONDATA-1633
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-1633
>             Project: CarbonData
>          Issue Type: Sub-task
>          Components: core, spark-integration, sql
>            Reporter: Cao, Lionel
>            Priority: Major
[GitHub] carbondata issue #2308: [WIP]Adding SDV Testcases for SDKwriter
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2308 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4895/ ---
[jira] [Updated] (CARBONDATA-1577) Support load new data into datamap by SQL
[ https://issues.apache.org/jira/browse/CARBONDATA-1577?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ravindra Pesala updated CARBONDATA-1577:
    Fix Version/s: (was: 1.4.0)

> Support load new data into datamap by SQL
>
>                 Key: CARBONDATA-1577
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-1577
>             Project: CarbonData
>          Issue Type: Sub-task
>            Reporter: Jacky Li
>            Priority: Major
[jira] [Updated] (CARBONDATA-1578) Support load datamap for existing table
[ https://issues.apache.org/jira/browse/CARBONDATA-1578?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ravindra Pesala updated CARBONDATA-1578:
    Fix Version/s: (was: 1.4.0)

> Support load datamap for existing table
>
>                 Key: CARBONDATA-1578
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-1578
>             Project: CarbonData
>          Issue Type: Sub-task
>            Reporter: Jacky Li
>            Priority: Major
[GitHub] carbondata issue #2308: [WIP]Adding SDV Testcases for SDKwriter
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2308 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6054/ ---
[jira] [Updated] (CARBONDATA-1587) Integrate with Kafka and Spark structured streaming to ingest data
[ https://issues.apache.org/jira/browse/CARBONDATA-1587?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ravindra Pesala updated CARBONDATA-1587:
    Fix Version/s: (was: 1.4.0)

> Integrate with Kafka and Spark structured streaming to ingest data
>
>                 Key: CARBONDATA-1587
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-1587
>             Project: CarbonData
>          Issue Type: Sub-task
>            Reporter: Jacky Li
>            Priority: Major
>
> Should support ingesting data from Kafka and Spark Structured Streaming into
> a carbon streaming table. The solution should provide E2E exactly-once
> semantics and fault tolerance.
[jira] [Updated] (CARBONDATA-1502) Compaction Support: Enhance Test Cases for Struct DataType
[ https://issues.apache.org/jira/browse/CARBONDATA-1502?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ravindra Pesala updated CARBONDATA-1502:
    Fix Version/s: (was: 1.4.0)

> Compaction Support: Enhance Test Cases for Struct DataType
>
>                 Key: CARBONDATA-1502
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-1502
>             Project: CarbonData
>          Issue Type: Sub-task
>          Components: core, sql
>            Reporter: Pawan Malwal
>            Assignee: Pawan Malwal
>            Priority: Minor
[jira] [Updated] (CARBONDATA-1501) Update Array values
[ https://issues.apache.org/jira/browse/CARBONDATA-1501?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ravindra Pesala updated CARBONDATA-1501:
    Fix Version/s: (was: 1.4.0)

> Update Array values
>
>                 Key: CARBONDATA-1501
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-1501
>             Project: CarbonData
>          Issue Type: Sub-task
>          Components: core, spark-integration
>            Reporter: Venkata Ramana G
>            Assignee: Ashwini K
>            Priority: Minor
>
> Update Array values.
[jira] [Resolved] (CARBONDATA-1575) Support large scale data on DataMap
[ https://issues.apache.org/jira/browse/CARBONDATA-1575?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ravindra Pesala resolved CARBONDATA-1575.
    Resolution: Fixed

> Support large scale data on DataMap
>
>                 Key: CARBONDATA-1575
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-1575
>             Project: CarbonData
>          Issue Type: Sub-task
>    Affects Versions: 1.2.0
>            Reporter: Jacky Li
>            Priority: Major
>             Fix For: 1.4.0
>
> This ticket is to track the data scale test on DataMap. Fixes need to be
> handled if any problems are found in the large scale test.
[jira] [Updated] (CARBONDATA-1510) Load, query, filter, NULL values, UDFs, Describe support
[ https://issues.apache.org/jira/browse/CARBONDATA-1510?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ravindra Pesala updated CARBONDATA-1510:
    Fix Version/s: (was: 1.4.0)

> Load, query, filter, NULL values, UDFs, Describe support
>
>                 Key: CARBONDATA-1510
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-1510
>             Project: CarbonData
>          Issue Type: Sub-task
>          Components: core, sql
>            Reporter: Rahul Kumar
>            Assignee: Rahul Kumar
>            Priority: Major
>          Time Spent: 4h 10m
>  Remaining Estimate: 0h
>
> Implementation is in place; test cases need to be added and bugs fixed.
[jira] [Updated] (CARBONDATA-1496) Array type : insert into table support
[ https://issues.apache.org/jira/browse/CARBONDATA-1496?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ravindra Pesala updated CARBONDATA-1496:
    Fix Version/s: (was: 1.4.0)

> Array type: insert into table support
>
>                 Key: CARBONDATA-1496
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-1496
>             Project: CarbonData
>          Issue Type: Sub-task
>          Components: data-load
>            Reporter: Venkata Ramana G
>            Priority: Major
>
> # Source table data containing Array data needs to be converted from the
> Spark datatype to string, as carbon takes string as its input row
[jira] [Updated] (CARBONDATA-1499) Support Array type to be a measure
[ https://issues.apache.org/jira/browse/CARBONDATA-1499?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ravindra Pesala updated CARBONDATA-1499:
    Fix Version/s: (was: 1.4.0)

> Support Array type to be a measure
>
>                 Key: CARBONDATA-1499
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-1499
>             Project: CarbonData
>          Issue Type: Sub-task
>          Components: core, data-load, data-query
>            Reporter: Venkata Ramana G
>            Priority: Major
>
> Currently supports only dimensions
[jira] [Updated] (CARBONDATA-1497) Support DDL for Array fields Dictionary include and Dictionary Exclude
[ https://issues.apache.org/jira/browse/CARBONDATA-1497?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ravindra Pesala updated CARBONDATA-1497:
    Fix Version/s: (was: 1.4.0)

> Support DDL for Array fields Dictionary include and Dictionary Exclude
>
>                 Key: CARBONDATA-1497
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-1497
>             Project: CarbonData
>          Issue Type: Sub-task
>          Components: spark-integration
>            Reporter: Venkata Ramana G
>            Priority: Major
>
> CarbonDictionaryDecoder also needs to be updated to handle this.
[jira] [Updated] (CARBONDATA-1498) Support multilevel Array
[ https://issues.apache.org/jira/browse/CARBONDATA-1498?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ravindra Pesala updated CARBONDATA-1498:
    Fix Version/s: (was: 1.4.0)

> Support multilevel Array
>
>                 Key: CARBONDATA-1498
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-1498
>             Project: CarbonData
>          Issue Type: Sub-task
>          Components: core, spark-integration
>            Reporter: Venkata Ramana G
>            Priority: Major
>
> Currently the DDL is validated to allow only 2 levels; remove this restriction.
[jira] [Updated] (CARBONDATA-1494) Load, query, filter, NULL values, UDFs, Describe support
[ https://issues.apache.org/jira/browse/CARBONDATA-1494?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ravindra Pesala updated CARBONDATA-1494:
    Fix Version/s: (was: 1.4.0)

> Load, query, filter, NULL values, UDFs, Describe support
>
>                 Key: CARBONDATA-1494
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-1494
>             Project: CarbonData
>          Issue Type: Sub-task
>          Components: core, sql
>            Reporter: Venkata Ramana G
>            Assignee: Rahul Kumar
>            Priority: Major
>
> Implementation is in place; test cases need to be added and bugs fixed.
[jira] [Updated] (CARBONDATA-1127) Add fixed length encoding for timestamp/date data type
[ https://issues.apache.org/jira/browse/CARBONDATA-1127?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ravindra Pesala updated CARBONDATA-1127:
    Fix Version/s: (was: 1.4.0)

> Add fixed length encoding for timestamp/date data type
>
>                 Key: CARBONDATA-1127
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-1127
>             Project: CarbonData
>          Issue Type: Sub-task
>            Reporter: Jacky Li
>            Priority: Major
>
> After this is done, we can move timestamp/date from noDictionary to normal
> ColumnPage (stored as int[] internally)
[jira] [Updated] (CARBONDATA-1492) Alter add and remove struct Column
[ https://issues.apache.org/jira/browse/CARBONDATA-1492?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ravindra Pesala updated CARBONDATA-1492:
    Fix Version/s: (was: 1.4.0)

> Alter add and remove struct Column
>
>                 Key: CARBONDATA-1492
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-1492
>             Project: CarbonData
>          Issue Type: Sub-task
>          Components: core, sql
>            Reporter: dhatchayani
>            Assignee: dhatchayani
>            Priority: Minor
>          Time Spent: 8h 40m
>  Remaining Estimate: 0h
>
> Alter table add and remove struct columns should be supported as a part of this.
[jira] [Updated] (CARBONDATA-1125) Add SQL and dataframe option for encoding override
[ https://issues.apache.org/jira/browse/CARBONDATA-1125?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ravindra Pesala updated CARBONDATA-1125:
    Fix Version/s: (was: 1.4.0)

> Add SQL and dataframe option for encoding override
>
>                 Key: CARBONDATA-1125
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-1125
>             Project: CarbonData
>          Issue Type: Sub-task
>            Reporter: Jacky Li
>            Priority: Major
>
> The user should be able to specify the encoding type for a particular field
> in the table
[jira] [Updated] (CARBONDATA-1016) Make sort step output ColumnPage
[ https://issues.apache.org/jira/browse/CARBONDATA-1016?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ravindra Pesala updated CARBONDATA-1016:
    Fix Version/s: (was: 1.4.0)

> Make sort step output ColumnPage
>
>                 Key: CARBONDATA-1016
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-1016
>             Project: CarbonData
>          Issue Type: Sub-task
>            Reporter: Jacky Li
>            Priority: Major
>
> Currently, UnsafeInMemoryIntermediateDataMerger stores data using
> UnsafeCarbonRowPage[] unsafeCarbonRowPages. This will be more efficient if we
> change it to use ColumnPage, which is the main data structure used in the
> write step.
[GitHub] carbondata issue #2308: [WIP]Adding SDV Testcases for SDKwriter
Github user Indhumathi27 commented on the issue: https://github.com/apache/carbondata/pull/2308 retest sdv please ---
[GitHub] carbondata issue #2308: [WIP]Adding SDV Testcases for SDKwriter
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2308 SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5057/ ---
[GitHub] carbondata issue #2321: [WIP]clean and close datamap writers on any task fai...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2321 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4891/ ---
[GitHub] carbondata issue #2308: [WIP]Adding SDV Testcases for SDKwriter
Github user Indhumathi27 commented on the issue: https://github.com/apache/carbondata/pull/2308 retest sdv please ---
[GitHub] carbondata issue #2321: [WIP]clean and close datamap writers on any task fai...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2321 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6050/ ---
[GitHub] carbondata issue #1853: [CARBONDATA-2072][TEST] Add dropTables method for op...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/1853 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6052/ ---
[GitHub] carbondata issue #1853: [CARBONDATA-2072][TEST] Add dropTables method for op...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/1853 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4893/ ---
[GitHub] carbondata issue #2308: [WIP]Adding SDV Testcases for SDKwriter
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2308 SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5056/ ---
[GitHub] carbondata pull request #2330: [HOTFIX] Implementing getMemorySize in Blockl...
Github user asfgit closed the pull request at: https://github.com/apache/carbondata/pull/2330 ---
[GitHub] carbondata issue #2330: [HOTFIX] Implementing getMemorySize in BlockletDataM...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2330 LGTM ---
[GitHub] carbondata issue #2318: [CARBONDATA-2491] Fix the error when reader read twi...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2318 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4892/ ---
[GitHub] carbondata issue #2321: [WIP]clean and close datamap writers on any task fai...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2321 SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5055/ ---
[GitHub] carbondata issue #2318: [CARBONDATA-2491] Fix the error when reader read twi...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2318 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6051/ ---
[GitHub] carbondata issue #2327: [WIP] Bloom remove guava cache and use CarbonCache
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2327 SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5054/ ---
[GitHub] carbondata issue #2332: [CARBONDATA-2514] Added condition to check for dupli...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2332 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4888/ ---
[GitHub] carbondata issue #2308: [WIP]Adding SDV Testcases for SDKwriter
Github user Indhumathi27 commented on the issue: https://github.com/apache/carbondata/pull/2308 retest sdv please ---
[GitHub] carbondata issue #2327: [WIP] Bloom remove guava cache and use CarbonCache
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2327 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4890/ ---
[GitHub] carbondata issue #2330: [HOTFIX] Implementing getMemorySize in BlockletDataM...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2330 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4889/ ---
[GitHub] carbondata issue #2333: [WIP] Change the query flow while selecting the carb...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2333 SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5053/ ---
[GitHub] carbondata issue #2330: [HOTFIX] Implementing getMemorySize in BlockletDataM...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2330 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6048/ ---
[GitHub] carbondata pull request #2318: [CARBONDATA-2491] Fix the error when reader r...
Github user sounakr commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2318#discussion_r189923456

--- Diff: store/sdk/src/test/java/org/apache/carbondata/sdk/file/CarbonReaderTest.java ---
@@ -77,6 +85,24 @@ public void testWriteAndReadFiles() throws IOException, InterruptedException {
     Assert.assertEquals(i, 100);
     reader.close();
+
+// Read again
+CarbonReader reader2 = CarbonReader
+.builder(path, "_temp")
+.projection(new String[]{"name", "age"})
--- End diff --

Add a test case of two sequential reads in which the 2nd reader starts without the 1st reader being closed.

---
[GitHub] carbondata issue #2327: [WIP] Bloom remove guava cache and use CarbonCache
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2327 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6049/ ---
[GitHub] carbondata pull request #2318: [CARBONDATA-2491] Fix the error when reader r...
Github user sounakr commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2318#discussion_r189922516

--- Diff: store/sdk/src/test/java/org/apache/carbondata/sdk/file/CarbonReaderTest.java ---
@@ -77,6 +85,24 @@ public void testWriteAndReadFiles() throws IOException, InterruptedException {
     Assert.assertEquals(i, 100);
     reader.close();
+
--- End diff --

This test case covers sequential reads: one reader is closed before the second one starts. What exactly happens when two readers read in parallel? Can we have a test case for that?

---
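For illustration, a parallel-read test along the lines requested above might look like the following. This is only a sketch, assuming the `CarbonReader.builder(path, "_temp").projection(...)` API already used in `CarbonReaderTest` plus the SDK's `build()`, `hasNext()`, `readNextRow()` and `close()` calls; the test name, the row count of 100, and the failure-collection pattern are illustrative, not the committed test:

```
@Test
public void testParallelReads() throws Exception {
  // Two CarbonReader instances scanning the same path concurrently,
  // each on its own thread.
  final AtomicReference<Throwable> failure = new AtomicReference<>();
  Thread[] readers = new Thread[2];
  for (int t = 0; t < readers.length; t++) {
    readers[t] = new Thread(() -> {
      try {
        CarbonReader reader = CarbonReader
            .builder(path, "_temp")
            .projection(new String[]{"name", "age"})
            .build();
        int rows = 0;
        while (reader.hasNext()) {
          reader.readNextRow();
          rows++;
        }
        Assert.assertEquals(100, rows);
        reader.close();
      } catch (Throwable e) {
        // remember the first failure; exceptions thrown on a worker
        // thread do not propagate to the JUnit thread by themselves
        failure.compareAndSet(null, e);
      }
    });
    readers[t].start();
  }
  for (Thread reader : readers) {
    reader.join();
  }
  Assert.assertNull("parallel read failed: " + failure.get(), failure.get());
}
```

Collecting the failure in an `AtomicReference` and asserting on it after `join()` is one way to make a reader crash on either thread fail the test rather than pass silently.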
[GitHub] carbondata issue #2332: [CARBONDATA-2514] Added condition to check for dupli...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2332 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6047/ ---
[GitHub] carbondata issue #2318: [CARBONDATA-2491] Fix the error when reader read twi...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2318 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4887/ ---
[GitHub] carbondata issue #2330: [HOTFIX] Implementing getMemorySize in BlockletDataM...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2330 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4886/ ---
[GitHub] carbondata issue #2252: [CARBONDATA-2420] Support string longer than 32000 c...
Github user kumarvishal09 commented on the issue:

    https://github.com/apache/carbondata/pull/2252

@xuchuanyin I think some of the changes are not required, like the method added for converting to LV format. If you check the direct compressor, it is already present; you can use the same one.

In DimensionRawColumnChunk there is no need to pass any boolean; based on the encoder we can decide which type of dimension store object needs to be created.

Changing the existing store chunk implementation is also not required. Add a child class if possible, or add a completely new implementation for storing the int-based LV format. Please check ColumnPage.compress and ColumnPage.decompress for your reference (LV).

For the V1 and V2 readers there is no need to change anything, as that is old-format code and users will not be able to load data in that format. Decide based on the encoder: while writing, add a new encoder "TEXT", and while reading, use the same encoder for creating the DimensionDataChunkStore object. ---
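As a rough illustration of the encoder-driven dispatch suggested above — the `Encoding.TEXT` value and the helper below are hypothetical, since the comment proposes adding that encoder rather than describing an existing one:

```
// Hypothetical sketch only: decide whether a page holds a long-string
// (int-based LV) column from the encoder list persisted with the page,
// instead of threading a boolean through DimensionRawColumnChunk.
// Encoding.TEXT is the proposed new encoder, not an existing constant.
static boolean isLongStringPage(List<Encoding> pageEncodings) {
  return pageEncodings.contains(Encoding.TEXT);
}
```

The reader would then pick the chunk-store implementation from this check, which keeps V1/V2 untouched because their pages would never carry the new encoder.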
[GitHub] carbondata pull request #2252: [CARBONDATA-2420] Support string longer than ...
Github user kumarvishal09 commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2252#discussion_r189912968

--- Diff: core/src/main/java/org/apache/carbondata/core/datastore/chunk/reader/dimension/v1/CompressedDimensionChunkFileBasedReaderV1.java ---
@@ -99,6 +99,7 @@ public CompressedDimensionChunkFileBasedReaderV1(final BlockletInfo blockletInfo
   @Override public DimensionColumnPage decodeColumnPage(
       DimensionRawColumnChunk dimensionRawColumnChunk, int pageNumber) throws IOException {
+    boolean isLongStringColumn = dimensionRawColumnChunk.isLongStringColumn();
--- End diff --

In the V1 case this will always be false, as the new encoder type will be supported only for the V3 format.

---
[GitHub] carbondata pull request #2252: [CARBONDATA-2420] Support string longer than ...
Github user kumarvishal09 commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2252#discussion_r189913075

--- Diff: core/src/main/java/org/apache/carbondata/core/datastore/chunk/reader/dimension/v2/CompressedDimensionChunkFileBasedReaderV2.java ---
@@ -121,6 +121,7 @@ public DimensionColumnPage decodeColumnPage(
     int[] invertedIndexesReverse = new int[0];
     int[] rlePage = null;
     DataChunk2 dimensionColumnChunk = null;
+    boolean isLongStringColumn = dimensionRawColumnChunk.isLongStringColumn();
--- End diff --

In the V2 case this will always be false, as the new encoder type will be supported only for the V3 format.

---