[GitHub] carbondata issue #2482: [CARBONDATA-2714] Support merge index files for the ...
Github user dhatchayani commented on the issue: https://github.com/apache/carbondata/pull/2482 retest this please ---
[GitHub] carbondata pull request #2466: [CARBONDATA-2710][Spark Integration] Refactor...
Github user KanakaKumar commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2466#discussion_r202576876

--- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/parser/CarbonSparkSqlParser.scala ---

@@ -169,220 +128,45 @@ class CarbonHelperSqlAstBuilder(conf: SQLConf,
         provider) = createTableTuple
     val (tableIdentifier, temp, ifNotExists, external) = visitCreateTableHeader(tableHeader)
-
-    // TODO: implement temporary tables
-    if (temp) {
-      throw new ParseException(
-        "CREATE TEMPORARY TABLE is not supported yet. " +
-        "Please use CREATE TEMPORARY VIEW as an alternative.", tableHeader)
-    }
-    if (skewSpecContext != null) {
-      operationNotAllowed("CREATE TABLE ... SKEWED BY", skewSpecContext)
-    }
-    if (bucketSpecContext != null) {
-      operationNotAllowed("CREATE TABLE ... CLUSTERED BY", bucketSpecContext)
-    }
-
-    val cols = Option(columns).toSeq.flatMap(visitColTypeList)
-    val properties = getPropertyKeyValues(tablePropertyList)
-
-    // Ensuring whether no duplicate name is used in table definition
-    val colNames = cols.map(_.name)
-    if (colNames.length != colNames.distinct.length) {
-      val duplicateColumns = colNames.groupBy(identity).collect {
-        case (x, ys) if ys.length > 1 => "\"" + x + "\""
-      }
-      operationNotAllowed(s"Duplicated column names found in table definition of " +
-        s"$tableIdentifier: ${duplicateColumns.mkString("[", ",", "]")}", columns)
-    }
-
-    val tablePath = if (locationSpecContext != null) {
+    val cols: Seq[StructField] = Option(columns).toSeq.flatMap(visitColTypeList)
+    val colNames: Seq[String] = CarbonSparkSqlParserUtil
+      .validateCreateTableReqAndGetColumns(tableHeader,
+        skewSpecContext,
+        bucketSpecContext,
+        columns,
+        cols,
+        tableIdentifier,
+        temp)
+    val tablePath: Option[String] = if (locationSpecContext != null) {
       Some(visitLocationSpec(locationSpecContext))
     } else {
       None
     }

     val tableProperties = mutable.Map[String, String]()
+    val properties: Map[String, String] = getPropertyKeyValues(tablePropertyList)
     properties.foreach{ property => tableProperties.put(property._1, property._2) }

     // validate partition clause
     val (partitionByStructFields, partitionFields) =
       validatePartitionFields(partitionColumns, colNames, tableProperties)
-    // validate partition clause
-    if (partitionFields.nonEmpty) {
-      if (!CommonUtil.validatePartitionColumns(tableProperties, partitionFields)) {
-        throw new MalformedCarbonCommandException("Error: Invalid partition definition")
-      }
-      // partition columns should not be part of the schema
-      val badPartCols = partitionFields
-        .map(_.partitionColumn.toLowerCase)
-        .toSet
-        .intersect(colNames.map(_.toLowerCase).toSet)
-
-      if (badPartCols.nonEmpty) {
-        operationNotAllowed(s"Partition columns should not be specified in the schema: " +
-          badPartCols.map("\"" + _ + "\"").mkString("[", ",", "]"),
-          partitionColumns)
-      }
-    }
-
-    val options = new CarbonOption(properties)
-    // validate streaming property
-    validateStreamingProperty(options)
-    var fields = parser.getFields(cols ++ partitionByStructFields)
     // validate for create table as select
     val selectQuery = Option(query).map(plan)
-    selectQuery match {
-      case Some(q) =>
-        // create table as select does not allow creation of partitioned table
-        if (partitionFields.nonEmpty) {
-          val errorMessage = "A Create Table As Select (CTAS) statement is not allowed to " +
-            "create a partitioned table using Carbondata file formats."
-          operationNotAllowed(errorMessage, partitionColumns)
-        }
-        // create table as select does not allow to explicitly specify schema
-        if (fields.nonEmpty) {
-          operationNotAllowed(
-            "Schema may not be specified in a Create Table As Select (CTAS) statement", columns)
-        }
-        // external table is not allow
-        if (external) {
-          operationNotAllowed("Create external table as select", tableHeader)
-        }
-        fields = parser
-          .getFields(CarbonEnv.getInstance(sparkSession).carbonMetastore
-            .getSchemaFromUnresolvedRelation(sparkSession, Some(q).get))
-      case _ =>
-        // ignore this case
-    }
-
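The duplicate-column validation that this refactor moves into CarbonSparkSqlParserUtil boils down to grouping column names and flagging any group with more than one member. A minimal standalone sketch of that check (class and method names here are illustrative, not CarbonData code):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class DuplicateColumns {
    // Returns the column names that appear more than once (case-sensitive),
    // mirroring the groupBy-based check quoted in the diff above.
    public static List<String> duplicates(List<String> colNames) {
        Map<String, Long> counts = colNames.stream()
            .collect(Collectors.groupingBy(n -> n, Collectors.counting()));
        return counts.entrySet().stream()
            .filter(e -> e.getValue() > 1)   // keep only names seen twice or more
            .map(Map.Entry::getKey)
            .sorted()
            .collect(Collectors.toList());
    }
}
```

A create-table path would call this once on the parsed column list and raise a parse error if the result is non-empty.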
[GitHub] carbondata pull request #2507: [CABONDATA-2741]Fix for fetching random query...
Github user manishgupta88 commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2507#discussion_r202576343

--- Diff: integration/spark2/src/test/scala/org/apache/spark/carbondata/restructure/AlterTableValidationTestCase.scala ---

@@ -709,6 +709,22 @@ test("test alter command for boolean data type with correct default measure valu
       Seq(Row(1))
     )
   }
+
+  test("Alter table selection in random order"){
+    sql("drop database if exists carbon cascade")
+    sql(s"create database carbon location '$dblocation'")
+    sql("use carbon")
+    sql("create table brinjal (imei string,channelsId string,gamePointId double,deviceInformationId double," +
--- End diff --

No need to create database in the test case. Use default database and also use a unique tableName...

---
[GitHub] carbondata issue #2466: [CARBONDATA-2710][Spark Integration] Refactor Carbon...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2466 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5977/ ---
[GitHub] carbondata pull request #2488: [CARBONDATA-2724][DataMap]Unsupported create ...
Github user ravipesala commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2488#discussion_r202575874

--- Diff: core/src/main/java/org/apache/carbondata/core/util/CarbonUtil.java ---

@@ -3231,4 +3231,42 @@ private static int unsetLocalDictForComplexColumns(List allColumns
     return columnLocalDictGenMap;
   }
+
+  /**
+   * This method gets the carbon file format version
+   *
+   * @param carbonTable carbon table
+   */
+  public static ColumnarFormatVersion getFormatVersion(CarbonTable carbonTable)
+      throws IOException {
+    String tablePath = carbonTable.getTablePath();
+    CarbonFile[] carbonFiles = FileFactory
+        .getCarbonFile(tablePath)
+        .listFiles(new CarbonFileFilter() {
+          @Override
+          public boolean accept(CarbonFile file) {
+            if (file == null) {
+              return false;
+            }
+            return file.getName().endsWith("carbonindex");
--- End diff --

It should not be under the CarbonTable object. It is the file version and it is related to each load, not the table. @manishgupta88 I think your PR already handles the validation of datamap loading on old formats, right?

---
[GitHub] carbondata pull request #2488: [CARBONDATA-2724][DataMap]Unsupported create ...
Github user ravipesala commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2488#discussion_r202575655

--- Diff: core/src/main/java/org/apache/carbondata/core/util/CarbonUtil.java ---

@@ -3231,4 +3231,42 @@ private static int unsetLocalDictForComplexColumns(List allColumns
     return columnLocalDictGenMap;
  }
+
+  /**
+   * This method gets the carbon file format version
+   *
+   * @param carbonTable carbon table
+   */
+  public static ColumnarFormatVersion getFormatVersion(CarbonTable carbonTable)
+      throws IOException {
+    String tablePath = carbonTable.getTablePath();
--- End diff --

This check would be wrong: carbon files are not always directly under the table path.

---
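To illustrate the concern raised here: carbon files can sit in nested segment directories, so a check that lists only the top level of the table path can miss them. A hedged sketch of a recursive suffix scan, using plain java.io rather than CarbonData's FileFactory/CarbonFileFilter API (class and method names are illustrative):

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

public class CarbonFileScan {
    // Recursively collects files under `dir` whose names end with `suffix`,
    // descending into subdirectories (e.g. per-segment folders) instead of
    // listing only the table path's top level.
    public static List<File> findFiles(File dir, String suffix) {
        List<File> result = new ArrayList<>();
        File[] children = dir.listFiles();
        if (children == null) {
            return result;               // not a directory, or an I/O error
        }
        for (File child : children) {
            if (child.isDirectory()) {
                result.addAll(findFiles(child, suffix));   // recurse into segment dirs
            } else if (child.getName().endsWith(suffix)) {
                result.add(child);
            }
        }
        return result;
    }
}
```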
[GitHub] carbondata issue #2474: [CARBONDATA-2530][MV] Disable the MV datamaps after ...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2474 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5862/ ---
[jira] [Resolved] (CARBONDATA-2482) Pass uuid while writing segment file if possible
[ https://issues.apache.org/jira/browse/CARBONDATA-2482?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ravindra Pesala resolved CARBONDATA-2482.
-----------------------------------------
    Resolution: Fixed
    Fix Version/s: 1.4.1

> Pass uuid while writing segment file if possible
> ------------------------------------------------
>
>                 Key: CARBONDATA-2482
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-2482
>             Project: CarbonData
>          Issue Type: Improvement
>            Reporter: dhatchayani
>            Assignee: dhatchayani
>            Priority: Minor
>             Fix For: 1.4.1
>
>          Time Spent: 4h 20m
>  Remaining Estimate: 0h
>

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
[GitHub] carbondata pull request #2307: [CARBONDATA-2482] Pass uuid while writing seg...
Github user asfgit closed the pull request at: https://github.com/apache/carbondata/pull/2307 ---
[GitHub] carbondata issue #2307: [CARBONDATA-2482] Pass uuid while writing segment fi...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2307 LGTM ---
[GitHub] carbondata issue #2502: [CARBONDATA-2738]Update documentation for Complex da...
Github user sgururajshetty commented on the issue: https://github.com/apache/carbondata/pull/2502 LGTM ---
[GitHub] carbondata issue #2466: [CARBONDATA-2710][Spark Integration] Refactor Carbon...
Github user mohammadshahidkhan commented on the issue: https://github.com/apache/carbondata/pull/2466 retest this please ---
[GitHub] carbondata issue #2475: [CARBONDATA-2531][MV] Fix alias not working on MV qu...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2475 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5861/ ---
[GitHub] carbondata issue #2508: [CARBONDATA-2744]Streaming lock is not released even...
Github user zzcclp commented on the issue: https://github.com/apache/carbondata/pull/2508 @BJangir I have one question: if close streaming is called, or the streaming table is updated with 'streaming'='false', the 'streaming' property is not checked again in the 'addBatch' method, so how is the exception thrown? The 'streaming' property is only checked before creating the 'CarbonAppendableStreamSink' in CarbonSource. ---
[GitHub] carbondata issue #2505: [CARBONDATA-2698][CARBONDATA-2700][CARBONDATA-2732][...
Github user Sssan520 commented on the issue: https://github.com/apache/carbondata/pull/2505 retest this please. ---
[GitHub] carbondata pull request #2508: [CARBONDATA-2744]Streaming lock is not releas...
Github user zzcclp commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2508#discussion_r202566105

--- Diff: integration/spark-common/src/main/scala/org/apache/carbondata/streaming/StreamSinkFactory.scala ---

@@ -78,13 +78,13 @@ object StreamSinkFactory {
     }
   }

+
   def createStreamTableSink(
       sparkSession: SparkSession,
       hadoopConf: Configuration,
       carbonTable: CarbonTable,
       parameters: Map[String, String]): Sink = {
--- End diff --

remove blank line

---
[GitHub] carbondata pull request #2508: [CARBONDATA-2744]Streaming lock is not releas...
Github user zzcclp commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2508#discussion_r202565677

--- Diff: integration/spark-common/src/main/scala/org/apache/carbondata/streaming/StreamSinkFactory.scala ---

@@ -78,13 +78,13 @@ object StreamSinkFactory {
     }
   }

+
--- End diff --

remove this blank line.

---
[GitHub] carbondata issue #2476: [CARBONDATA-2534][MV] Fix substring expression not w...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2476 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5860/ ---
[GitHub] carbondata issue #2477: [CARBONDATA-2539][MV] Fix predicate subquery which u...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2477 SDV Build Fail , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5859/ ---
[GitHub] carbondata issue #2505: [CARBONDATA-2698][CARBONDATA-2700][CARBONDATA-2732][...
Github user Sssan520 commented on the issue: https://github.com/apache/carbondata/pull/2505 retest this please. ---
[GitHub] carbondata issue #2506: [CARBONDATA-2682][32K] fix create table with long_st...
Github user Sssan520 commented on the issue: https://github.com/apache/carbondata/pull/2506 retest this please ---
[GitHub] carbondata issue #2478: [CARBONDATA-2540][CARBONDATA-2560][CARBONDATA-2568][...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2478 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5858/ ---
[GitHub] carbondata issue #2307: [CARBONDATA-2482] Pass uuid while writing segment fi...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2307 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5976/ ---
[GitHub] carbondata issue #2482: [CARBONDATA-2714] Support merge index files for the ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2482 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5975/ ---
[GitHub] carbondata issue #2479: [CARBONDATA-2542][MV] Fix the mv query from table wi...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2479 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5857/ ---
[GitHub] carbondata issue #2501: [CARBONDATA-2738]Block Preaggregate, Dictionary Excl...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2501 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5974/ ---
[GitHub] carbondata issue #2484: [WIP] added hadoop conf to thread local
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2484 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5973/ ---
[GitHub] carbondata issue #2480: [CARBONDATA-2550][CARBONDATA-2576][MV] Fix limit and...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2480 SDV Build Fail , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5856/ ---
[GitHub] carbondata issue #2508: [CARBONDATA-2744]Streaming lock is not released even...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2508 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5972/ ---
[GitHub] carbondata issue #2474: [CARBONDATA-2530][MV] Disable the MV datamaps after ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2474 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5971/ ---
[GitHub] carbondata issue #2479: [CARBONDATA-2542][MV] Fix the mv query from table wi...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2479 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5855/ ---
[GitHub] carbondata issue #2482: [CARBONDATA-2714] Support merge index files for the ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2482 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7201/ ---
[GitHub] carbondata issue #2307: [CARBONDATA-2482] Pass uuid while writing segment fi...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2307 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7200/ ---
[GitHub] carbondata issue #2475: [CARBONDATA-2531][MV] Fix alias not working on MV qu...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2475 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5970/ ---
[GitHub] carbondata issue #2478: [CARBONDATA-2540][CARBONDATA-2560][CARBONDATA-2568][...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2478 SDV Build Fail , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5854/ ---
[GitHub] carbondata issue #2476: [CARBONDATA-2534][MV] Fix substring expression not w...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2476 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5969/ ---
[GitHub] carbondata issue #2501: [CARBONDATA-2738]Block Preaggregate, Dictionary Excl...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2501 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7199/ ---
[GitHub] carbondata issue #2508: [CARBONDATA-2744]Streaming lock is not released even...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2508 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7198/ ---
[GitHub] carbondata issue #2477: [CARBONDATA-2539][MV] Fix predicate subquery which u...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2477 SDV Build Fail , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5853/ ---
[GitHub] carbondata issue #2484: [WIP] added hadoop conf to thread local
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2484 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7197/ ---
[GitHub] carbondata issue #2477: [CARBONDATA-2539][MV] Fix predicate subquery which u...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2477 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5968/ ---
[GitHub] carbondata issue #2474: [CARBONDATA-2530][MV] Disable the MV datamaps after ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2474 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7196/ ---
[GitHub] carbondata issue #2478: [CARBONDATA-2540][CARBONDATA-2560][CARBONDATA-2568][...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2478 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5967/ ---
[GitHub] carbondata issue #2475: [CARBONDATA-2531][MV] Fix alias not working on MV qu...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2475 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5852/ ---
[GitHub] carbondata pull request #2508: [CARBONDATA-2744]Streaming lock is not releas...
GitHub user BJangir opened a pull request: https://github.com/apache/carbondata/pull/2508

[CARBONDATA-2744] Streaming lock is not released even when batch processing is not happening

Issue detail: if a streaming application is running, DDLs like finish streaming and close streaming are blocked. Ideally these DDLs should be blocked only while batch processing is running; if no batch is being processed, they should be allowed from JDBCServer/Beeline.

Root cause: the streaming lock is taken on application start and released in the onQueryTerminated event of CarbonStreamingQueryListener. That event fires only when stop() is called on the StreamingQuery, so the lock is released only when either the streaming query or the whole streaming application terminates; until then, every DDL that needs the streaming lock is blocked.

Solution: take the streaming lock in addBatch and release it once the batch is processed.

Note:
a. If close streaming is called, or the streaming table is updated with 'streaming'='false', then at the next trigger addBatch will throw an exception and the StreamingQuery has to be started again.
b. If a DDL like finish streaming or close streaming starts first and addBatch starts second, addBatch will throw a "can not acquire lock" exception and the StreamingQuery has to be started again.

Be sure to do all of the following checklist to help us incorporate your contribution quickly and easily:
- [ ] Any interfaces changed? NO
- [ ] Any backward compatibility impacted? NO
- [ ] Document update required? NO
- [ ] Testing done. Manually verified the scenarios below:
  a. Call finish streaming when addBatch is done.
  b. Whether a new batch works after the finish streaming DDL succeeds (it creates a new streaming segment).
  c. New batch while the finish streaming DDL is running.
  d. Call finish streaming while addBatch is running.
  The above scenarios were also verified for the other DDLs (close streaming, SET TBLPROPERTIES('streaming'='false')).
- [ ] For large changes, please consider breaking it into sub-tasks under an umbrella JIRA. NO

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/BJangir/incubator-carbondata streaming_lock

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/carbondata/pull/2508.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #2508

commit 2ff88bad23a74ea9b0167574955c4e14c65ca755
Author: BJangir
Date: 2018-07-15T16:14:02Z

    [CARBONDATA-2744] Streaming lock is not released even when batch processing is not happening

---
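The proposed fix amounts to holding the streaming lock only for the duration of a batch: acquire it when addBatch is triggered, release it once the batch is processed. A minimal sketch of that acquire-per-batch pattern with java.util.concurrent (StreamingLockDemo and its addBatch are hypothetical stand-ins, not the actual CarbonAppendableStreamSink code):

```java
import java.util.concurrent.locks.ReentrantLock;

public class StreamingLockDemo {
    private final ReentrantLock streamingLock = new ReentrantLock();

    // Acquire the lock per batch; a concurrent DDL (finish/close streaming)
    // that already holds the lock makes the batch fail fast, matching the
    // "can not acquire lock" behaviour described in the PR.
    public String addBatch(long batchId) {
        if (!streamingLock.tryLock()) {
            throw new IllegalStateException("can not acquire lock");
        }
        try {
            return "processed batch " + batchId;   // stand-in for writing the micro-batch
        } finally {
            streamingLock.unlock();                // always released once the batch is done
        }
    }

    public static void main(String[] args) {
        StreamingLockDemo sink = new StreamingLockDemo();
        System.out.println(sink.addBatch(1));      // prints: processed batch 1
    }
}
```

Between batches the lock is free, so DDLs issued from JDBCServer/Beeline can acquire it without waiting for the whole streaming application to stop.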
[GitHub] carbondata issue #2475: [CARBONDATA-2531][MV] Fix alias not working on MV qu...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2475 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7195/ ---
[GitHub] carbondata issue #2476: [CARBONDATA-2534][MV] Fix substring expression not w...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2476 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7194/ ---
[jira] [Created] (CARBONDATA-2744) Streaming lock is not released even Batch processing is not happening
Babulal created CARBONDATA-2744:
-----------------------------------
             Summary: Streaming lock is not released even Batch processing is not happening
                 Key: CARBONDATA-2744
                 URL: https://issues.apache.org/jira/browse/CARBONDATA-2744
             Project: CarbonData
          Issue Type: Bug
            Reporter: Babulal

If a streaming application is running, DDLs like finish streaming and close streaming are blocked. Ideally these DDLs should be blocked only while batch processing is running; if no batch is being processed, they should be allowed from JDBCServer/Beeline.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
[GitHub] carbondata issue #2476: [CARBONDATA-2534][MV] Fix substring expression not w...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2476 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5851/ ---
[GitHub] carbondata issue #2477: [CARBONDATA-2539][MV] Fix predicate subquery which u...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2477 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7193/ ---
[jira] [Resolved] (CARBONDATA-2704) Index file size in describe formatted command is not updated correctly with the segment file
[ https://issues.apache.org/jira/browse/CARBONDATA-2704?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Manish Gupta resolved CARBONDATA-2704.
--------------------------------------
    Resolution: Fixed
    Fix Version/s: 1.4.1

> Index file size in describe formatted command is not updated correctly with
> the segment file
> ---------------------------------------------------------------------------
>
>                 Key: CARBONDATA-2704
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-2704
>             Project: CarbonData
>          Issue Type: Bug
>            Reporter: dhatchayani
>            Assignee: dhatchayani
>            Priority: Minor
>             Fix For: 1.4.1
>
>          Time Spent: 6h 40m
>  Remaining Estimate: 0h
>

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
[GitHub] carbondata pull request #2462: [CARBONDATA-2704] Index file size in describe...
Github user asfgit closed the pull request at: https://github.com/apache/carbondata/pull/2462 ---
[GitHub] carbondata issue #2462: [CARBONDATA-2704] Index file size in describe format...
Github user manishgupta88 commented on the issue: https://github.com/apache/carbondata/pull/2462 LGTM ---
[GitHub] carbondata issue #2479: [CARBONDATA-2542][MV] Fix the mv query from table wi...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2479 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5966/ ---
[GitHub] carbondata issue #2480: [CARBONDATA-2550][CARBONDATA-2576][MV] Fix limit and...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2480 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5965/ ---
[GitHub] carbondata issue #2478: [CARBONDATA-2540][CARBONDATA-2560][CARBONDATA-2568][...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2478 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7192/ ---
[GitHub] carbondata pull request #2476: [CARBONDATA-2534][MV] Fix substring expressio...
Github user jackylk commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2476#discussion_r202544380

--- Diff: datamap/mv/core/src/main/scala/org/apache/carbondata/mv/datamap/MVHelper.scala ---

@@ -119,7 +119,12 @@ object MVHelper {
   }

   def updateColumnName(attr: Attribute): String = {
-    val name = attr.name.replace("(", "_").replace(")", "").replace(" ", "_").replace("=", "")
+    val name =
+      attr.name.replace("(", "_").
--- End diff --

please put `.` to next line

---
[GitHub] carbondata pull request #2475: [CARBONDATA-2531][MV] Fix alias not working o...
Github user jackylk commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2475#discussion_r202544362

--- Diff: datamap/mv/core/src/main/scala/org/apache/carbondata/mv/rewrite/DefaultMatchMaker.scala ---

@@ -367,8 +367,10 @@ object GroupbyGroupbyNoChildDelta extends DefaultMatchPattern {
           if (isGroupingEmR && isGroupingRmE) {
             val isOutputEmR = gb_2q.outputList.forall {
               case a @ Alias(_, _) =>
-                gb_2a.outputList.exists{a1 =>
-                  a1.isInstanceOf[Alias] && a1.asInstanceOf[Alias].child.semanticEquals(a.child)
+                gb_2a.outputList.exists{
+                  case a1: Alias =>
+                    a1.child.semanticEquals(a.child)
--- End diff --

Is this indentation correct?

---
[GitHub] carbondata issue #2475: [CARBONDATA-2531][MV] Fix alias not working on MV qu...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2475 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5850/ ---
[GitHub] carbondata pull request #2474: [CARBONDATA-2530][MV] Disable the MV datamaps...
Github user jackylk commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2474#discussion_r202544328

--- Diff: core/src/main/java/org/apache/carbondata/core/datamap/status/DataMapStatusManager.java ---

@@ -95,7 +95,9 @@ public static void disableAllLazyDataMaps(CarbonTable table) throws IOException
         DataMapStoreManager.getInstance().getDataMapSchemasOfTable(table);
     List dataMapToBeDisabled = new ArrayList<>(allDataMapSchemas.size());
     for (DataMapSchema dataMap : allDataMapSchemas) {
-      if (dataMap.isLazy()) {
+      // TODO all non datamaps like MV is now supports only lazy. Once the support is made the
+      // following check can be removed.
+      if (dataMap.isLazy() || !dataMap.isIndexDataMap()) {
--- End diff --

But pre-aggregate datamap supports immediate load with main table. `!dataMap.isIndexDataMap()` returns true for pre-aggregate datamap, right?

---
[GitHub] carbondata pull request #2474: [CARBONDATA-2530][MV] Disable the MV datamaps...
Github user jackylk commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2474#discussion_r202544283

--- Diff: datamap/mv/core/src/main/scala/org/apache/carbondata/mv/datamap/MVHelper.scala ---

@@ -328,7 +328,7 @@ object MVHelper {
    * @return Updated modular plan.
    */
   def updateDataMap(subsumer: ModularPlan, rewrite: QueryRewrite): ModularPlan = {
-    subsumer match {
+      subsumer match {
--- End diff --

invalid indentation

---
[GitHub] carbondata pull request #2488: [CARBONDATA-2724][DataMap]Unsupported create ...
Github user jackylk commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2488#discussion_r202544231

--- Diff: core/src/main/java/org/apache/carbondata/core/util/CarbonUtil.java ---

@@ -3231,4 +3231,42 @@ private static int unsetLocalDictForComplexColumns(List allColumns
     return columnLocalDictGenMap;
   }
+
+  /**
+   * This method gets the carbon file format version
+   *
+   * @param carbonTable carbon table
+   */
+  public static ColumnarFormatVersion getFormatVersion(CarbonTable carbonTable)
+      throws IOException {
+    String tablePath = carbonTable.getTablePath();
+    CarbonFile[] carbonFiles = FileFactory
+        .getCarbonFile(tablePath)
+        .listFiles(new CarbonFileFilter() {
+          @Override
+          public boolean accept(CarbonFile file) {
+            if (file == null) {
+              return false;
+            }
+            return file.getName().endsWith("carbonindex");
--- End diff --

@ravipesala Is there any utility func to get the version from file footer already?

---
[GitHub] carbondata pull request #2488: [CARBONDATA-2724][DataMap]Unsupported create ...
Github user jackylk commented on a diff in the pull request: https://github.com/apache/carbondata/pull/2488#discussion_r202544216

--- Diff: core/src/main/java/org/apache/carbondata/core/util/CarbonUtil.java ---

@@ -3231,4 +3231,42 @@ private static int unsetLocalDictForComplexColumns(List allColumns
     return columnLocalDictGenMap;
   }
+
+  /**
+   * This method gets the carbon file format version
+   *
+   * @param carbonTable carbon table
+   */
+  public static ColumnarFormatVersion getFormatVersion(CarbonTable carbonTable)
+      throws IOException {
+    String tablePath = carbonTable.getTablePath();
+    CarbonFile[] carbonFiles = FileFactory
+        .getCarbonFile(tablePath)
+        .listFiles(new CarbonFileFilter() {
+          @Override
+          public boolean accept(CarbonFile file) {
+            if (file == null) {
+              return false;
+            }
+            return file.getName().endsWith("carbonindex");
--- End diff --

I think it is better to get the version from the data file instead of the index file, which is an optional file.

---
[GitHub] carbondata issue #2479: [CARBONDATA-2542][MV] Fix the mv query from table wi...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2479 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7191/ ---
[jira] [Resolved] (CARBONDATA-2693) Fix bug for alter rename is renameing the existing table on which bloomfilter datamp exists
[ https://issues.apache.org/jira/browse/CARBONDATA-2693?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jacky Li resolved CARBONDATA-2693.
----------------------------------
    Resolution: Fixed
    Fix Version/s: 1.4.1
                   1.5.0

> Fix bug for alter rename is renameing the existing table on which bloomfilter
> datamp exists
> -----------------------------------------------------------------------------
>
>                 Key: CARBONDATA-2693
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-2693
>             Project: CarbonData
>          Issue Type: Sub-task
>            Reporter: wangsen
>            Assignee: wangsen
>            Priority: Major
>             Fix For: 1.5.0, 1.4.1
>
>         Attachments: rename.PNG
>
>          Time Spent: 6h 40m
>  Remaining Estimate: 0h
>
> [Detailed description]: Alter rename is renaming the existing table on which
> bloomfilter datamap exists
> [Test steps]:
> 1. Create a carbon table
> 2. Create a bloomfilter datamap on the table
> 3. Rename the table using alter
>
> CREATE TABLE datamap_test_join1_sortintStr (id int,name string,salary float,dob date,doj timestamp,bonus double,status boolean,marks decimal(10,3)) STORED BY 'carbondata' tblproperties('sortcolumns'='id');
> create datamap dm_datamap_test_join1_196_1_intstr on table datamap_test_join1_sortintStr using 'bloomfilter' DMPROPERTIES('INDEX_COLUMNS' = 'id,name', 'BLOOM_SIZE'='64', 'BLOOM_FPP'='0.1', 'BLOOM_COMPRESS'='true');
> alter table datamap_test_join1_sortintStr rename to str1;
>

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
[GitHub] carbondata pull request #2452: [CARBONDATA-2693][BloomDataMap]Fix bug for al...
Github user asfgit closed the pull request at: https://github.com/apache/carbondata/pull/2452 ---
[GitHub] carbondata issue #2452: [CARBONDATA-2693][BloomDataMap]Fix bug for alter ren...
Github user jackylk commented on the issue: https://github.com/apache/carbondata/pull/2452 LGTM ---
[GitHub] carbondata issue #2480: [CARBONDATA-2550][CARBONDATA-2576][MV] Fix limit and...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2480 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7190/ ---
[GitHub] carbondata issue #2475: [CARBONDATA-2531][MV] Fix alias not working on MV qu...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2475 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5964/ ---
[GitHub] carbondata issue #2475: [CARBONDATA-2531][MV] Fix alias not working on MV qu...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2475 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7189/ ---
[GitHub] carbondata issue #2474: [CARBONDATA-2530][MV] Disable the MV datamaps after ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2474 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5962/ ---
[GitHub] carbondata issue #2474: [CARBONDATA-2530][MV] Disable the MV datamaps after ...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2474 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5849/ ---
[GitHub] carbondata issue #2474: [CARBONDATA-2530][MV] Disable the MV datamaps after ...
Github user CarbonDataQA commented on the issue: https://github.com/apache/carbondata/pull/2474 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7187/ ---
[GitHub] carbondata issue #2307: [CARBONDATA-2482] Pass uuid while writing segment fi...
Github user ravipesala commented on the issue: https://github.com/apache/carbondata/pull/2307 @dhatchayani Build failed with compilation issues, please fix it ---
[GitHub] carbondata pull request #2453: [CARBONDATA-2528][MV] Fixed order by in mv an...
Github user asfgit closed the pull request at: https://github.com/apache/carbondata/pull/2453 ---
[GitHub] carbondata pull request #2489: [CARBONDATA-2606][Complex DataType Enhancemen...
Github user asfgit closed the pull request at: https://github.com/apache/carbondata/pull/2489 ---
[jira] [Resolved] (CARBONDATA-2528) MV Datamap - When the MV is created with the order by, then when we execute the corresponding query defined in MV with order by, then the data is not accessed from
[ https://issues.apache.org/jira/browse/CARBONDATA-2528?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jacky Li resolved CARBONDATA-2528.
----------------------------------
    Resolution: Fixed
    Assignee: Ravindra Pesala (was: xubo245)
    Fix Version/s: 1.4.1
                   1.5.0

> MV Datamap - When the MV is created with the order by, then when we execute
> the corresponding query defined in MV with order by, then the data is not
> accessed from the MV.
>
> Key: CARBONDATA-2528
> URL: https://issues.apache.org/jira/browse/CARBONDATA-2528
> Project: CarbonData
> Issue Type: Bug
> Components: data-query
> Environment: 3 node Opensource ANT cluster. (Opensource Hadoop 2.7.2 +
> Opensource Spark 2.2.1 + Opensource Carbondata 1.3.1)
> Reporter: Prasanna Ravichandran
> Assignee: Ravindra Pesala
> Priority: Minor
> Labels: CarbonData, MV, Materialistic_Views
> Fix For: 1.5.0, 1.4.1
>
> Attachments: MV_orderby.docx, data.csv
>
> Time Spent: 5h 50m
> Remaining Estimate: 0h
>
> When the MV is created with an order by condition and we execute the
> corresponding query defined in the MV along with order by, the data is not
> accessed from the MV; it is accessed from the main table only.
>
> Test queries:
> create datamap MV_order using 'mv' as select
> empno,sum(salary)+sum(utilization) as total from originTable group by empno
> order by empno;
> create datamap MV_desc_order using 'mv' as select
> empno,sum(salary)+sum(utilization) as total from originTable group by empno
> order by empno DESC;
> rebuild datamap MV_order;
> rebuild datamap MV_desc_order;
> explain select empno,sum(salary)+sum(utilization) as total from originTable
> group by empno order by empno;
> explain select empno,sum(salary)+sum(utilization) as total from originTable
> group by empno order by empno DESC;
>
> Expected result: MV with order by condition should access data from the MV
> table only.
>
> Please see the attached document for more details.

-- This message was sent by Atlassian JIRA (v7.6.3#76005)
[GitHub] carbondata issue #2453: [CARBONDATA-2528][MV] Fixed order by in mv and aggre...
Github user jackylk commented on the issue: https://github.com/apache/carbondata/pull/2453 LGTM ---