[jira] [Resolved] (HIVE-26773) Update Avro version to 1.10.2
[ https://issues.apache.org/jira/browse/HIVE-26773?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Zhihua Deng resolved HIVE-26773. Fix Version/s: 4.0.0 Resolution: Duplicate > Update Avro version to 1.10.2 > - > > Key: HIVE-26773 > URL: https://issues.apache.org/jira/browse/HIVE-26773 > Project: Hive > Issue Type: Improvement > Components: Avro >Reporter: Zhihua Deng >Priority: Major > Fix For: 4.0.0 > > > Update the Avro version to 1.10.2; there is a transitive dependency on Velocity. -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Resolved] (HIVE-26763) Upgrade Zookeeper to latest 3.6.3 and curator to 5.2.0
[ https://issues.apache.org/jira/browse/HIVE-26763?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Zhihua Deng resolved HIVE-26763. Fix Version/s: 4.0.0 Resolution: Fixed Fix has been merged to master. Thank you [~amanraj2520] for the PR and [~zabetak] for the review! > Upgrade Zookeeper to latest 3.6.3 and curator to 5.2.0 > -- > > Key: HIVE-26763 > URL: https://issues.apache.org/jira/browse/HIVE-26763 > Project: Hive > Issue Type: Improvement > Components: Hive >Reporter: Aman Raj >Assignee: Aman Raj >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > > Time Spent: 2h 40m > Remaining Estimate: 0h > > Upgrade ZooKeeper to the latest version (3.6.3) and Curator to the latest (5.2.0) -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Work logged] (HIVE-26763) Upgrade Zookeeper to latest 3.6.3 and curator to 5.2.0
[ https://issues.apache.org/jira/browse/HIVE-26763?focusedWorklogId=828553&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-828553 ] ASF GitHub Bot logged work on HIVE-26763: - Author: ASF GitHub Bot Created on: 24/Nov/22 03:05 Start Date: 24/Nov/22 03:05 Worklog Time Spent: 10m Work Description: dengzhhu653 merged PR #3787: URL: https://github.com/apache/hive/pull/3787 Issue Time Tracking --- Worklog Id: (was: 828553) Time Spent: 2h 40m (was: 2.5h) > Upgrade Zookeeper to latest 3.6.3 and curator to 5.2.0 > -- > > Key: HIVE-26763 > URL: https://issues.apache.org/jira/browse/HIVE-26763 > Project: Hive > Issue Type: Improvement > Components: Hive >Reporter: Aman Raj >Assignee: Aman Raj >Priority: Major > Labels: pull-request-available > Time Spent: 2h 40m > Remaining Estimate: 0h > > Upgrade Zookeeper to latest version 3.6.3 and curator to latest 5.2.0 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Work logged] (HIVE-25723) Found some typos
[ https://issues.apache.org/jira/browse/HIVE-25723?focusedWorklogId=828545&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-828545 ] ASF GitHub Bot logged work on HIVE-25723: - Author: ASF GitHub Bot Created on: 24/Nov/22 02:36 Start Date: 24/Nov/22 02:36 Worklog Time Spent: 10m Work Description: jsoref commented on PR #2800: URL: https://github.com/apache/hive/pull/2800#issuecomment-1325872682 Sigh Issue Time Tracking --- Worklog Id: (was: 828545) Time Spent: 2.5h (was: 2h 20m) > Found some typos > > > Key: HIVE-25723 > URL: https://issues.apache.org/jira/browse/HIVE-25723 > Project: Hive > Issue Type: Improvement >Affects Versions: All Versions >Reporter: Feng >Priority: Trivial > Labels: pull-request-available > Attachments: DateUtils typo.png, RELEASE_NOTES typo.png > > Original Estimate: 1h > Time Spent: 2.5h > Remaining Estimate: 0h > > I found some typos in DateUtils.java and > RELEASE_NOTES.txt -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Work logged] (HIVE-26767) Support for custom RDBMS is broken
[ https://issues.apache.org/jira/browse/HIVE-26767?focusedWorklogId=828538&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-828538 ] ASF GitHub Bot logged work on HIVE-26767: - Author: ASF GitHub Bot Created on: 24/Nov/22 01:45 Start Date: 24/Nov/22 01:45 Worklog Time Spent: 10m Work Description: sonarcloud[bot] commented on PR #3799: URL: https://github.com/apache/hive/pull/3799#issuecomment-1325843793 Kudos, SonarCloud Quality Gate passed! 0 Bugs, 0 Vulnerabilities, 0 Security Hotspots, 1 Code Smell; No Coverage information, No Duplication information. Issue Time Tracking --- Worklog Id: (was: 828538) Time Spent: 1h 10m (was: 1h) > Support for custom RDBMS is broken > -- > > Key: HIVE-26767 > URL: https://issues.apache.org/jira/browse/HIVE-26767 > Project: Hive > Issue Type: Bug > Components: Metastore >Affects Versions: 4.0.0 >Reporter: Tim Thorpe >Priority: Minor > Labels: pull-request-available > Time Spent: 1h 10m > Remaining Estimate: 0h > > HIVE-24120 introduced code to support custom RDBMS. > DatabaseProduct.getDbType(String productName) will return *DbType.UNDEFINED* > for anything other than the hardcoded/internally supported database types. > When initializing DatabaseProduct with an external/custom RDBMS, it follows > this logic:
>
> boolean isExternal = MetastoreConf.getBoolVar(conf, ConfVars.USE_CUSTOM_RDBMS);
> if (isExternal) {
>   // The DatabaseProduct will be created by instantiating an external class via
>   // reflection. The external class can override any method in the current class
>   String className = MetastoreConf.getVar(conf, ConfVars.CUSTOM_RDBMS_CLASSNAME);
>   if (className != null) {
>     try {
>       theDatabaseProduct = (DatabaseProduct) ReflectionUtils.newInstance(Class.forName(className), conf);
>       LOG.info(String.format("Using custom RDBMS %s", className));
>       dbt = DbType.CUSTOM;
>
[jira] [Work logged] (HIVE-26537) Deprecate older APIs in the HMS
[ https://issues.apache.org/jira/browse/HIVE-26537?focusedWorklogId=828533&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-828533 ] ASF GitHub Bot logged work on HIVE-26537: - Author: ASF GitHub Bot Created on: 24/Nov/22 00:42 Start Date: 24/Nov/22 00:42 Worklog Time Spent: 10m Work Description: saihemanth-cloudera commented on code in PR #3599: URL: https://github.com/apache/hive/pull/3599#discussion_r1030965278
## standalone-metastore/metastore-server/src/main/java/org/apache/hadoop/hive/metastore/HMSHandler.java: ##
@@ -1363,18 +1363,62 @@ private void create_database_core(RawStore ms, final Database db)
 }
 @Override
+ @Deprecated
 public void create_database(final Database db) throws AlreadyExistsException, InvalidObjectException, MetaException {
-startFunction("create_database", ": " + db.toString());
+CreateDatabaseRequest req = new CreateDatabaseRequest(db.getName());
+if (db.isSetDescription()) {
Review Comment: There is no builder class in the CreateDatabaseRequest thrift object. Issue Time Tracking --- Worklog Id: (was: 828533) Time Spent: 3h 40m (was: 3.5h) > Deprecate older APIs in the HMS > --- > > Key: HIVE-26537 > URL: https://issues.apache.org/jira/browse/HIVE-26537 > Project: Hive > Issue Type: Improvement >Affects Versions: 4.0.0-alpha-1, 4.0.0-alpha-2 >Reporter: Sai Hemanth Gantasala >Assignee: Sai Hemanth Gantasala >Priority: Major > Labels: pull-request-available > Time Spent: 3h 40m > Remaining Estimate: 0h > > This Jira is to track the clean-up (deprecate older APIs and point the HMS > client to the newer APIs) work in the Hive metastore server. > More details will be added here soon. -- This message was sent by Atlassian Jira (v8.20.10#820010)
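The pattern in the diff above — a deprecated single-object API that delegates to a new request-object API, copying only the fields that are set — can be sketched in isolation. The classes below are simplified stand-ins, not Hive's generated Thrift types:

```java
public class DeprecationDemo {
    // Simplified stand-ins for the Thrift objects in the diff; field and
    // constructor shapes here are illustrative, not Hive's exact generated API.
    static class Database {
        String name;
        String description;
        Database(String name, String description) { this.name = name; this.description = description; }
    }

    static class CreateDatabaseRequest {
        String name;
        String description;
        CreateDatabaseRequest(String name) { this.name = name; }
    }

    // New-style entry point: all creation goes through the request object.
    static String createDatabase(CreateDatabaseRequest req) {
        return "created " + req.name + (req.description != null ? " (" + req.description + ")" : "");
    }

    /** Old API kept for compatibility; it now only builds a request and delegates. */
    @Deprecated
    static String createDatabase(Database db) {
        CreateDatabaseRequest req = new CreateDatabaseRequest(db.name);
        if (db.description != null) {  // copy only the fields that are actually set
            req.description = db.description;
        }
        return createDatabase(req);
    }

    public static void main(String[] args) {
        System.out.println(createDatabase(new Database("db1", null)));
        System.out.println(createDatabase(new Database("db2", "test db")));
    }
}
```

Since the real CreateDatabaseRequest has no builder (per the review comment), the field-by-field `isSet` checks in the diff play the role the `if` above plays here.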
[jira] [Work logged] (HIVE-26467) SessionState should be accessible inside ThreadPool
[ https://issues.apache.org/jira/browse/HIVE-26467?focusedWorklogId=828531&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-828531 ] ASF GitHub Bot logged work on HIVE-26467: - Author: ASF GitHub Bot Created on: 24/Nov/22 00:23 Start Date: 24/Nov/22 00:23 Worklog Time Spent: 10m Work Description: github-actions[bot] closed pull request #3516: HIVE-26467: SessionState should be accessible inside ThreadPool URL: https://github.com/apache/hive/pull/3516 Issue Time Tracking --- Worklog Id: (was: 828531) Time Spent: 2h (was: 1h 50m) > SessionState should be accessible inside ThreadPool > --- > > Key: HIVE-26467 > URL: https://issues.apache.org/jira/browse/HIVE-26467 > Project: Hive > Issue Type: Improvement >Reporter: Syed Shameerur Rahman >Assignee: Syed Shameerur Rahman >Priority: Major > Labels: pull-request-available > Time Spent: 2h > Remaining Estimate: 0h > > Currently SessionState.get() returns null if it is called inside a > ThreadPool. If any custom third-party component leverages > SessionState.get() for operations such as getting the session state or > session config inside a thread pool, it will get null, since the session > state is thread-local > (https://github.com/apache/hive/blob/master/ql/src/java/org/apache/hadoop/hive/ql/session/SessionState.java#L622) > and ThreadLocal variables are not inherited by child threads / thread pools. > So one solution is to make the thread-local variable inheritable so that the > SessionState gets propagated to child threads. -- This message was sent by Atlassian Jira (v8.20.10#820010)
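The ThreadLocal behavior described in the issue above can be reproduced in a few lines. This is a generic sketch, not Hive's SessionState code:

```java
import java.util.concurrent.atomic.AtomicReference;

public class ThreadLocalDemo {
    // A plain ThreadLocal: its value is NOT visible in child threads.
    static final ThreadLocal<String> plain = new ThreadLocal<>();
    // An InheritableThreadLocal: child threads copy the parent's value on creation.
    static final InheritableThreadLocal<String> inheritable = new InheritableThreadLocal<>();

    // Spawn a child thread, read the given thread-local there, and return what it saw.
    static String readInChildThread(ThreadLocal<String> tl) {
        AtomicReference<String> seen = new AtomicReference<>();
        Thread child = new Thread(() -> seen.set(tl.get()));
        child.start();
        try {
            child.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return seen.get();
    }

    public static void main(String[] args) {
        plain.set("session");
        inheritable.set("session");
        // The plain value is null in the child; the inheritable one is "session".
        System.out.println("plain in child: " + readInChildThread(plain));
        System.out.println("inheritable in child: " + readInChildThread(inheritable));
    }
}
```

One caveat worth noting: an InheritableThreadLocal is copied only when the child thread is *created*, so a pool that recycles threads can still hand back a stale value — which is why propagating SessionState into thread pools needs care beyond simply switching the field type.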
[jira] [Work logged] (HIVE-25723) Found some typos
[ https://issues.apache.org/jira/browse/HIVE-25723?focusedWorklogId=828532&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-828532 ] ASF GitHub Bot logged work on HIVE-25723: - Author: ASF GitHub Bot Created on: 24/Nov/22 00:23 Start Date: 24/Nov/22 00:23 Worklog Time Spent: 10m Work Description: github-actions[bot] commented on PR #2800: URL: https://github.com/apache/hive/pull/2800#issuecomment-1325797268 This pull request has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Feel free to reach out on the d...@hive.apache.org list if the patch is in need of reviews. Issue Time Tracking --- Worklog Id: (was: 828532) Time Spent: 2h 20m (was: 2h 10m) > Found some typos > > > Key: HIVE-25723 > URL: https://issues.apache.org/jira/browse/HIVE-25723 > Project: Hive > Issue Type: Improvement >Affects Versions: All Versions >Reporter: Feng >Priority: Trivial > Labels: pull-request-available > Attachments: DateUtils typo.png, RELEASE_NOTES typo.png > > Original Estimate: 1h > Time Spent: 2h 20m > Remaining Estimate: 0h > > I found some typos in DateUtils.java and > RELEASE_NOTES.txt -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Work logged] (HIVE-26538) MetastoreDefaultTransformer should revise the location when it's empty
[ https://issues.apache.org/jira/browse/HIVE-26538?focusedWorklogId=828530&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-828530 ] ASF GitHub Bot logged work on HIVE-26538: - Author: ASF GitHub Bot Created on: 24/Nov/22 00:23 Start Date: 24/Nov/22 00:23 Worklog Time Spent: 10m Work Description: github-actions[bot] closed pull request #3600: HIVE-26538: MetastoreDefaultTransformer should revise the location wh… URL: https://github.com/apache/hive/pull/3600 Issue Time Tracking --- Worklog Id: (was: 828530) Time Spent: 50m (was: 40m) > MetastoreDefaultTransformer should revise the location when it's empty > -- > > Key: HIVE-26538 > URL: https://issues.apache.org/jira/browse/HIVE-26538 > Project: Hive > Issue Type: Improvement > Components: Standalone Metastore >Reporter: Zhihua Deng >Assignee: Zhihua Deng >Priority: Major > Labels: pull-request-available > Time Spent: 50m > Remaining Estimate: 0h > > The table's location is treated as null when it's empty; this happens in > several places, for example: > [https://github.com/apache/hive/blob/82f319773cb2361a98963e861fb903ab8eecd9c4/standalone-metastore/metastore-server/src/main/java/org/apache/hadoop/hive/metastore/HMSHandler.java#L2367] > [https://github.com/apache/hive/blob/master/standalone-metastore/metastore-server/src/main/java/org/apache/hadoop/hive/metastore/MetastoreDefaultTransformer.java#L729] > > MetastoreDefaultTransformer should revise the empty location when > altering/creating tables. -- This message was sent by Atlassian Jira (v8.20.10#820010)
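The revision proposed above — treating an empty location the same as a missing one — boils down to a normalization step like the following sketch. The helper name is illustrative, not Hive's actual method:

```java
public class LocationDemo {
    // Hypothetical helper: normalize an empty or blank table location to null,
    // so downstream code that already handles "location == null" also covers
    // the empty-string case the issue describes.
    static String reviseLocation(String location) {
        return (location == null || location.trim().isEmpty()) ? null : location;
    }

    public static void main(String[] args) {
        System.out.println(reviseLocation(""));            // prints: null
        System.out.println(reviseLocation("   "));         // prints: null
        System.out.println(reviseLocation("hdfs://wh/t")); // prints: hdfs://wh/t
    }
}
```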
[jira] [Work logged] (HIVE-26767) Support for custom RDBMS is broken
[ https://issues.apache.org/jira/browse/HIVE-26767?focusedWorklogId=828528&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-828528 ] ASF GitHub Bot logged work on HIVE-26767: - Author: ASF GitHub Bot Created on: 23/Nov/22 23:49 Start Date: 23/Nov/22 23:49 Worklog Time Spent: 10m Work Description: saihemanth-cloudera commented on PR #3799: URL: https://github.com/apache/hive/pull/3799#issuecomment-132549 @tthorpeIBM - This must be an intermittent failure. Just retriggered them again. Issue Time Tracking --- Worklog Id: (was: 828528) Time Spent: 1h (was: 50m) > Support for custom RDBMS is broken > -- > > Key: HIVE-26767 > URL: https://issues.apache.org/jira/browse/HIVE-26767 > Project: Hive > Issue Type: Bug > Components: Metastore >Affects Versions: 4.0.0 >Reporter: Tim Thorpe >Priority: Minor > Labels: pull-request-available > Time Spent: 1h > Remaining Estimate: 0h > > HIVE-24120 introduced code to support custom RDBMS. > DatabaseProduct.getDbType(String productName) will return *DbType.UNDEFINED* > for anything other than the hardcoded/internally supported database types. > When initializing DatabaseProduct with an external/custom RDBMS, it follows > this logic:
>
> boolean isExternal = MetastoreConf.getBoolVar(conf, ConfVars.USE_CUSTOM_RDBMS);
> if (isExternal) {
>   // The DatabaseProduct will be created by instantiating an external class via
>   // reflection. The external class can override any method in the current class
>   String className = MetastoreConf.getVar(conf, ConfVars.CUSTOM_RDBMS_CLASSNAME);
>   if (className != null) {
>     try {
>       theDatabaseProduct = (DatabaseProduct) ReflectionUtils.newInstance(Class.forName(className), conf);
>       LOG.info(String.format("Using custom RDBMS %s", className));
>       dbt = DbType.CUSTOM;
>
> These two database types (DbType.UNDEFINED, DbType.CUSTOM) are then compared to > each other to make sure they are the same.
>
> Preconditions.checkState(theDatabaseProduct.dbType == getDbType(productName));
>
> [https://github.com/gatorblue/hive/blob/3a65c6cf9cb552e7c34bfb449a419abfde0a58b6/standalone-metastore/metastore-server/src/main/java/org/apache/hadoop/hive/metastore/DatabaseProduct.java#L80]
> -- This message was sent by Atlassian Jira (v8.20.10#820010)
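The mismatch the bug report describes can be shown with a minimal model. The enum and lookup below are simplified stand-ins for Hive's DatabaseProduct internals, not its actual code:

```java
public class DbTypeCheckDemo {
    // Stand-in for DatabaseProduct.DbType; only the values relevant here.
    enum DbType { MYSQL, POSTGRES, CUSTOM, UNDEFINED }

    // Mirrors the reported behavior of getDbType(productName):
    // any product name that is not hardcoded falls through to UNDEFINED.
    static DbType getDbType(String productName) {
        switch (productName.toLowerCase()) {
            case "mysql":      return DbType.MYSQL;
            case "postgresql": return DbType.POSTGRES;
            default:           return DbType.UNDEFINED;
        }
    }

    public static void main(String[] args) {
        // With a custom RDBMS the instantiated product carries DbType.CUSTOM...
        DbType dbt = DbType.CUSTOM;
        // ...but the lookup for the unrecognized product name yields UNDEFINED,
        // so a check of the form checkState(dbt == getDbType(name)) must fail.
        System.out.println(dbt == getDbType("my-custom-db")); // prints: false
    }
}
```

This is why the Preconditions.checkState in the report throws for every custom RDBMS: one side of the comparison is always CUSTOM and the other always UNDEFINED.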
[jira] [Work logged] (HIVE-26455) Remove PowerMockito from hive-exec
[ https://issues.apache.org/jira/browse/HIVE-26455?focusedWorklogId=828503&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-828503 ] ASF GitHub Bot logged work on HIVE-26455: - Author: ASF GitHub Bot Created on: 23/Nov/22 21:05 Start Date: 23/Nov/22 21:05 Worklog Time Spent: 10m Work Description: sonarcloud[bot] commented on PR #3798: URL: https://github.com/apache/hive/pull/3798#issuecomment-1325656320 Kudos, SonarCloud Quality Gate passed! 0 Bugs, 0 Vulnerabilities, 0 Security Hotspots, 9 Code Smells; No Coverage information, No Duplication information. Issue Time Tracking --- Worklog Id: (was: 828503) Time Spent: 40m (was: 0.5h) > Remove PowerMockito from hive-exec > -- > > Key: HIVE-26455 > URL: https://issues.apache.org/jira/browse/HIVE-26455 > Project: Hive > Issue Type: Improvement > Components: Hive >Reporter: Zsolt Miskolczi >Assignee: Zsolt Miskolczi >Priority: Minor > Labels: pull-request-available > Time Spent: 40m > Remaining Estimate: 0h > > PowerMockito is a Mockito extension that introduces some pain points. > The main intention behind it is to enable static mocking. Since its release, > mockito-inline has been released as part of mockito-core. > It doesn't require the vintage test runner and it can mock objects on their > own threads. > The goal is to stop using PowerMockito and use mockito-inline instead. > > The affected packages are: > * org.apache.hadoop.hive.ql.exec.repl > * org.apache.hadoop.hive.ql.exec.repl.bootstrap.load > * org.apache.hadoop.hive.ql.exec.repl.ranger > * org.apache.hadoop.hive.ql.exec.util > * org.apache.hadoop.hive.ql.parse.repl > * org.apache.hadoop.hive.ql.parse.repl.load.message > * org.apache.hadoop.hive.ql.parse.repl.metric > * org.apache.hadoop.hive.ql.txn.compactor -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Work logged] (HIVE-26770) Make "end of loop" compaction logs appear more selectively
[ https://issues.apache.org/jira/browse/HIVE-26770?focusedWorklogId=828457&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-828457 ] ASF GitHub Bot logged work on HIVE-26770: - Author: ASF GitHub Bot Created on: 23/Nov/22 18:19 Start Date: 23/Nov/22 18:19 Worklog Time Spent: 10m Work Description: sonarcloud[bot] commented on PR #3803: URL: https://github.com/apache/hive/pull/3803#issuecomment-1325482658 Kudos, SonarCloud Quality Gate passed! 0 Bugs, 0 Vulnerabilities, 0 Security Hotspots, 19 Code Smells; No Coverage information, No Duplication information. Issue Time Tracking --- Worklog Id: (was: 828457) Time Spent: 20m (was: 10m) > Make "end of loop" compaction logs appear more selectively > -- > > Key: HIVE-26770 > URL: https://issues.apache.org/jira/browse/HIVE-26770 > Project: Hive > Issue Type: Improvement >Reporter: Akshat Mathur >Assignee: Akshat Mathur >Priority: Major > Labels: pull-request-available > Time Spent: 20m > Remaining Estimate: 0h > > Currently the Initiator, Worker, and Cleaner threads log something like "finished > one loop" at INFO level. > This is useful for figuring out whether one of these threads is taking too long to > finish a loop, but expensive in general. > > Suggested Time: 20mins > The logging should be changed in the following way: > # If the loop finished within a predefined amount of time, the level should be DEBUG > and the message should look like: *Initiator loop took \{elapsedTime} seconds to > finish.* > # If the loop ran longer than this predefined amount, the level should be WARN and > the message should look like: *Possible Initiator slowdown, loop took > \{elapsedTime} seconds to finish.* -- This message was sent by Atlassian Jira (v8.20.10#820010)
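The two rules above amount to picking the log level from the elapsed time. A minimal sketch, with a made-up threshold constant standing in for whatever configuration the patch actually uses:

```java
public class LoopLogDemo {
    // Hypothetical threshold; the real value would come from configuration.
    static final long SLOW_LOOP_THRESHOLD_SECONDS = 60;

    // Build the message (prefixed with its level) that a compactor thread
    // would log at the end of one loop, per the two rules in the issue.
    static String loopMessage(String threadName, long elapsedSeconds) {
        if (elapsedSeconds > SLOW_LOOP_THRESHOLD_SECONDS) {
            return "WARN: Possible " + threadName + " slowdown, loop took "
                    + elapsedSeconds + " seconds to finish.";
        }
        return "DEBUG: " + threadName + " loop took "
                + elapsedSeconds + " seconds to finish.";
    }

    public static void main(String[] args) {
        System.out.println(loopMessage("Initiator", 5));   // fast loop -> DEBUG
        System.out.println(loopMessage("Initiator", 120)); // slow loop -> WARN
    }
}
```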
[jira] [Work logged] (HIVE-26764) Show compaction request should have all fields optional
[ https://issues.apache.org/jira/browse/HIVE-26764?focusedWorklogId=828455&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-828455 ] ASF GitHub Bot logged work on HIVE-26764: - Author: ASF GitHub Bot Created on: 23/Nov/22 18:08 Start Date: 23/Nov/22 18:08 Worklog Time Spent: 10m Work Description: sonarcloud[bot] commented on PR #3791: URL: https://github.com/apache/hive/pull/3791#issuecomment-1325469115 Kudos, SonarCloud Quality Gate passed! 0 Bugs, 0 Vulnerabilities, 0 Security Hotspots, 0 Code Smells; No Coverage information, No Duplication information. Issue Time Tracking --- Worklog Id: (was: 828455) Time Spent: 1h (was: 50m) > Show compaction request should have all fields optional > -- > > Key: HIVE-26764 > URL: https://issues.apache.org/jira/browse/HIVE-26764 > Project: Hive > Issue Type: Bug >Affects Versions: 4.0.0-alpha-2 >Reporter: KIRTI RUGE >Assignee: KIRTI RUGE >Priority: Major > Labels: pull-request-available > Time Spent: 1h > Remaining Estimate: 0h > -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Work logged] (HIVE-26509) Introduce dynamic leader election in HMS
[ https://issues.apache.org/jira/browse/HIVE-26509?focusedWorklogId=828450=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-828450 ] ASF GitHub Bot logged work on HIVE-26509: - Author: ASF GitHub Bot Created on: 23/Nov/22 17:44 Start Date: 23/Nov/22 17:44 Worklog Time Spent: 10m Work Description: sonarcloud[bot] commented on PR #3567: URL: https://github.com/apache/hive/pull/3567#issuecomment-1325443412 Kudos, SonarCloud Quality Gate passed! [![Quality Gate passed](https://sonarsource.github.io/sonarcloud-github-static-resources/v2/checks/QualityGateBadge/passed-16px.png 'Quality Gate passed')](https://sonarcloud.io/dashboard?id=apache_hive=3567) [![Bug](https://sonarsource.github.io/sonarcloud-github-static-resources/v2/common/bug-16px.png 'Bug')](https://sonarcloud.io/project/issues?id=apache_hive=3567=false=BUG) [![C](https://sonarsource.github.io/sonarcloud-github-static-resources/v2/checks/RatingBadge/C-16px.png 'C')](https://sonarcloud.io/project/issues?id=apache_hive=3567=false=BUG) [10 Bugs](https://sonarcloud.io/project/issues?id=apache_hive=3567=false=BUG) [![Vulnerability](https://sonarsource.github.io/sonarcloud-github-static-resources/v2/common/vulnerability-16px.png 'Vulnerability')](https://sonarcloud.io/project/issues?id=apache_hive=3567=false=VULNERABILITY) [![A](https://sonarsource.github.io/sonarcloud-github-static-resources/v2/checks/RatingBadge/A-16px.png 'A')](https://sonarcloud.io/project/issues?id=apache_hive=3567=false=VULNERABILITY) [0 Vulnerabilities](https://sonarcloud.io/project/issues?id=apache_hive=3567=false=VULNERABILITY) [![Security Hotspot](https://sonarsource.github.io/sonarcloud-github-static-resources/v2/common/security_hotspot-16px.png 'Security Hotspot')](https://sonarcloud.io/project/security_hotspots?id=apache_hive=3567=false=SECURITY_HOTSPOT) [![E](https://sonarsource.github.io/sonarcloud-github-static-resources/v2/checks/RatingBadge/E-16px.png 
'E')](https://sonarcloud.io/project/security_hotspots?id=apache_hive=3567=false=SECURITY_HOTSPOT) [1 Security Hotspot](https://sonarcloud.io/project/security_hotspots?id=apache_hive=3567=false=SECURITY_HOTSPOT) [![Code Smell](https://sonarsource.github.io/sonarcloud-github-static-resources/v2/common/code_smell-16px.png 'Code Smell')](https://sonarcloud.io/project/issues?id=apache_hive=3567=false=CODE_SMELL) [![A](https://sonarsource.github.io/sonarcloud-github-static-resources/v2/checks/RatingBadge/A-16px.png 'A')](https://sonarcloud.io/project/issues?id=apache_hive=3567=false=CODE_SMELL) [65 Code Smells](https://sonarcloud.io/project/issues?id=apache_hive=3567=false=CODE_SMELL) [![No Coverage information](https://sonarsource.github.io/sonarcloud-github-static-resources/v2/checks/CoverageChart/NoCoverageInfo-16px.png 'No Coverage information')](https://sonarcloud.io/component_measures?id=apache_hive=3567=coverage=list) No Coverage information [![No Duplication information](https://sonarsource.github.io/sonarcloud-github-static-resources/v2/checks/Duplications/NoDuplicationInfo-16px.png 'No Duplication information')](https://sonarcloud.io/component_measures?id=apache_hive=3567=duplicated_lines_density=list) No Duplication information Issue Time Tracking --- Worklog Id: (was: 828450) Time Spent: 6h 10m (was: 6h) > Introduce dynamic leader election in HMS > > > Key: HIVE-26509 > URL: https://issues.apache.org/jira/browse/HIVE-26509 > Project: Hive > Issue Type: New Feature > Components: Standalone Metastore >Reporter: Zhihua Deng >Assignee: Zhihua Deng >Priority: Major > Labels: pull-request-available > Time Spent: 6h 10m > Remaining Estimate: 0h > > From HIVE-21841 we have a leader HMS selected by configuring > metastore.housekeeping.leader.hostname on startup. This approach saves us > from running duplicated HMS's housekeeping tasks cluster-wide. > In this jira, we introduce another dynamic leader election: adopt hive lock > to implement the leader election. 
Once an HMS instance owns the lock, it becomes > the leader, carries out the housekeeping tasks, and sends heartbeats to renew > the lock before it times out. If the leader fails to renew the lock, it stops > any tasks it has already started, and the election event is audited. This gives us > a leader that fails over when the original goes down, works in the public > cloud without a well-configured hostname property, and reduces the leader’s burden by > spreading these tasks among different leaders. -- This message was sent by Atlassian Jira (v8.20.10#820010)
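The election scheme described above can be modelled as a simple lease: the holder renews before the timeout and loses leadership when renewal fails. The sketch below is an illustrative single-process model with hypothetical names, not the actual HMS implementation (the real lock is a distributed hive lock):

```java
// Minimal single-process model of lock-based leader election with heartbeat
// renewal (illustrative only; the real HMS uses a distributed hive lock).
public class LeaderLease {
    private final long timeoutMillis;
    private String holder = null;   // id of the HMS currently holding the lease
    private long expiresAt;         // lease deadline in millis

    public LeaderLease(long timeoutMillis) {
        this.timeoutMillis = timeoutMillis;
    }

    // Try to take the lock; succeeds if it is free or the previous lease expired.
    public synchronized boolean tryAcquire(String hms, long now) {
        if (holder == null || now >= expiresAt) {
            holder = hms;
            expiresAt = now + timeoutMillis;
            return true;
        }
        return hms.equals(holder); // already the leader
    }

    // Heartbeat: only the current holder may renew, and only before the timeout.
    // A false return means leadership is lost: stop the already started
    // housekeeping tasks and audit the election event.
    public synchronized boolean renew(String hms, long now) {
        if (hms.equals(holder) && now < expiresAt) {
            expiresAt = now + timeoutMillis;
            return true;
        }
        return false;
    }
}
```

When `renew` fails, another instance's next `tryAcquire` succeeds, which is the dynamic failover the jira describes.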
[jira] [Work logged] (HIVE-26770) Make "end of loop" compaction logs appear more selectively
[ https://issues.apache.org/jira/browse/HIVE-26770?focusedWorklogId=828443=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-828443 ] ASF GitHub Bot logged work on HIVE-26770: - Author: ASF GitHub Bot Created on: 23/Nov/22 17:22 Start Date: 23/Nov/22 17:22 Worklog Time Spent: 10m Work Description: akshat0395 opened a new pull request, #3803: URL: https://github.com/apache/hive/pull/3803 ### What changes were proposed in this pull request? Make "end of loop" compaction logs appear more selectively and move duplicate code from Compactor threads to base class, more details can be found in the following ticket [HIVE-26770](https://issues.apache.org/jira/browse/HIVE-26770) ### Why are the changes needed? Improved logging for Compactor threads to reduce noise and share time based stats ### Does this PR introduce _any_ user-facing change? No ### How was this patch tested? Unit tests Issue Time Tracking --- Worklog Id: (was: 828443) Remaining Estimate: 0h Time Spent: 10m > Make "end of loop" compaction logs appear more selectively > -- > > Key: HIVE-26770 > URL: https://issues.apache.org/jira/browse/HIVE-26770 > Project: Hive > Issue Type: Improvement >Reporter: Akshat Mathur >Assignee: Akshat Mathur >Priority: Major > Time Spent: 10m > Remaining Estimate: 0h > > Currently Initiator, Worker, and Cleaner threads log something like "finished > one loop" on INFO level. > This is useful to figure out if one of these threads is taking too long to > finish a loop, but expensive in general. 
> > Suggested Time: 20mins > Logging this should be changed in the following way: > # If the loop finished within a predefined amount of time, the level should be DEBUG > and the message should look like: *Initiator loop took \{elapsedTime} seconds to > finish.* > # If the loop ran longer than this predefined amount, the level should be WARN and > the message should look like: *Possible Initiator slowdown, loop took > \{elapsedTime} seconds to finish.* -- This message was sent by Atlassian Jira (v8.20.10#820010)
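The two-level rule above (DEBUG under a threshold, WARN above it) can be sketched as follows. The class name and the threshold value are illustrative assumptions, not Hive's actual Initiator/Worker/Cleaner code:

```java
// Sketch of threshold-based "end of loop" compaction logging
// (hypothetical names; not the actual Hive compactor classes).
public class LoopLogging {
    // Assumed threshold; in Hive this would come from configuration.
    static final long SLOW_LOOP_THRESHOLD_SECONDS = 60;

    // Returns the level-prefixed message that would be logged for one loop.
    public static String messageFor(String threadName, long elapsedSeconds) {
        if (elapsedSeconds <= SLOW_LOOP_THRESHOLD_SECONDS) {
            // Fast loop: log at DEBUG so routine iterations stay out of INFO logs.
            return "DEBUG: " + threadName + " loop took " + elapsedSeconds
                    + " seconds to finish.";
        }
        // Slow loop: escalate to WARN so slowdowns remain visible.
        return "WARN: Possible " + threadName + " slowdown, loop took "
                + elapsedSeconds + " seconds to finish.";
    }
}
```

Putting the check in a shared base class, as the PR description suggests, avoids duplicating it across the three compactor threads.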
[jira] [Updated] (HIVE-26770) Make "end of loop" compaction logs appear more selectively
[ https://issues.apache.org/jira/browse/HIVE-26770?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] ASF GitHub Bot updated HIVE-26770: -- Labels: pull-request-available (was: ) > Make "end of loop" compaction logs appear more selectively > -- > > Key: HIVE-26770 > URL: https://issues.apache.org/jira/browse/HIVE-26770 > Project: Hive > Issue Type: Improvement >Reporter: Akshat Mathur >Assignee: Akshat Mathur >Priority: Major > Labels: pull-request-available > Time Spent: 10m > Remaining Estimate: 0h > > Currently Initiator, Worker, and Cleaner threads log something like "finished > one loop" on INFO level. > This is useful to figure out if one of these threads is taking too long to > finish a loop, but expensive in general. > > Suggested Time: 20mins > Logging this should be changed in the following way: > # If the loop finished within a predefined amount of time, the level should be DEBUG > and the message should look like: *Initiator loop took \{elapsedTime} seconds to > finish.* > # If the loop ran longer than this predefined amount, the level should be WARN and > the message should look like: *Possible Initiator slowdown, loop took > \{elapsedTime} seconds to finish.* -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Work logged] (HIVE-26771) Use DDLTask to create Iceberg table when running ctas statement
[ https://issues.apache.org/jira/browse/HIVE-26771?focusedWorklogId=828437&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-828437 ] ASF GitHub Bot logged work on HIVE-26771: - Author: ASF GitHub Bot Created on: 23/Nov/22 16:23 Start Date: 23/Nov/22 16:23 Worklog Time Spent: 10m Work Description: sonarcloud[bot] commented on PR #3802: URL: https://github.com/apache/hive/pull/3802#issuecomment-1325336744 Kudos, SonarCloud Quality Gate passed! Bugs: 0 (rating A), Vulnerabilities: 0 (rating A), Security Hotspots: 0 (rating A), Code Smells: 1 (rating A); no coverage information, no duplication information. Issue Time Tracking --- Worklog Id: (was: 828437) Time Spent: 20m (was: 10m) > Use DDLTask to create Iceberg table when running ctas statement > > > Key: HIVE-26771 > URL: https://issues.apache.org/jira/browse/HIVE-26771 > Project: Hive > Issue Type: Improvement > Components: Iceberg integration >Reporter: Krisztian Kasa >Assignee: Krisztian Kasa >Priority: Major > Labels: pull-request-available > Time Spent: 20m > Remaining Estimate: 0h > > When an Iceberg table is created via a ctas statement, the table is created in > HiveIcebergSerDe and no DDL task is executed. > Negative effects of this workflow: > * Default privileges of the new table are not granted. 
> * The new Iceberg table can be seen by other transactions at compile time of > ctas. > * Table creation and table properties are not shown in explain ctas output. -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Work logged] (HIVE-26683) Sum over window produces 0 when row contains null
[ https://issues.apache.org/jira/browse/HIVE-26683?focusedWorklogId=828401&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-828401 ] ASF GitHub Bot logged work on HIVE-26683: - Author: ASF GitHub Bot Created on: 23/Nov/22 14:54 Start Date: 23/Nov/22 14:54 Worklog Time Spent: 10m Work Description: sonarcloud[bot] commented on PR #3800: URL: https://github.com/apache/hive/pull/3800#issuecomment-1325198683 Kudos, SonarCloud Quality Gate passed! Bugs: 0 (rating A), Vulnerabilities: 0 (rating A), Security Hotspots: 0 (rating A), Code Smells: 0 (rating A); no coverage information, no duplication information. Issue Time Tracking --- Worklog Id: (was: 828401) Time Spent: 1h 10m (was: 1h) > Sum over window produces 0 when row contains null > - > > Key: HIVE-26683 > URL: https://issues.apache.org/jira/browse/HIVE-26683 > Project: Hive > Issue Type: Bug > Components: HiveServer2 >Reporter: Steve Carlin >Assignee: Steve Carlin >Priority: Major > Labels: pull-request-available > Time Spent: 1h 10m > Remaining Estimate: 0h > > Ran the following sql: > > {code:java} > create table sum_window_test_small (id int, tinyint_col tinyint); > insert into sum_window_test_small values (5,5), (10, NULL), (11,1); > select id, > tinyint_col, > sum(tinyint_col) over (order by id nulls last rows between 1 following and 1 > following) > from sum_window_test_small 
order by id; > select id, > tinyint_col, > sum(tinyint_col) over (order by id nulls last rows between current row and 1 > following) > from sum_window_test_small order by id; > {code} > The result is > {code:java}
> +-----+--------------+---------------+
> | id  | tinyint_col  | sum_window_0  |
> +-----+--------------+---------------+
> | 5   | 5            | 0             |
> | 10  | NULL         | 1             |
> | 11  | 1            | NULL          |
> +-----+--------------+---------------+{code}
> The first row should have the sum as NULL > -- This message was sent by Atlassian Jira (v8.20.10#820010)
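The expected semantics can be modelled outside SQL: a sum over a frame whose only input is NULL must itself be NULL, never 0. A minimal sketch of that rule (illustrative, not Hive's vectorized windowing code):

```java
import java.util.List;

// Sum over a window frame where NULL inputs are skipped, but a frame with
// no non-NULL value yields NULL (SQL aggregate semantics), never 0.
public class FrameSum {
    public static Integer sumFrame(List<Integer> values, int from, int to) {
        Integer sum = null; // stays null until a non-null value is seen
        for (int i = Math.max(from, 0); i <= Math.min(to, values.size() - 1); i++) {
            Integer v = values.get(i);
            if (v != null) {
                sum = (sum == null) ? v : sum + v;
            }
        }
        return sum;
    }
}
```

For the table in the report (tinyint_col = 5, NULL, 1 ordered by id), the frame "1 following" for the first row contains only NULL, so the correct sum is NULL, which is exactly what the bug report says the first row should show.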
[jira] [Work logged] (HIVE-26455) Remove PowerMockito from hive-exec
[ https://issues.apache.org/jira/browse/HIVE-26455?focusedWorklogId=828398&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-828398 ] ASF GitHub Bot logged work on HIVE-26455: - Author: ASF GitHub Bot Created on: 23/Nov/22 14:49 Start Date: 23/Nov/22 14:49 Worklog Time Spent: 10m Work Description: sonarcloud[bot] commented on PR #3798: URL: https://github.com/apache/hive/pull/3798#issuecomment-1325190007 Kudos, SonarCloud Quality Gate passed! Bugs: 0 (rating A), Vulnerabilities: 0 (rating A), Security Hotspots: 0 (rating A), Code Smells: 9 (rating A); no coverage information, no duplication information. Issue Time Tracking --- Worklog Id: (was: 828398) Time Spent: 0.5h (was: 20m) > Remove PowerMockito from hive-exec > -- > > Key: HIVE-26455 > URL: https://issues.apache.org/jira/browse/HIVE-26455 > Project: Hive > Issue Type: Improvement > Components: Hive >Reporter: Zsolt Miskolczi >Assignee: Zsolt Miskolczi >Priority: Minor > Labels: pull-request-available > Time Spent: 0.5h > Remaining Estimate: 0h > > PowerMockito is a Mockito extension that introduces some pain points. > Its main purpose is to enable static mocking. Since then, mockito-inline > has been released as part of mockito-core. > It does not require the vintage test runner and it can mock > objects with their own thread. 
> The goal is to stop using PowerMockito and use mockito-inline instead. > > The affected packages are: > * org.apache.hadoop.hive.ql.exec.repl > * org.apache.hadoop.hive.ql.exec.repl.bootstrap.load > * org.apache.hadoop.hive.ql.exec.repl.ranger; > * org.apache.hadoop.hive.ql.exec.util > * org.apache.hadoop.hive.ql.parse.repl > * org.apache.hadoop.hive.ql.parse.repl.load.message > * org.apache.hadoop.hive.ql.parse.repl.metric > * org.apache.hadoop.hive.ql.txn.compactor > > -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Work logged] (HIVE-26766) ClassCastException on select query on an external table created using JDBCStorageHandler(Mysql) with a date or timestamp column
[ https://issues.apache.org/jira/browse/HIVE-26766?focusedWorklogId=828383&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-828383 ] ASF GitHub Bot logged work on HIVE-26766: - Author: ASF GitHub Bot Created on: 23/Nov/22 14:04 Start Date: 23/Nov/22 14:04 Worklog Time Spent: 10m Work Description: vikramahuja1001 closed pull request #3797: [HIVE-26766] Fix ClassCastException with date and timestamp column when querying on an external table created using JDBCStorageHandler URL: https://github.com/apache/hive/pull/3797 Issue Time Tracking --- Worklog Id: (was: 828383) Time Spent: 50m (was: 40m) > ClassCastException on select query on an external table created using > JDBCStorageHandler(Mysql) with a date or timestamp column > --- > > Key: HIVE-26766 > URL: https://issues.apache.org/jira/browse/HIVE-26766 > Project: Hive > Issue Type: Bug >Reporter: Vikram Ahuja >Assignee: Vikram Ahuja >Priority: Minor > Labels: pull-request-available > Time Spent: 50m > Remaining Estimate: 0h > > Select queries are failing on an external table created using > JDBCStorageHandler (using MySQL) if a date or timestamp column is present, with > the exception: java.lang.ClassCastException: java.sql.Timestamp cannot be > cast to org.apache.hadoop.hive.common.type.Timestamp > Works fine with other data types. -- This message was sent by Atlassian Jira (v8.20.10#820010)
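The underlying problem in this family of reports is that java.sql.Timestamp and Hive's own Timestamp class are unrelated types, so a direct cast throws ClassCastException; the fix merged in HIVE-22433 converts explicitly instead. The sketch below shows the conversion pattern using only stdlib types, with java.time.LocalDateTime standing in for org.apache.hadoop.hive.common.type.Timestamp (an assumption for illustration):

```java
import java.sql.Timestamp;
import java.time.LocalDateTime;

// java.sql.Timestamp cannot be *cast* to an unrelated timestamp class;
// it must be converted explicitly. LocalDateTime stands in here for
// org.apache.hadoop.hive.common.type.Timestamp.
public class TimestampConvert {
    public static LocalDateTime fromJdbc(Object value) {
        if (value instanceof Timestamp) {
            // Explicit conversion instead of a ClassCastException-prone cast.
            return ((Timestamp) value).toLocalDateTime();
        }
        throw new IllegalArgumentException("not a java.sql.Timestamp: " + value);
    }
}
```

The same pattern applies to java.sql.Date in the HIVE-22368 stack trace: convert field-by-field (or via an intermediate type) rather than casting.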
[jira] [Work logged] (HIVE-26766) ClassCastException on select query on an external table created using JDBCStorageHandler(Mysql) with a date or timestamp column
[ https://issues.apache.org/jira/browse/HIVE-26766?focusedWorklogId=828382=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-828382 ] ASF GitHub Bot logged work on HIVE-26766: - Author: ASF GitHub Bot Created on: 23/Nov/22 14:04 Start Date: 23/Nov/22 14:04 Worklog Time Spent: 10m Work Description: vikramahuja1001 commented on PR #3797: URL: https://github.com/apache/hive/pull/3797#issuecomment-1325118350 Closing this PR as this issue was fixed previously. Can be tracked here: https://issues.apache.org/jira/browse/HIVE-22433 Issue Time Tracking --- Worklog Id: (was: 828382) Time Spent: 40m (was: 0.5h) > ClassCastException on select query on an external table created using > JDBCStorageHandler(Mysql) with a date or timestamp column > --- > > Key: HIVE-26766 > URL: https://issues.apache.org/jira/browse/HIVE-26766 > Project: Hive > Issue Type: Bug >Reporter: Vikram Ahuja >Assignee: Vikram Ahuja >Priority: Minor > Labels: pull-request-available > Time Spent: 40m > Remaining Estimate: 0h > > Select queries are failing on an external table created using > JDBCStorageHandler(using Mysql) if a date or timestamp column is present with > the exception: java.lang.ClassCastException: java.sql.Timestamp cannot be > cast to org.apache.hadoop.hive.common.type.Timestamp > Works fine with other data type -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Resolved] (HIVE-26766) ClassCastException on select query on an external table created using JDBCStorageHandler(Mysql) with a date or timestamp column
[ https://issues.apache.org/jira/browse/HIVE-26766?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Vikram Ahuja resolved HIVE-26766. - Resolution: Duplicate > ClassCastException on select query on an external table created using > JDBCStorageHandler(Mysql) with a date or timestamp column > --- > > Key: HIVE-26766 > URL: https://issues.apache.org/jira/browse/HIVE-26766 > Project: Hive > Issue Type: Bug >Reporter: Vikram Ahuja >Assignee: Vikram Ahuja >Priority: Minor > Labels: pull-request-available > Time Spent: 0.5h > Remaining Estimate: 0h > > Select queries are failing on an external table created using > JDBCStorageHandler(using Mysql) if a date or timestamp column is present with > the exception: java.lang.ClassCastException: java.sql.Timestamp cannot be > cast to org.apache.hadoop.hive.common.type.Timestamp > Works fine with other data type -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Commented] (HIVE-22368) Hive JDBC Storage Handler: some mysql data type can not be cast to hive data type
[ https://issues.apache.org/jira/browse/HIVE-22368?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17637774#comment-17637774 ] Vikram Ahuja commented on HIVE-22368: - This issue was fixed in https://issues.apache.org/jira/browse/HIVE-22433 > Hive JDBC Storage Handler: some mysql data type can not be cast to hive data > type > - > > Key: HIVE-22368 > URL: https://issues.apache.org/jira/browse/HIVE-22368 > Project: Hive > Issue Type: Bug >Affects Versions: 3.1.0, 3.1.1, 4.0.0 >Reporter: zhangbutao >Assignee: zhangbutao >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0-alpha-2 > > Attachments: HIVE-22368.01.patch > > Time Spent: 40m > Remaining Estimate: 0h > > MySQL data types (date, timestamp, decimal) cannot be cast to the corresponding Hive data > types (date, timestamp, decimal). > Steps to reproduce (taking the date type as an example): > {code:java} > //MySQL table: > create table testdate(id date); > //Hive table: > CREATE EXTERNAL TABLE `hive_date`( > id date ) > ROW FORMAT SERDE > 'org.apache.hive.storage.jdbc.JdbcSerDe' > STORED BY > 'org.apache.hive.storage.jdbc.JdbcStorageHandler' > TBLPROPERTIES ( > > 'hive.sql.database.type'='MYSQL', > 'hive.sql.dbcp.password'='hive', > 'hive.sql.dbcp.username'='hive', > 'hive.sql.jdbc.driver'='com.mysql.jdbc.Driver', > 'hive.sql.jdbc.url'='jdbc:mysql://hadoop/test', > 'hive.sql.table'='testdate'); > //Hive query: > select * from hive_date; > Error: java.io.IOException: org.apache.hadoop.hive.ql.metadata.HiveException: > java.lang.ClassCastException: java.sql.Date cannot be cast to > org.apache.hadoop.hive.common.type.Date (state=,code=0) > //Error stack trace > Caused by: java.lang.ClassCastException: java.sql.Date cannot be cast to > org.apache.hadoop.hive.common.type.Date > at > org.apache.hadoop.hive.serde2.objectinspector.primitive.JavaDateObjectInspector.getPrimitiveJavaObject(JavaDateObjectInspector.java:41) > ~[hive-exec-3.1.0-bc3.0.1.jar:3.1.0-bc3.0.1] > at > 
org.apache.hadoop.hive.serde2.objectinspector.primitive.JavaDateObjectInspector.getPrimitiveJavaObject(JavaDateObjectInspector.java:27) > ~[hive-exec-3.1.0-bc3.0.1.jar:3.1.0-bc3.0.1] > at > org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorUtils.copyToStandardObject(ObjectInspectorUtils.java:422) > ~[hive-exec-3.1.0-bc3.0.1.jar:3.1.0-bc3.0.1] > at > org.apache.hadoop.hive.serde2.SerDeUtils.toThriftPayload(SerDeUtils.java:173) > ~[hive-exec-3.1.0-bc3.0.1.jar:3.1.0-bc3.0.1] > at > org.apache.hadoop.hive.serde2.thrift.ThriftFormatter.convert(ThriftFormatter.java:49) > ~[hive-exec-3.1.0-bc3.0.1.jar:3.1.0-bc3.0.1] > at > org.apache.hadoop.hive.ql.exec.ListSinkOperator.process(ListSinkOperator.java:94) > ~[hive-exec-3.1.0-bc3.0.1.jar:3.1.0-bc3.0.1] > at > org.apache.hadoop.hive.ql.exec.Operator.baseForward(Operator.java:995) > ~[hive-exec-3.1.0-bc3.0.1.jar:3.1.0-bc3.0.1] > at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:941) > ~[hive-exec-3.1.0-bc3.0.1.jar:3.1.0-bc3.0.1] > at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:928) > ~[hive-exec-3.1.0-bc3.0.1.jar:3.1.0-bc3.0.1] > at > org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:95) > ~[hive-exec-3.1.0-bc3.0.1.jar:3.1.0-bc3.0.1] > at > org.apache.hadoop.hive.ql.exec.Operator.baseForward(Operator.java:995) > ~[hive-exec-3.1.0-bc3.0.1.jar:3.1.0-bc3.0.1] > at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:941) > ~[hive-exec-3.1.0-bc3.0.1.jar:3.1.0-bc3.0.1] > at > org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:125) > ~[hive-exec-3.1.0-bc3.0.1.jar:3.1.0-bc3.0.1] > at > org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:519) > ~[hive-exec-3.1.0-bc3.0.1.jar:3.1.0-bc3.0.1] > at > org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:511) > ~[hive-exec-3.1.0-bc3.0.1.jar:3.1.0-bc3.0.1] > at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:146) > 
~[hive-exec-3.1.0-bc3.0.1.jar:3.1.0-bc3.0.1] > at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:2706) > ~[hive-exec-3.1.0-bc3.0.1.jar:3.1.0-bc3.0.1] > at > org.apache.hadoop.hive.ql.reexec.ReExecDriver.getResults(ReExecDriver.java:229) > ~[hive-exec-3.1.0-bc3.0.1.jar:3.1.0-bc3.0.1] > at >
[jira] [Commented] (HIVE-26766) ClassCastException on select query on an external table created using JDBCStorageHandler(Mysql) with a date or timestamp column
[ https://issues.apache.org/jira/browse/HIVE-26766?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17637775#comment-17637775 ] Vikram Ahuja commented on HIVE-26766: - This issue was fixed in : https://issues.apache.org/jira/browse/HIVE-22433 Closing this > ClassCastException on select query on an external table created using > JDBCStorageHandler(Mysql) with a date or timestamp column > --- > > Key: HIVE-26766 > URL: https://issues.apache.org/jira/browse/HIVE-26766 > Project: Hive > Issue Type: Bug >Reporter: Vikram Ahuja >Assignee: Vikram Ahuja >Priority: Minor > Labels: pull-request-available > Time Spent: 0.5h > Remaining Estimate: 0h > > Select queries are failing on an external table created using > JDBCStorageHandler(using Mysql) if a date or timestamp column is present with > the exception: java.lang.ClassCastException: java.sql.Timestamp cannot be > cast to org.apache.hadoop.hive.common.type.Timestamp > Works fine with other data type -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Resolved] (HIVE-26679) [Hive] Drops archive partitions error
[ https://issues.apache.org/jira/browse/HIVE-26679?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Zoltán Rátkai resolved HIVE-26679. -- Resolution: Fixed The issue is in TEZ: https://issues.apache.org/jira/browse/TEZ-4415?page=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel&focusedCommentId=17564569 > [Hive] Drops archive partitions error > - > > Key: HIVE-26679 > URL: https://issues.apache.org/jira/browse/HIVE-26679 > Project: Hive > Issue Type: Bug >Reporter: Wang Jiangkun >Assignee: Zoltán Rátkai >Priority: Blocker > Attachments: image-2022-11-10-16-58-45-250.png > > > After a partition of the table has been converted to an archived partition, dropping the > partition fails with an error. The issue occurs as follows: > {code:java} > set hive.archive.enabled=true; > alter table tb1 archive partition(city="nanjing");{code} > {code:java} > 2022-10-27 16:55:41,872 ERROR hive.ql.exec.DDLTask: > org.apache.hadoop.hive.ql.metadata.HiveException: Got exception: > java.io.IOException Invalid path for the Har Filesystem. 
No index file in > har://hdfs-nameservice/hive/tb1/city=nanjing/data.har > at > org.apache.hadoop.hive.ql.metadata.Hive.dropPartitions(Hive.java:2364) > at > org.apache.hadoop.hive.ql.metadata.Hive.dropPartitions(Hive.java:2345) > at > org.apache.hadoop.hive.ql.exec.DDLTask.dropPartitions(DDLTask.java:3900) > at > org.apache.hadoop.hive.ql.exec.DDLTask.dropTableOrPartitions(DDLTask.java:3860) > at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:368) > at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:197) > at > org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100) > at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2130) > at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1801) > at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1501) > at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1206) > at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1194) > at > org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233) > at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184) > at > org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403) > at > org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821) > at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759) > at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686) > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) > at > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > at java.lang.reflect.Method.invoke(Method.java:498) > at org.apache.hadoop.util.RunJar.run(RunJar.java:221) > at org.apache.hadoop.util.RunJar.main(RunJar.java:136) > Caused by: MetaException(message:Got exception: java.io.IOException Invalid > path for the Har Filesystem. 
No index file in > har://hdfs-nameservice/hive/tb1/city=nanjing/data.har) > at > org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$drop_partitions_req_result$drop_partitions_req_resultStandardScheme.read(ThriftHiveMetastore.java) > at > org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$drop_partitions_req_result$drop_partitions_req_resultStandardScheme.read(ThriftHiveMetastore.java) > at > org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$drop_partitions_req_result.read(ThriftHiveMetastore.java) > at > org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:86) > at > org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_drop_partitions_req(ThriftHiveMetastore.java:2081) > at > org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.drop_partitions_req(ThriftHiveMetastore.java:2068) > at > org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropPartitions(HiveMetaStoreClient.java:1008) > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) > at > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > at java.lang.reflect.Method.invoke(Method.java:498) > at > org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:173) > at com.sun.proxy.$Proxy29.dropPartitions(Unknown Source) > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at >
[jira] [Updated] (HIVE-26771) Use DDLTask to create Iceberg table when running ctas statement
[ https://issues.apache.org/jira/browse/HIVE-26771?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] ASF GitHub Bot updated HIVE-26771: -- Labels: pull-request-available (was: ) > Use DDLTask to created Iceberg table when running ctas statement > > > Key: HIVE-26771 > URL: https://issues.apache.org/jira/browse/HIVE-26771 > Project: Hive > Issue Type: Improvement > Components: Iceberg integration >Reporter: Krisztian Kasa >Assignee: Krisztian Kasa >Priority: Major > Labels: pull-request-available > Time Spent: 10m > Remaining Estimate: 0h > > When Iceberg table is created via ctas statement the table is created in > HiveIcebergSerDe and no DDL task is executed. > Negative effects of this workflow: > * Default privileges of the new table are not granted. > * The new Iceberg table can be seen by other transactions at compile time of > ctas. > * Table creation and table properties are not shown in explain ctas output. -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Work logged] (HIVE-26771) Use DDLTask to create Iceberg table when running ctas statement
[ https://issues.apache.org/jira/browse/HIVE-26771?focusedWorklogId=828380=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-828380 ] ASF GitHub Bot logged work on HIVE-26771: - Author: ASF GitHub Bot Created on: 23/Nov/22 13:57 Start Date: 23/Nov/22 13:57 Worklog Time Spent: 10m Work Description: kasakrisz opened a new pull request, #3802: URL: https://github.com/apache/hive/pull/3802 ### What changes were proposed in this pull request? Refactor the way ctas is executed: * Do not create the table in `HiveIcebergSerDe` since it is also created at compile time. * Add a DDLTask before the `TezTask` to create the Iceberg table. * Collect the properties added to jobconfig from the Serde object and location and fileio from HiveCatalog. Location can be calculated at compile time using `SemanticAnalyzer.getCtasOrCMVLocation` * Persist the new table metaobject to a temp file when committing the table creation. * Read back the table metaobject anytime it is required from the TezTask and the MoveTask. ### Why are the changes needed? See jira. ### Does this PR introduce _any_ user-facing change? No ### How was this patch tested? ``` mvn test -Dtest.output.overwrite -Dtest=TestIcebergCliDriver -Dqfile=ctas_iceberg_partitioned_orc.q -pl itests/qtest-iceberg -Piceberg -Pitests -Drat.skip ``` Issue Time Tracking --- Worklog Id: (was: 828380) Remaining Estimate: 0h Time Spent: 10m > Use DDLTask to created Iceberg table when running ctas statement > > > Key: HIVE-26771 > URL: https://issues.apache.org/jira/browse/HIVE-26771 > Project: Hive > Issue Type: Improvement > Components: Iceberg integration >Reporter: Krisztian Kasa >Assignee: Krisztian Kasa >Priority: Major > Time Spent: 10m > Remaining Estimate: 0h > > When Iceberg table is created via ctas statement the table is created in > HiveIcebergSerDe and no DDL task is executed. > Negative effects of this workflow: > * Default privileges of the new table are not granted. 
> * The new Iceberg table can be seen by other transactions at compile time of > ctas. > * Table creation and table properties are not shown in explain ctas output. -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Updated] (HIVE-20554) Unable to drop an external table after renaming it.
[ https://issues.apache.org/jira/browse/HIVE-20554?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] paco87 updated HIVE-20554: -- Affects Version/s: 3.1.3 > Unable to drop an external table after renaming it. > --- > > Key: HIVE-20554 > URL: https://issues.apache.org/jira/browse/HIVE-20554 > Project: Hive > Issue Type: Bug > Components: Hive, HiveServer2 >Affects Versions: 1.2.0, 2.1.0, 3.1.3 >Reporter: Krishnama Raju K >Priority: Major > > Unable to drop an external partitioned table after renaming it. Getting the > following exception, > > {noformat} > java.sql.BatchUpdateException: Cannot delete or update a parent row: a > foreign key constraint fails ("hive"."PART_COL_STATS", CONSTRAINT > "PART_COL_STATS_FK" FOREIGN KEY ("PART_ID") REFERENCES "PARTITIONS" > ("PART_ID")) > at > com.mysql.jdbc.PreparedStatement.executeBatchSerially(PreparedStatement.java:2024) > > at com.mysql.jdbc.PreparedStatement.executeBatch(PreparedStatement.java:1449) > at com.jolbox.bonecp.StatementHandle.executeBatch(StatementHandle.java:424) > at > org.datanucleus.store.rdbms.ParamLoggingPreparedStatement.executeBatch(ParamLoggingPreparedStatement.java:366) > > at > org.datanucleus.store.rdbms.SQLController.processConnectionStatement(SQLController.java:676) > > at > org.datanucleus.store.rdbms.SQLController.getStatementForUpdate(SQLController.java:204) > > at > org.datanucleus.store.rdbms.SQLController.getStatementForUpdate(SQLController.java:176) > > at > org.datanucleus.store.rdbms.scostore.JoinMapStore.clearInternal(JoinMapStore.java:900) > > at > org.datanucleus.store.rdbms.scostore.JoinMapStore.clear(JoinMapStore.java:449) > > at org.datanucleus.store.types.wrappers.backed.Map.clear(Map.java:605) > at > org.datanucleus.store.rdbms.mapping.java.MapMapping.preDelete(MapMapping.java:252) > > at > org.datanucleus.store.rdbms.request.DeleteRequest.execute(DeleteRequest.java:193) > > at > 
org.datanucleus.store.rdbms.RDBMSPersistenceHandler.deleteObjectFromTable(RDBMSPersistenceHandler.java:499) > > at > org.datanucleus.store.rdbms.RDBMSPersistenceHandler.deleteObject(RDBMSPersistenceHandler.java:470) > > at > org.datanucleus.state.AbstractStateManager.internalDeletePersistent(AbstractStateManager.java:832) > > at > org.datanucleus.state.StateManagerImpl.deletePersistent(StateManagerImpl.java:4244) > > at > org.datanucleus.ExecutionContextImpl.deleteObjectInternal(ExecutionContextImpl.java:2395) > > at > org.datanucleus.ExecutionContextImpl.deleteObjectWork(ExecutionContextImpl.java:2317) > > at > org.datanucleus.ExecutionContextImpl.deleteObjects(ExecutionContextImpl.java:2209) > > at > org.datanucleus.ExecutionContextThreadedImpl.deleteObjects(ExecutionContextThreadedImpl.java:259) > > at > org.datanucleus.store.query.Query.performDeletePersistentAll(Query.java:2133) > at > org.datanucleus.store.query.AbstractJavaQuery.performDeletePersistentAll(AbstractJavaQuery.java:114) > > at org.datanucleus.store.query.Query.deletePersistentAll(Query.java:2085) > at > org.datanucleus.api.jdo.JDOQuery.deletePersistentInternal(JDOQuery.java:441) > at org.datanucleus.api.jdo.JDOQuery.deletePersistentAll(JDOQuery.java:428) > at > org.apache.hadoop.hive.metastore.ObjectStore.dropPartitionsNoTxn(ObjectStore.java:2421) > > at > org.apache.hadoop.hive.metastore.ObjectStore.dropPartitions(ObjectStore.java:1805) > > at sun.reflect.GeneratedMethodAccessor78.invoke(Unknown Source) > at > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > > at java.lang.reflect.Method.invoke(Method.java:498) > at > org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:103) > at com.sun.proxy.$Proxy10.dropPartitions(Unknown Source) > at > org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.dropPartitionsAndGetLocations(HiveMetaStore.java:1838) > > at > 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table_core(HiveMetaStore.java:1673) > > at > org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table_with_environment_context(HiveMetaStore.java:1859) > > at sun.reflect.GeneratedMethodAccessor110.invoke(Unknown Source) > at > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > > at java.lang.reflect.Method.invoke(Method.java:498) > at > org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:147) > > at > org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105) > > at com.sun.proxy.$Proxy12.drop_table_with_environment_context(Unknown Source) > at >
[jira] [Work started] (HIVE-26679) [Hive] Drops archive partitions error
[ https://issues.apache.org/jira/browse/HIVE-26679?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Work on HIVE-26679 started by Zoltán Rátkai. > [Hive] Drops archive partitions error > - > > Key: HIVE-26679 > URL: https://issues.apache.org/jira/browse/HIVE-26679 > Project: Hive > Issue Type: Bug >Reporter: Wang Jiangkun >Assignee: Zoltán Rátkai >Priority: Blocker > Attachments: image-2022-11-10-16-58-45-250.png > > > When a table partition is converted to an archived partition, dropping the > partition fails with an error. It can be reproduced as follows: > {code:java} > set hive.archive.enabled=true; > alter table tb1 archive partition(city="nanjing");{code} > {code:java} > 2022-10-27 16:55:41,872 ERROR hive.ql.exec.DDLTask: > org.apache.hadoop.hive.ql.metadata.HiveException: Got exception: > java.io.IOException Invalid path for the Har Filesystem. No index file in > har://hdfs-nameservice/hive/tb1/city=nanjing/data.har > at > org.apache.hadoop.hive.ql.metadata.Hive.dropPartitions(Hive.java:2364) > at > org.apache.hadoop.hive.ql.metadata.Hive.dropPartitions(Hive.java:2345) > at > org.apache.hadoop.hive.ql.exec.DDLTask.dropPartitions(DDLTask.java:3900) > at > org.apache.hadoop.hive.ql.exec.DDLTask.dropTableOrPartitions(DDLTask.java:3860) > at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:368) > at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:197) > at > org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100) > at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2130) > at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1801) > at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1501) > at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1206) > at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1194) > at > org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233) > at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184) > at >
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403) > at > org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821) > at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759) > at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686) > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) > at > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > at java.lang.reflect.Method.invoke(Method.java:498) > at org.apache.hadoop.util.RunJar.run(RunJar.java:221) > at org.apache.hadoop.util.RunJar.main(RunJar.java:136) > Caused by: MetaException(message:Got exception: java.io.IOException Invalid > path for the Har Filesystem. No index file in > har://hdfs-nameservice/hive/tb1/city=nanjing/data.har) > at > org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$drop_partitions_req_result$drop_partitions_req_resultStandardScheme.read(ThriftHiveMetastore.java) > at > org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$drop_partitions_req_result$drop_partitions_req_resultStandardScheme.read(ThriftHiveMetastore.java) > at > org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$drop_partitions_req_result.read(ThriftHiveMetastore.java) > at > org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:86) > at > org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_drop_partitions_req(ThriftHiveMetastore.java:2081) > at > org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.drop_partitions_req(ThriftHiveMetastore.java:2068) > at > org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropPartitions(HiveMetaStoreClient.java:1008) > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) > at > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > at 
java.lang.reflect.Method.invoke(Method.java:498) > at > org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:173) > at com.sun.proxy.$Proxy29.dropPartitions(Unknown Source) > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) > at > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > at
[jira] [Commented] (HIVE-26679) [Hive] Drops archive partitions error
[ https://issues.apache.org/jira/browse/HIVE-26679?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17637749#comment-17637749 ] Zoltán Rátkai commented on HIVE-26679: -- I think the root cause of the issue is in Tez. There is an open ticket on this topic in Tez: https://issues.apache.org/jira/browse/TEZ-4415?page=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel=17564569 > [Hive] Drops archive partitions error > - > > Key: HIVE-26679 > URL: https://issues.apache.org/jira/browse/HIVE-26679 > Project: Hive > Issue Type: Bug >Reporter: Wang Jiangkun >Assignee: Zoltán Rátkai >Priority: Blocker > Attachments: image-2022-11-10-16-58-45-250.png > > > When a table partition is converted to an archived partition, dropping the > partition fails with an error. It can be reproduced as follows: > {code:java} > set hive.archive.enabled=true; > alter table tb1 archive partition(city="nanjing");{code} > {code:java} > 2022-10-27 16:55:41,872 ERROR hive.ql.exec.DDLTask: > org.apache.hadoop.hive.ql.metadata.HiveException: Got exception: > java.io.IOException Invalid path for the Har Filesystem. 
No index file in > har://hdfs-nameservice/hive/tb1/city=nanjing/data.har > at > org.apache.hadoop.hive.ql.metadata.Hive.dropPartitions(Hive.java:2364) > at > org.apache.hadoop.hive.ql.metadata.Hive.dropPartitions(Hive.java:2345) > at > org.apache.hadoop.hive.ql.exec.DDLTask.dropPartitions(DDLTask.java:3900) > at > org.apache.hadoop.hive.ql.exec.DDLTask.dropTableOrPartitions(DDLTask.java:3860) > at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:368) > at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:197) > at > org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100) > at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2130) > at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1801) > at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1501) > at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1206) > at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1194) > at > org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233) > at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184) > at > org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403) > at > org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821) > at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759) > at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686) > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) > at > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > at java.lang.reflect.Method.invoke(Method.java:498) > at org.apache.hadoop.util.RunJar.run(RunJar.java:221) > at org.apache.hadoop.util.RunJar.main(RunJar.java:136) > Caused by: MetaException(message:Got exception: java.io.IOException Invalid > path for the Har Filesystem. 
No index file in > har://hdfs-nameservice/hive/tb1/city=nanjing/data.har) > at > org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$drop_partitions_req_result$drop_partitions_req_resultStandardScheme.read(ThriftHiveMetastore.java) > at > org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$drop_partitions_req_result$drop_partitions_req_resultStandardScheme.read(ThriftHiveMetastore.java) > at > org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$drop_partitions_req_result.read(ThriftHiveMetastore.java) > at > org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:86) > at > org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_drop_partitions_req(ThriftHiveMetastore.java:2081) > at > org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.drop_partitions_req(ThriftHiveMetastore.java:2068) > at > org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropPartitions(HiveMetaStoreClient.java:1008) > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) > at > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > at java.lang.reflect.Method.invoke(Method.java:498) > at > org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:173) > at com.sun.proxy.$Proxy29.dropPartitions(Unknown Source) > at
[jira] [Work logged] (HIVE-26756) Iceberg: Fetch format version from metadata file to avoid conflicts with spark
[ https://issues.apache.org/jira/browse/HIVE-26756?focusedWorklogId=828366=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-828366 ] ASF GitHub Bot logged work on HIVE-26756: - Author: ASF GitHub Bot Created on: 23/Nov/22 13:24 Start Date: 23/Nov/22 13:24 Worklog Time Spent: 10m Work Description: sonarcloud[bot] commented on PR #3778: URL: https://github.com/apache/hive/pull/3778#issuecomment-1325065776 Kudos, SonarCloud Quality Gate passed! 0 Bugs, 0 Vulnerabilities, 0 Security Hotspots, 0 Code Smells; no coverage or duplication information. Issue Time Tracking --- Worklog Id: (was: 828366) Time Spent: 1.5h (was: 1h 20m) > Iceberg: Fetch format version from metadata file to avoid conflicts with spark > -- > > Key: HIVE-26756 > URL: https://issues.apache.org/jira/browse/HIVE-26756 > Project: Hive > Issue Type: Bug >Reporter: Ayush Saxena >Assignee: Ayush Saxena >Priority: Major > Labels: pull-request-available > Time Spent: 1.5h > Remaining Estimate: 0h > > Spark and other engines don't set the format version for an Iceberg table in the > HMS properties, which leads to misinterpretation of the Iceberg format and to > wrong query results. > Propose to always extract the format version from the metadata file rather > than relying on the HMS properties. 
-- This message was sent by Atlassian Jira (v8.20.10#820010)
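The resolution rule proposed in HIVE-26756 above can be sketched in a few lines. This is an illustrative model, not Hive's actual implementation: the class and method names are hypothetical, and the `format-version` key and the v1 default are assumptions for the sketch. The idea is simply that a version read from the Iceberg metadata file always wins over the HMS table property, which other engines may never have set.

```java
import java.util.Map;
import java.util.Optional;

// Hypothetical helper modeling the HIVE-26756 proposal: prefer the format
// version recorded in the Iceberg metadata file; only fall back to the HMS
// table property (defaulting to v1) when no metadata value is available.
public class IcebergFormatVersion {

    public static int resolve(Optional<Integer> versionFromMetadataFile,
                              Map<String, String> hmsTableProperties) {
        // Metadata file wins: engines like Spark may never set the HMS property.
        if (versionFromMetadataFile.isPresent()) {
            return versionFromMetadataFile.get();
        }
        // Fall back to the table property, defaulting to format v1 when absent.
        return Integer.parseInt(hmsTableProperties.getOrDefault("format-version", "1"));
    }
}
```

With this rule, a v2 table written by Spark is read as v2 even when the HMS property is stale or missing, which is the misinterpretation the ticket describes.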
[jira] [Assigned] (HIVE-26771) Use DDLTask to create Iceberg table when running ctas statement
[ https://issues.apache.org/jira/browse/HIVE-26771?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Krisztian Kasa reassigned HIVE-26771: - > Use DDLTask to created Iceberg table when running ctas statement > > > Key: HIVE-26771 > URL: https://issues.apache.org/jira/browse/HIVE-26771 > Project: Hive > Issue Type: Improvement > Components: Iceberg integration >Reporter: Krisztian Kasa >Assignee: Krisztian Kasa >Priority: Major > > When Iceberg table is created via ctas statement the table is created in > HiveIcebergSerDe and no DDL task is executed. > Negative effects of this workflow: > * Default privileges of the new table are not granted. > * The new Iceberg table can be seen by other transactions at compile time of > ctas. > * Table creation and table properties are not shown in explain ctas output. -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Assigned] (HIVE-26770) Make "end of loop" compaction logs appear more selectively
[ https://issues.apache.org/jira/browse/HIVE-26770?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Akshat Mathur reassigned HIVE-26770: > Make "end of loop" compaction logs appear more selectively > -- > > Key: HIVE-26770 > URL: https://issues.apache.org/jira/browse/HIVE-26770 > Project: Hive > Issue Type: Improvement >Reporter: Akshat Mathur >Assignee: Akshat Mathur >Priority: Major > > Currently the Initiator, Worker, and Cleaner threads log something like "finished > one loop" at INFO level. > This is useful for figuring out whether one of these threads is taking too long to > finish a loop, but expensive in general. > > Suggested Time: 20mins > Logging this should be changed in the following way: > # If the loop finished within a predefined amount of time, the level should be DEBUG > and the message should look like: *Initiator loop took \{elapsedTime} seconds to > finish.* > # If the loop ran longer than this predefined amount, the level should be WARN and > the message should look like: *Possible Initiator slowdown, loop took > \{elapsedTime} seconds to finish.* -- This message was sent by Atlassian Jira (v8.20.10#820010)
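The logging change suggested in HIVE-26770 above can be sketched as follows. This is a minimal model, not the actual patch: the class name, method names, and threshold parameter are hypothetical. It only shows the split between DEBUG below a predefined threshold and WARN above it, together with the two suggested message formats.

```java
// Hypothetical sketch of the proposed "end of loop" logging policy:
// DEBUG when the loop finishes within the threshold, WARN when it runs longer.
public class CompactionLoopLogging {

    // Log level the proposal would use for a loop that took elapsedSeconds.
    public static String levelFor(long elapsedSeconds, long thresholdSeconds) {
        return elapsedSeconds <= thresholdSeconds ? "DEBUG" : "WARN";
    }

    // Suggested message text for the Initiator thread (Worker and Cleaner
    // would use the same shape with their own thread name).
    public static String messageFor(long elapsedSeconds, long thresholdSeconds) {
        if (elapsedSeconds <= thresholdSeconds) {
            return String.format("Initiator loop took %d seconds to finish.", elapsedSeconds);
        }
        return String.format("Possible Initiator slowdown, loop took %d seconds to finish.", elapsedSeconds);
    }
}
```

A slow loop then surfaces at WARN without paying for an INFO line on every normal iteration.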
[jira] [Work logged] (HIVE-26767) Support for custom RDBMS is broken
[ https://issues.apache.org/jira/browse/HIVE-26767?focusedWorklogId=828357=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-828357 ] ASF GitHub Bot logged work on HIVE-26767: - Author: ASF GitHub Bot Created on: 23/Nov/22 12:48 Start Date: 23/Nov/22 12:48 Worklog Time Spent: 10m Work Description: tthorpeIBM commented on PR #3799: URL: https://github.com/apache/hive/pull/3799#issuecomment-1325007263 The error in testing doesn't look to be anything related to my changes. The error is from post processing on split-16: #!/bin/bash -e # removes all stdout and err for passed tests xmlstarlet ed -L -d 'testsuite/testcase/system-out[count(../failure)=0]' -d 'testsuite/testcase/system-err[count(../failure)=0]' `find . -name 'TEST*xml' -path '*/surefire-reports/*'` # remove all output.txt files find . -name '*output.txt' -path '*/surefire-reports/*' -exec unlink "{}" \; — Shell Script 2s #!/bin/bash -e tar -czf split-16.tgz --files-from <(find . -path '*/surefire-reports/*') — Shell Script 1s #!/bin/bash -e rsync -rltDq --stats split-16.tgz rsync://rsync/data/precommit3.split-16.tgz — Shell Script <1s **/TEST-*.xml — Archive JUnit-formatted test results 5s [2022-11-23T03:26:22.208Z] Recording test results [2022-11-23T03:26:27.071Z] [Checks API] No suitable checks publisher found. Issue Time Tracking --- Worklog Id: (was: 828357) Time Spent: 50m (was: 40m) > Support for custom RDBMS is broken > -- > > Key: HIVE-26767 > URL: https://issues.apache.org/jira/browse/HIVE-26767 > Project: Hive > Issue Type: Bug > Components: Metastore >Affects Versions: 4.0.0 >Reporter: Tim Thorpe >Priority: Minor > Labels: pull-request-available > Time Spent: 50m > Remaining Estimate: 0h > > HIVE-24120 introduced code to support custom RDBMS. > DatabaseProduct.getDbType(String productName) will return *DbType.UNDEFINED* > for anything other than the hardcoded/internally supported database types. 
> When initializing DatabaseProduct with an external/custom RDBMS, it follows > this logic: > > boolean isExternal = MetastoreConf.getBoolVar(conf, > ConfVars.USE_CUSTOM_RDBMS); > if (isExternal) { > // The DatabaseProduct will be created by instantiating an external > class via > // reflection. The external class can override any method in the > current class > String className = MetastoreConf.getVar(conf, > ConfVars.CUSTOM_RDBMS_CLASSNAME); > if (className != null) { > try { > theDatabaseProduct = (DatabaseProduct) > ReflectionUtils.newInstance(Class.forName(className), conf); > LOG.info(String.format("Using custom RDBMS %s", className)); > dbt = DbType.CUSTOM; > These 2 database types (DbType.UNDEFINED, DbType.CUSTOM) are then compared to > each other to make sure they are the same. > > Preconditions.checkState(theDatabaseProduct.dbType == getDbType(productName)); > > [https://github.com/gatorblue/hive/blob/3a65c6cf9cb552e7c34bfb449a419abfde0a58b6/standalone-metastore/metastore-server/src/main/java/org/apache/hadoop/hive/metastore/DatabaseProduct.java#L80] > -- This message was sent by Atlassian Jira (v8.20.10#820010)
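The mismatch described in HIVE-26767 above can be modeled in a few lines. This is a simplified stand-in, not Hive's real `DatabaseProduct` class: with a custom RDBMS the instantiated product carries `DbType.CUSTOM`, while `getDbType(productName)` maps any unrecognized product name to `DbType.UNDEFINED`, so the `Preconditions.checkState` equality can never hold.

```java
// Simplified model of the broken check in DatabaseProduct (hypothetical
// class; only a couple of built-in product names are shown).
public class DbTypeCheck {
    public enum DbType { MYSQL, POSTGRES, DERBY, CUSTOM, UNDEFINED }

    // Mirrors getDbType(String): anything that is not a hardcoded product
    // name maps to UNDEFINED.
    public static DbType getDbType(String productName) {
        switch (productName.toLowerCase()) {
            case "mysql":      return DbType.MYSQL;
            case "postgresql": return DbType.POSTGRES;
            default:           return DbType.UNDEFINED;
        }
    }

    // Mirrors the checkState comparison: for a custom RDBMS the instance's
    // type is CUSTOM but getDbType(productName) yields UNDEFINED, so the
    // precondition is always false.
    public static boolean preconditionHolds(DbType instanceType, String productName) {
        return instanceType == getDbType(productName);
    }
}
```

The check passes for built-in products and can never pass for a custom one, which is why the custom-RDBMS path is broken.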
[jira] [Work logged] (HIVE-26756) Iceberg: Fetch format version from metadata file to avoid conflicts with spark
[ https://issues.apache.org/jira/browse/HIVE-26756?focusedWorklogId=828352=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-828352 ] ASF GitHub Bot logged work on HIVE-26756: - Author: ASF GitHub Bot Created on: 23/Nov/22 12:37 Start Date: 23/Nov/22 12:37 Worklog Time Spent: 10m Work Description: ayushtkn commented on code in PR #3778: URL: https://github.com/apache/hive/pull/3778#discussion_r1030390768 ## standalone-metastore/metastore-common/src/main/java/org/apache/hadoop/hive/metastore/HiveMetaStoreClient.java: ## @@ -2698,6 +2699,13 @@ public Table getTable(GetTableRequest getTableRequest) throws MetaException, TEx } } + private void extractTablePropertiesFromHook(Table t) throws MetaException { Review Comment: Renamed Issue Time Tracking --- Worklog Id: (was: 828352) Time Spent: 1h 20m (was: 1h 10m) > Iceberg: Fetch format version from metadata file to avoid conflicts with spark > -- > > Key: HIVE-26756 > URL: https://issues.apache.org/jira/browse/HIVE-26756 > Project: Hive > Issue Type: Bug >Reporter: Ayush Saxena >Assignee: Ayush Saxena >Priority: Major > Labels: pull-request-available > Time Spent: 1h 20m > Remaining Estimate: 0h > > Spark and other engines don't set the format version for an Iceberg table in the > HMS properties, which leads to misinterpretation of the Iceberg format and to > wrong query results. > Propose to always extract the format version from the metadata file rather > than relying on the HMS properties. -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Work logged] (HIVE-26756) Iceberg: Fetch format version from metadata file to avoid conflicts with spark
[ https://issues.apache.org/jira/browse/HIVE-26756?focusedWorklogId=828347=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-828347 ] ASF GitHub Bot logged work on HIVE-26756: - Author: ASF GitHub Bot Created on: 23/Nov/22 12:24 Start Date: 23/Nov/22 12:24 Worklog Time Spent: 10m Work Description: szlta commented on code in PR #3778: URL: https://github.com/apache/hive/pull/3778#discussion_r1030379645 ## standalone-metastore/metastore-common/src/main/java/org/apache/hadoop/hive/metastore/HiveMetaStoreClient.java: ## @@ -2698,6 +2699,13 @@ public Table getTable(GetTableRequest getTableRequest) throws MetaException, TEx } } + private void extractTablePropertiesFromHook(Table t) throws MetaException { Review Comment: Why do we need this method? I guess the only reason is to have the hook presence check around the postGetTable call. Maybe we could rename this to something that has nothing to do with properties? Implementors of postGetTable might have a different reason for implementing it than property adjustments. Issue Time Tracking --- Worklog Id: (was: 828347) Time Spent: 1h 10m (was: 1h) > Iceberg: Fetch format version from metadata file to avoid conflicts with spark > -- > > Key: HIVE-26756 > URL: https://issues.apache.org/jira/browse/HIVE-26756 > Project: Hive > Issue Type: Bug >Reporter: Ayush Saxena >Assignee: Ayush Saxena >Priority: Major > Labels: pull-request-available > Time Spent: 1h 10m > Remaining Estimate: 0h > > Spark and other engines don't set the format version for an Iceberg table in the > HMS properties, which leads to misinterpretation of the Iceberg format and to > wrong query results. > Propose to always extract the format version from the metadata file rather > than relying on the HMS properties. -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Resolved] (HIVE-26701) Enable metrics for Database connection pools(1 & 2) used by ObjectStore in HMS
[ https://issues.apache.org/jira/browse/HIVE-26701?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Naveen Gangam resolved HIVE-26701. -- Fix Version/s: 4.0.0 Resolution: Fixed Fix has been merged to master. Thank you for the patch [~dengzh] > Enable metrics for Database connection pools(1 & 2) used by ObjectStore in HMS > -- > > Key: HIVE-26701 > URL: https://issues.apache.org/jira/browse/HIVE-26701 > Project: Hive > Issue Type: Bug > Components: Standalone Metastore >Reporter: Taraka Rama Rao Lethavadla >Assignee: Zhihua Deng >Priority: Major > Labels: hive-4.0.0-must, pull-request-available > Fix For: 4.0.0 > > Time Spent: 1h 40m > Remaining Estimate: 0h > > We have metrics enabled for database connection pools(3 & 4) used in > TxnHandler. We don't have the same for pools(1 & 2) used by ObjectStore -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Work logged] (HIVE-26701) Enable metrics for Database connection pools(1 & 2) used by ObjectStore in HMS
[ https://issues.apache.org/jira/browse/HIVE-26701?focusedWorklogId=828299=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-828299 ] ASF GitHub Bot logged work on HIVE-26701: - Author: ASF GitHub Bot Created on: 23/Nov/22 10:17 Start Date: 23/Nov/22 10:17 Worklog Time Spent: 10m Work Description: nrg4878 merged PR #3773: URL: https://github.com/apache/hive/pull/3773 Issue Time Tracking --- Worklog Id: (was: 828299) Time Spent: 1h 40m (was: 1.5h) > Enable metrics for Database connection pools(1 & 2) used by ObjectStore in HMS > -- > > Key: HIVE-26701 > URL: https://issues.apache.org/jira/browse/HIVE-26701 > Project: Hive > Issue Type: Bug > Components: Standalone Metastore >Reporter: Taraka Rama Rao Lethavadla >Assignee: Zhihua Deng >Priority: Major > Labels: hive-4.0.0-must, pull-request-available > Time Spent: 1h 40m > Remaining Estimate: 0h > > We have metrics enabled for database connection pools(3 & 4) used in > TxnHandler. We don't have the same for pools(1 & 2) used by ObjectStore -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Work logged] (HIVE-26701) Enable metrics for Database connection pools(1 & 2) used by ObjectStore in HMS
[ https://issues.apache.org/jira/browse/HIVE-26701?focusedWorklogId=828297=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-828297 ] ASF GitHub Bot logged work on HIVE-26701: - Author: ASF GitHub Bot Created on: 23/Nov/22 10:12 Start Date: 23/Nov/22 10:12 Worklog Time Spent: 10m Work Description: nrg4878 commented on code in PR #3773: URL: https://github.com/apache/hive/pull/3773#discussion_r1030250111 ## standalone-metastore/metastore-server/src/main/java/org/apache/hadoop/hive/metastore/PersistenceManagerProvider.java: ## @@ -251,11 +277,14 @@ private static PersistenceManagerFactory initPMF(Configuration conf, boolean for if (dsp == null) { pmf = JDOHelper.getPersistenceManagerFactory(dsProp); } else { - try { + String sourceName = forCompactor ? "objectstore-compactor" : "objectstore"; + try (DataSourceProvider.DataSourceNameConfigurator configurator = + new DataSourceProvider.DataSourceNameConfigurator(conf, sourceName)) { Review Comment: Thanks for the clarification. Looks great. Issue Time Tracking --- Worklog Id: (was: 828297) Time Spent: 1.5h (was: 1h 20m) > Enable metrics for Database connection pools(1 & 2) used by ObjectStore in HMS > -- > > Key: HIVE-26701 > URL: https://issues.apache.org/jira/browse/HIVE-26701 > Project: Hive > Issue Type: Bug > Components: Standalone Metastore >Reporter: Taraka Rama Rao Lethavadla >Assignee: Zhihua Deng >Priority: Major > Labels: hive-4.0.0-must, pull-request-available > Time Spent: 1.5h > Remaining Estimate: 0h > > We have metrics enabled for database connection pools(3 & 4) used in > TxnHandler. We don't have the same for pools(1 & 2) used by ObjectStore -- This message was sent by Atlassian Jira (v8.20.10#820010)
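The `DataSourceNameConfigurator` pattern visible in the diff above can be illustrated with a simplified stand-in. This is not Hive's `DataSourceProvider` class: the real configurator names the connection pool through the metastore configuration, while the `StringBuilder` log here only exists to make the sequence observable. The shape is an `AutoCloseable` that holds a distinct name (e.g. "objectstore" vs. "objectstore-compactor") for the duration of pool creation, so each pool's metrics are reported under its own name.

```java
// Simplified stand-in for the try-with-resources pool-naming pattern in the
// patched initPMF (hypothetical classes; log is for demonstration only).
public class PoolNaming {

    static class DataSourceNameConfigurator implements AutoCloseable {
        private final StringBuilder log;

        // Sets the pool name on construction...
        DataSourceNameConfigurator(StringBuilder log, String name) {
            this.log = log;
            log.append("pool-name=").append(name).append(';');
        }

        // ...and clears it again when the scope ends.
        @Override
        public void close() {
            log.append("name-cleared;");
        }
    }

    // Pick a distinct name per pool and hold it while the pool is created.
    public static String configure(boolean forCompactor) {
        StringBuilder log = new StringBuilder();
        String sourceName = forCompactor ? "objectstore-compactor" : "objectstore";
        try (DataSourceNameConfigurator ignored = new DataSourceNameConfigurator(log, sourceName)) {
            log.append("pool-created;"); // pool is built while the name is in effect
        }
        return log.toString();
    }
}
```

Scoping the name this way is what lets the ObjectStore pools (1 & 2) emit metrics that can be told apart from the TxnHandler pools (3 & 4).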
[jira] [Work logged] (HIVE-26764) Show compaction request should have all fields optional
[ https://issues.apache.org/jira/browse/HIVE-26764?focusedWorklogId=828296&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-828296 ] ASF GitHub Bot logged work on HIVE-26764: - Author: ASF GitHub Bot Created on: 23/Nov/22 10:10 Start Date: 23/Nov/22 10:10 Worklog Time Spent: 10m Work Description: sonarcloud[bot] commented on PR #3791: URL: https://github.com/apache/hive/pull/3791#issuecomment-1324818626 Kudos, SonarCloud Quality Gate passed! (https://sonarcloud.io/dashboard?id=apache_hive&pullRequest=3791) 0 Bugs, 0 Vulnerabilities, 0 Security Hotspots, 0 Code Smells, No Coverage information, No Duplication information. Issue Time Tracking --- Worklog Id: (was: 828296) Time Spent: 50m (was: 40m) > Show compaction request should have all fields optional > -- > > Key: HIVE-26764 > URL: https://issues.apache.org/jira/browse/HIVE-26764 > Project: Hive > Issue Type: Bug >Affects Versions: 4.0.0-alpha-2 >Reporter: KIRTI RUGE >Assignee: KIRTI RUGE >Priority: Major > Labels: pull-request-available > Time Spent: 50m > Remaining Estimate: 0h > -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Work logged] (HIVE-26628) Iceberg table is created when running explain ctas command
[ https://issues.apache.org/jira/browse/HIVE-26628?focusedWorklogId=828276&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-828276 ] ASF GitHub Bot logged work on HIVE-26628: - Author: ASF GitHub Bot Created on: 23/Nov/22 08:58 Start Date: 23/Nov/22 08:58 Worklog Time Spent: 10m Work Description: sonarcloud[bot] commented on PR #3670: URL: https://github.com/apache/hive/pull/3670#issuecomment-1324730350 Kudos, SonarCloud Quality Gate passed! (https://sonarcloud.io/dashboard?id=apache_hive&pullRequest=3670) 0 Bugs, 0 Vulnerabilities, 0 Security Hotspots, 5 Code Smells, No Coverage information, No Duplication information. Issue Time Tracking --- Worklog Id: (was: 828276) Time Spent: 10h 10m (was: 10h) > Iceberg table is created when running explain ctas command > -- > > Key: HIVE-26628 > URL: https://issues.apache.org/jira/browse/HIVE-26628 > Project: Hive > Issue Type: Bug > Components: StorageHandler >Reporter: Krisztian Kasa >Assignee: Krisztian Kasa >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > > Time Spent: 10h 10m > Remaining Estimate: 0h > > {code} > create table source(a int, b string, c int); > explain > create table tbl_ice stored by iceberg stored as orc tblproperties > ('format-version'='2') as > select a, b, c from source; > create table tbl_ice stored by iceberg stored as orc tblproperties > ('format-version'='2') as > select a, b, c from source; > {code} > {code} > org.apache.hadoop.hive.ql.parse.SemanticException: > org.apache.hadoop.hive.ql.parse.SemanticException: Table already exists: > default.tbl_ice > at > org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeCreateTable(SemanticAnalyzer.java:13963) > at > org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genResolvedParseTree(SemanticAnalyzer.java:12528) > at > org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:12693) > at > org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:460)