[GitHub] [flink] flinkbot edited a comment on issue #9050: [FLINK-13181][Table]Add Builder to build CsvTableSink instances
flinkbot edited a comment on issue #9050: [FLINK-13181][Table]Add Builder to build CsvTableSink instances
URL: https://github.com/apache/flink/pull/9050#issuecomment-516422309

## CI report:

* 96ff341529c250d6eb8d002c36eed57101b72e92 UNKNOWN

Bot commands
The @flinkbot bot supports the following commands:
- `@flinkbot run travis` re-run the last Travis build
- `@flinkbot run azure` re-run the last Azure build

This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] flinkbot edited a comment on issue #9751: [FLINK-14177] bump curator from 2.12.0 to 4.2.0
flinkbot edited a comment on issue #9751: [FLINK-14177] bump curator from 2.12.0 to 4.2.0
URL: https://github.com/apache/flink/pull/9751#issuecomment-534387866

## CI report:

* 5b73b3d6319447fd2d81b48183d9f22012133a6b Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/128853990)
* 9e0f8dbc4abc0caa82a66dd4b250d2716debb709 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/128866590)
* 47fc6a0206ce150cc62d5038a288e1534db0f79b Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/128877814)
* 5ca23fc651a91a0f1e097eed533d8804c2ec9235 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/128971343)
* 36e98a73fd1c05a9e74e71944b96b834a1f86894 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/129028758)
* 3647aded3965fffbfe9711934d5393e4937e474c Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/129048588)
* e60f8e7674e9b344b4727bdca36fd35114b2df13 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/137701196)
* 4acb81bfaecea22e970a0969b13b491afd2c7930 Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/137766548)
* 9aa13db184aeb69b6d86cde044e08fddc2ae1379 Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/137770669)
* e04ebce8c00b5b73e7ca7d91ceeeda83341b13ad Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/139656706)
* fa8adb67822c4e885f3078ae4e5af41d5e4f5a80 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/139695807)
* 1762e68990566dde609eb998ede365ed0949cc64 Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/141042456) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3594)
[jira] [Commented] (FLINK-15159) the string of json is mapped to VARCHAR or STRING?
[ https://issues.apache.org/jira/browse/FLINK-15159?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16996223#comment-16996223 ] hehuiyuan commented on FLINK-15159: --- Hi [~jark], I think mapping the string type in the JSON schema to STRING in the Flink SQL types is easier to understand. > the string of json is mapped to VARCHAR or STRING? > --- > > Key: FLINK-15159 > URL: https://issues.apache.org/jira/browse/FLINK-15159 > Project: Flink > Issue Type: Wish > Components: Documentation > Reporter: hehuiyuan > Priority: Major > Attachments: image-2019-12-09-21-14-08-183.png > > !image-2019-12-09-21-14-08-183.png|width=356,height=180! -- This message was sent by Atlassian Jira (v8.3.4#803005)
[GitHub] [flink] hehuiyuan commented on issue #9050: [FLINK-13181][Table]Add Builder to build CsvTableSink instances
hehuiyuan commented on issue #9050: [FLINK-13181][Table]Add Builder to build CsvTableSink instances URL: https://github.com/apache/flink/pull/9050#issuecomment-565691363 Has anyone looked at this?
[GitHub] [flink] flinkbot edited a comment on issue #9751: [FLINK-14177] bump curator from 2.12.0 to 4.2.0
flinkbot edited a comment on issue #9751: [FLINK-14177] bump curator from 2.12.0 to 4.2.0
URL: https://github.com/apache/flink/pull/9751#issuecomment-534387866

## CI report:

* 5b73b3d6319447fd2d81b48183d9f22012133a6b Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/128853990)
* 9e0f8dbc4abc0caa82a66dd4b250d2716debb709 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/128866590)
* 47fc6a0206ce150cc62d5038a288e1534db0f79b Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/128877814)
* 5ca23fc651a91a0f1e097eed533d8804c2ec9235 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/128971343)
* 36e98a73fd1c05a9e74e71944b96b834a1f86894 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/129028758)
* 3647aded3965fffbfe9711934d5393e4937e474c Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/129048588)
* e60f8e7674e9b344b4727bdca36fd35114b2df13 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/137701196)
* 4acb81bfaecea22e970a0969b13b491afd2c7930 Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/137766548)
* 9aa13db184aeb69b6d86cde044e08fddc2ae1379 Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/137770669)
* e04ebce8c00b5b73e7ca7d91ceeeda83341b13ad Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/139656706)
* fa8adb67822c4e885f3078ae4e5af41d5e4f5a80 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/139695807)
* 1762e68990566dde609eb998ede365ed0949cc64 UNKNOWN
[GitHub] [flink] xuefuz commented on issue #10577: [FLINK-15256][hive] HiveModuleFactory should take hive-version as required supported property
xuefuz commented on issue #10577: [FLINK-15256][hive] HiveModuleFactory should take hive-version as required supported property URL: https://github.com/apache/flink/pull/10577#issuecomment-565686277 Actually, I think otherwise. The version reference doesn't assume any particular version; it's based on the Hive library that's currently loaded on the classpath. That's much safer than a magic value.
[jira] [Assigned] (FLINK-15248) FileUtils#compressDirectory behaves buggy when processing relative directory path
[ https://issues.apache.org/jira/browse/FLINK-15248?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Hequn Cheng reassigned FLINK-15248:
---
Assignee: Wei Zhong

> FileUtils#compressDirectory behaves buggy when processing relative directory path
> ---
>
> Key: FLINK-15248
> URL: https://issues.apache.org/jira/browse/FLINK-15248
> Project: Flink
> Issue Type: Bug
> Components: FileSystems
> Affects Versions: 1.10.0
> Reporter: Wei Zhong
> Assignee: Wei Zhong
> Priority: Blocker
> Labels: pull-request-available
> Fix For: 1.10.0
>
> Time Spent: 10m
> Remaining Estimate: 0h
>
> _FileUtils#compressDirectory_ behaves incorrectly when processing a relative directory path. If the path of the target directory is a relative path, the relative paths inside the resulting zip file cannot be constructed correctly:
>
> {code:java}
> public static Path compressDirectory(Path directory, Path target) throws IOException {
>    FileSystem sourceFs = directory.getFileSystem();
>    FileSystem targetFs = target.getFileSystem();
>    try (ZipOutputStream out = new ZipOutputStream(targetFs.create(target, FileSystem.WriteMode.NO_OVERWRITE))) {
>       addToZip(directory, sourceFs, directory.getParent(), out);
>    }
>    return target;
> }
>
> private static void addToZip(Path fileOrDirectory, FileSystem fs, Path rootDir, ZipOutputStream out) throws IOException {
>    String relativePath = fileOrDirectory.getPath().replace(rootDir.getPath() + '/', "");
>    if (fs.getFileStatus(fileOrDirectory).isDir()) {
>       out.putNextEntry(new ZipEntry(relativePath + '/'));
>       // The containedFile.getPath() returns an absolute path but the rootDir
>       // could be a relative path or an empty string (if the user only specifies the
>       // directory name as the relative path). In this case, when calling this
>       // method recursively, the string replacement at the beginning of it will
>       // return a wrong result.
>       for (FileStatus containedFile : fs.listStatus(fileOrDirectory)) {
>          addToZip(containedFile.getPath(), fs, rootDir, out);
>       }
>    } else {
>       ZipEntry entry = new ZipEntry(relativePath);
>       out.putNextEntry(entry);
>       try (FSDataInputStream in = fs.open(fileOrDirectory)) {
>          IOUtils.copyBytes(in, out, false);
>       }
>       out.closeEntry();
>    }
> }{code}
>
> Currently PyFlink allows users to upload Python library directories and a requirements cache directory, which are eventually compressed by _FileUtils#compressDirectory_. If users specify them via relative paths, this bug is triggered and makes those features unavailable.
> We can fix this bug by converting the directory path to an absolute path in _FileUtils#compressDirectory_ before calling the _addToZip_ method.
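The faulty prefix replacement in the `addToZip` code quoted above can be reproduced in isolation. The sketch below (class and method names are illustrative, not Flink's code) shows how stripping a relative `rootDir` prefix from an absolute child path removes the wrong path segment, while an absolute `rootDir` yields the intended relative path:

```java
public class RelativePathDemo {
    // Mirrors the prefix-stripping logic from addToZip: remove "rootDir/" from the child path.
    static String relativize(String child, String rootDir) {
        return child.replace(rootDir + '/', "");
    }

    public static void main(String[] args) {
        String absoluteChild = "/home/user/data/sub/file.txt";
        // Relative rootDir: "data/" matches in the MIDDLE of the absolute child path,
        // so the replacement produces a mangled path instead of "sub/file.txt".
        System.out.println(relativize(absoluteChild, "data"));            // /home/user/sub/file.txt
        // Absolute rootDir (e.g. after converting with toAbsolutePath()): works as intended.
        System.out.println(relativize(absoluteChild, "/home/user/data")); // sub/file.txt
    }
}
```

This is why normalizing the directory to an absolute path before the recursion, as the ticket suggests, is sufficient to fix the bug.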
[jira] [Assigned] (FLINK-15244) FileUtils#deleteDirectoryQuietly will delete files in the symbolic link which point to a directory
[ https://issues.apache.org/jira/browse/FLINK-15244?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Hequn Cheng reassigned FLINK-15244:
---
Assignee: Wei Zhong

> FileUtils#deleteDirectoryQuietly will delete files in the symbolic link which point to a directory
> ---
>
> Key: FLINK-15244
> URL: https://issues.apache.org/jira/browse/FLINK-15244
> Project: Flink
> Issue Type: Bug
> Components: FileSystems
> Affects Versions: 1.10.0
> Reporter: Wei Zhong
> Assignee: Wei Zhong
> Priority: Blocker
> Labels: pull-request-available
> Fix For: 1.10.0, 1.11.0
>
> Time Spent: 10m
> Remaining Estimate: 0h
>
> _FileUtils.deleteDirectoryQuietly_ will delete the files inside a symbolic link that points to a directory. Currently PyFlink uses this method to delete temporary folders generated during job submission and Python UDF execution; these folders contain symbolic links that may point to users' libraries and to directories in the distributed cache.
> To resolve this problem we need to check whether the directory is a symbolic link in _FileUtils.deleteDirectoryInternal_:
> {code:java}
> private static void deleteDirectoryInternal(File directory) throws IOException {
>    // **We should check if the directory is a symbolic link.**
>    if (directory.isDirectory()) {
>       // directory exists and is a directory
>       // empty the directory first
>       try {
>          cleanDirectoryInternal(directory);
>       } catch (FileNotFoundException ignored) {
>          // someone concurrently deleted the directory, nothing to do for us
>          return;
>       }
>       // delete the directory. this fails if the directory is not empty, meaning
>       // if new files got concurrently created. we want to fail then.
>       // if someone else deleted the empty directory concurrently, we don't mind
>       // the result is the same for us, after all
>       Files.deleteIfExists(directory.toPath());
>    } else if (directory.exists()) {
>       // exists but is a file, not a directory
>       // either an error from the caller, or concurrently a file got created
>       throw new IOException(directory + " is not a directory");
>    }
>    // else: does not exist, which is okay (as if deleted)
> }
> {code}
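A minimal sketch of the proposed guard (the class and method names here are hypothetical, not Flink's actual patch): test `Files.isSymbolicLink` before recursing, and delete only the link itself so the linked directory's contents survive.

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class SafeDelete {
    // Recursively delete 'directory', but never follow a symbolic link:
    // a link is removed as a single entry, leaving its target untouched.
    static void deleteDirectorySafely(File directory) throws IOException {
        Path path = directory.toPath();
        if (Files.isSymbolicLink(path)) {
            // deleteIfExists on a symlink removes the link itself, not the target.
            Files.deleteIfExists(path);
            return;
        }
        if (directory.isDirectory()) {
            File[] children = directory.listFiles();
            if (children != null) {
                for (File child : children) {
                    deleteDirectorySafely(child);
                }
            }
        }
        Files.deleteIfExists(path);
    }
}
```

The key detail is the order of the checks: `File#isDirectory` follows symlinks, so a link to a directory would otherwise be recursed into and its contents deleted.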
[jira] [Closed] (FLINK-15255) document how to create Hive table from java API and DDL
[ https://issues.apache.org/jira/browse/FLINK-15255?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Bowen Li closed FLINK-15255. Resolution: Invalid. We can't create Hive tables yet. > document how to create Hive table from java API and DDL > --- > > Key: FLINK-15255 > URL: https://issues.apache.org/jira/browse/FLINK-15255 > Project: Flink > Issue Type: Task > Components: Documentation > Reporter: Bowen Li > Assignee: Bowen Li > Priority: Blocker > Fix For: 1.10.0, 1.11.0
[jira] [Updated] (FLINK-15255) document how to create Hive table from java API and DDL
[ https://issues.apache.org/jira/browse/FLINK-15255?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Bowen Li updated FLINK-15255: - Priority: Major (was: Blocker) > document how to create Hive table from java API and DDL > --- > > Key: FLINK-15255 > URL: https://issues.apache.org/jira/browse/FLINK-15255 > Project: Flink > Issue Type: Task > Components: Documentation > Reporter: Bowen Li > Assignee: Bowen Li > Priority: Major > Fix For: 1.10.0, 1.11.0
[jira] [Closed] (FLINK-15234) hive table created from flink catalog table shouldn't have null properties in parameters
[ https://issues.apache.org/jira/browse/FLINK-15234?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Bowen Li closed FLINK-15234. Resolution: Fixed master: 940bdfc41a2d78ba972541c537925d102ccdc87f 1.10: b6373cba04a9f45d6a851684f80785af1f94b489 > hive table created from flink catalog table shouldn't have null properties in parameters > --- > > Key: FLINK-15234 > URL: https://issues.apache.org/jira/browse/FLINK-15234 > Project: Flink > Issue Type: Bug > Components: Connectors / Hive > Reporter: Bowen Li > Assignee: Bowen Li > Priority: Blocker > Labels: pull-request-available > Fix For: 1.10.0, 1.11.0 > > Time Spent: 0.5h > Remaining Estimate: 0h > > We store the comment of a catalog table in the Hive table's parameters. When the comment is null, we put a <"comment", null> key-value pair in the parameters, but Hive tables do not accept null values in their parameters.
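The fix described in the ticket can be sketched as a small helper (the class and method names are illustrative, not Flink's actual code) that drops null-valued entries, such as a null "comment", before the properties map is handed to the Hive metastore:

```java
import java.util.HashMap;
import java.util.Map;

public class NullSafeParams {
    // Return a copy of 'properties' without null-valued entries; HashMap permits
    // null values, but Hive table parameters do not accept them.
    static Map<String, String> withoutNullValues(Map<String, String> properties) {
        Map<String, String> cleaned = new HashMap<>();
        for (Map.Entry<String, String> e : properties.entrySet()) {
            if (e.getValue() != null) {
                cleaned.put(e.getKey(), e.getValue());
            }
        }
        return cleaned;
    }
}
```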
[jira] [Closed] (FLINK-15240) is_generic key is missing for Flink table stored in HiveCatalog
[ https://issues.apache.org/jira/browse/FLINK-15240?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Bowen Li closed FLINK-15240. Resolution: Fixed master: e4cecef07e73dcb7a2990b075b8fc12eaa02845f 1.10: fc0a19d2a12c3454759ad621093cfa3cd68aac15 > is_generic key is missing for Flink table stored in HiveCatalog > --- > > Key: FLINK-15240 > URL: https://issues.apache.org/jira/browse/FLINK-15240 > Project: Flink > Issue Type: Bug > Components: Connectors / Hive > Reporter: Bowen Li > Assignee: Bowen Li > Priority: Blocker > Fix For: 1.10.0, 1.11.0
[GitHub] [flink] bowenli86 closed pull request #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters
bowenli86 closed pull request #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters URL: https://github.com/apache/flink/pull/10575
[jira] [Closed] (FLINK-15001) The digest of sub-plan reuse should contain retraction traits for stream physical nodes
[ https://issues.apache.org/jira/browse/FLINK-15001?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Hequn Cheng closed FLINK-15001.
---
Resolution: Fixed

> The digest of sub-plan reuse should contain retraction traits for stream physical nodes
> ---
>
> Key: FLINK-15001
> URL: https://issues.apache.org/jira/browse/FLINK-15001
> Project: Flink
> Issue Type: Bug
> Components: Table SQL / Planner
> Affects Versions: 1.9.0, 1.9.1
> Reporter: godfrey he
> Assignee: godfrey he
> Priority: Major
> Labels: pull-request-available
> Fix For: 1.9.2, 1.10.0
>
> Attachments: image-2019-12-02-10-49-46-916.png, image-2019-12-02-10-52-01-399.png
>
> Time Spent: 0.5h
> Remaining Estimate: 0h
>
> This bug was found in [FLINK-14946|https://issues.apache.org/jira/browse/FLINK-14946]. The plan for the SQL given in FLINK-14946 is
> !image-2019-12-02-10-49-46-916.png!
> however, the plan after sub-plan reuse is:
> !image-2019-12-02-10-52-01-399.png!
> In the first picture we can see that the AccMode of the two joins is different, yet the two joins are reused in the second picture. The reason is that the digest used for sub-plan reuse does not currently contain retraction traits for stream physical nodes.
[jira] [Commented] (FLINK-15001) The digest of sub-plan reuse should contain retraction traits for stream physical nodes
[ https://issues.apache.org/jira/browse/FLINK-15001?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16996111#comment-16996111 ] Hequn Cheng commented on FLINK-15001:
---
Fixed in 1.9.2 via 9ea04933ee8294da757b5ebc73611cfc3f2a4915

> The digest of sub-plan reuse should contain retraction traits for stream physical nodes
> ---
>
> Key: FLINK-15001
> URL: https://issues.apache.org/jira/browse/FLINK-15001
> Project: Flink
> Issue Type: Bug
> Components: Table SQL / Planner
> Affects Versions: 1.9.0, 1.9.1
> Reporter: godfrey he
> Assignee: godfrey he
> Priority: Major
> Labels: pull-request-available
> Fix For: 1.9.2, 1.10.0
>
> Attachments: image-2019-12-02-10-49-46-916.png, image-2019-12-02-10-52-01-399.png
>
> Time Spent: 0.5h
> Remaining Estimate: 0h
>
> This bug was found in [FLINK-14946|https://issues.apache.org/jira/browse/FLINK-14946]. The plan for the SQL given in FLINK-14946 is
> !image-2019-12-02-10-49-46-916.png!
> however, the plan after sub-plan reuse is:
> !image-2019-12-02-10-52-01-399.png!
> In the first picture we can see that the AccMode of the two joins is different, yet the two joins are reused in the second picture. The reason is that the digest used for sub-plan reuse does not currently contain retraction traits for stream physical nodes.
[GitHub] [flink] flinkbot edited a comment on issue #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters
flinkbot edited a comment on issue #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters
URL: https://github.com/apache/flink/pull/10575#issuecomment-565607933

## CI report:

* caf8475a447e2e685d7d26ba0630094f460bffe8 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/141014521) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3588)
* 83fbaee6acd5ad1d746f95204ada46d35a484455 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/141032967) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3593)
[GitHub] [flink] flinkbot edited a comment on issue #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters
flinkbot edited a comment on issue #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters
URL: https://github.com/apache/flink/pull/10575#issuecomment-565607933

## CI report:

* caf8475a447e2e685d7d26ba0630094f460bffe8 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/141014521) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3588)
* 83fbaee6acd5ad1d746f95204ada46d35a484455 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/141032967) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3593)
[jira] [Updated] (FLINK-15246) Query result schema: [EXPR$0: TIMESTAMP(6) NOT NULL] not equal to TableSink schema: [EXPR$0: TIMESTAMP(3)]
[ https://issues.apache.org/jira/browse/FLINK-15246?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kurt Young updated FLINK-15246:
---
Component/s: (was: Table SQL / Client) Table SQL / Planner

> Query result schema: [EXPR$0: TIMESTAMP(6) NOT NULL] not equal to TableSink schema: [EXPR$0: TIMESTAMP(3)]
> ---
>
> Key: FLINK-15246
> URL: https://issues.apache.org/jira/browse/FLINK-15246
> Project: Flink
> Issue Type: Bug
> Components: Table SQL / Planner
> Affects Versions: 1.10.0
> Reporter: xiaojin.wy
> Priority: Major
> Fix For: 1.10.0
>
> When I execute the SQL below and check its result, the "Query result schema" does not equal the "TableSink schema".
>
> The SQL is:
> CREATE TABLE `t` (
>   x INT
> ) WITH (
>   'format.field-delimiter'=',',
>   'connector.type'='filesystem',
>   'format.derive-schema'='true',
>   'connector.path'='/defender_test_data/daily_regression_batch_spark_1.10/test_case_when_coercion/sources/t.csv',
>   'format.type'='csv'
> );
> SELECT CASE WHEN true THEN cast('2017-12-12 09:30:00.0' as timestamp) ELSE cast(2 as tinyint) END FROM t;
>
> The exception is:
> org.apache.flink.table.api.ValidationException: Field types of query result and registered TableSink `default_catalog`.`default_database`.`_tmp_table_443938765` do not match.
> Query result schema: [EXPR$0: TIMESTAMP(6) NOT NULL]
> TableSink schema: [EXPR$0: TIMESTAMP(3)]
>
> The input data is:
> 1
[jira] [Updated] (FLINK-15246) Query result schema: [EXPR$0: TIMESTAMP(6) NOT NULL] not equal to TableSink schema: [EXPR$0: TIMESTAMP(3)]
[ https://issues.apache.org/jira/browse/FLINK-15246?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kurt Young updated FLINK-15246:
---
Fix Version/s: 1.10.0

> Query result schema: [EXPR$0: TIMESTAMP(6) NOT NULL] not equal to TableSink schema: [EXPR$0: TIMESTAMP(3)]
> ---
>
> Key: FLINK-15246
> URL: https://issues.apache.org/jira/browse/FLINK-15246
> Project: Flink
> Issue Type: Bug
> Components: Table SQL / Client
> Affects Versions: 1.10.0
> Reporter: xiaojin.wy
> Priority: Major
> Fix For: 1.10.0
>
> When I execute the SQL below and check its result, the "Query result schema" does not equal the "TableSink schema".
>
> The SQL is:
> CREATE TABLE `t` (
>   x INT
> ) WITH (
>   'format.field-delimiter'=',',
>   'connector.type'='filesystem',
>   'format.derive-schema'='true',
>   'connector.path'='/defender_test_data/daily_regression_batch_spark_1.10/test_case_when_coercion/sources/t.csv',
>   'format.type'='csv'
> );
> SELECT CASE WHEN true THEN cast('2017-12-12 09:30:00.0' as timestamp) ELSE cast(2 as tinyint) END FROM t;
>
> The exception is:
> org.apache.flink.table.api.ValidationException: Field types of query result and registered TableSink `default_catalog`.`default_database`.`_tmp_table_443938765` do not match.
> Query result schema: [EXPR$0: TIMESTAMP(6) NOT NULL]
> TableSink schema: [EXPR$0: TIMESTAMP(3)]
>
> The input data is:
> 1
[GitHub] [flink] flinkbot edited a comment on issue #10578: [FLINK-15254][sql cli][module] modules in SQL CLI yaml should preserve order
flinkbot edited a comment on issue #10578: [FLINK-15254][sql cli][module] modules in SQL CLI yaml should preserve order
URL: https://github.com/apache/flink/pull/10578#issuecomment-565660409

## CI report:

* c959d8f96d656c8e74dbda3058fed74991f70da8 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/141031507) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3592)
[GitHub] [flink] flinkbot edited a comment on issue #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters
flinkbot edited a comment on issue #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters
URL: https://github.com/apache/flink/pull/10575#issuecomment-565607933

## CI report:

* caf8475a447e2e685d7d26ba0630094f460bffe8 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/141014521) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3588)
* 83fbaee6acd5ad1d746f95204ada46d35a484455 Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/141032967) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3593)
[GitHub] [flink] KarmaGYZ commented on a change in pull request #10538: [FLINK-15135][e2e][Mesos] Adding e2e tests for Flink's Mesos integration
KarmaGYZ commented on a change in pull request #10538: [FLINK-15135][e2e][Mesos] Adding e2e tests for Flink's Mesos integration
URL: https://github.com/apache/flink/pull/10538#discussion_r357885687

## File path: tools/travis/splits/split_container_hadoopfree.sh

@@ -0,0 +1,53 @@
+#!/usr/bin/env bash
+
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+END_TO_END_DIR="`dirname \"$0\"`" # relative
+END_TO_END_DIR="`( cd \"$END_TO_END_DIR\" && pwd -P)`" # absolutized and normalized
+if [ -z "$END_TO_END_DIR" ] ; then
+    # error; for some reason, the path is not accessible
+    # to the script (e.g. permissions re-evaled after suid)
+    exit 1 # fail
+fi
+
+export END_TO_END_DIR
+
+if [ -z "$FLINK_DIR" ] ; then
+    echo "You have to export the Flink distribution directory as FLINK_DIR"
+    exit 1
+fi
+
+source "${END_TO_END_DIR}/test-scripts/test-runner-common.sh"
+
+FLINK_DIR="`( cd \"$FLINK_DIR\" && pwd -P)`" # absolutized and normalized
+
+echo "flink-end-to-end-test directory: $END_TO_END_DIR"
+echo "Flink distribution directory: $FLINK_DIR"
+
+# Template for adding a test:
+# run_test "" "$END_TO_END_DIR/test-scripts/"
+
+run_test "Wordcount on Docker test (custom fs plugin)" "$END_TO_END_DIR/test-scripts/test_docker_embedded_job.sh dummy-fs"
+# Disabled because of https://issues.apache.org/jira/browse/FLINK-14834
+# run_test "Running Kerberized YARN on Docker test (default input)" "$END_TO_END_DIR/test-scripts/test_yarn_kerberos_docker.sh"
+# run_test "Running Kerberized YARN on Docker test (custom fs plugin)" "$END_TO_END_DIR/test-scripts/test_yarn_kerberos_docker.sh dummy-fs"
+run_test "Run kubernetes test" "$END_TO_END_DIR/test-scripts/test_kubernetes_embedded_job.sh"

Review comment: @tillrohrmann I think we could add a configuration option for HADOOP_USER_NAME and let users set it themselves. That way we could be free of the dependency on org.apache.hadoop.security.UserGroupInformation. WDYT?
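The suggestion in the review comment above could look roughly like this (a sketch under the assumption that callers export a `HADOOP_USER_NAME` environment variable; the class and method names are illustrative): resolve the Hadoop user from the environment with a fallback, avoiding any dependency on org.apache.hadoop.security.UserGroupInformation.

```java
public class HadoopUserResolver {
    // Prefer an explicitly configured user (e.g. the HADOOP_USER_NAME env var);
    // fall back to the given default when it is unset or empty.
    static String resolveHadoopUser(String configuredUser, String fallbackUser) {
        return (configuredUser != null && !configuredUser.isEmpty()) ? configuredUser : fallbackUser;
    }

    public static void main(String[] args) {
        String user = resolveHadoopUser(System.getenv("HADOOP_USER_NAME"),
                System.getProperty("user.name"));
        System.out.println("Running test as Hadoop user: " + user);
    }
}
```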
[GitHub] [flink] flinkbot edited a comment on issue #10578: [FLINK-15254][sql cli][module] modules in SQL CLI yaml should preserve order
flinkbot edited a comment on issue #10578: [FLINK-15254][sql cli][module] modules in SQL CLI yaml should preserve order
URL: https://github.com/apache/flink/pull/10578#issuecomment-565660409

## CI report:

* c959d8f96d656c8e74dbda3058fed74991f70da8 Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/141031507) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3592)
[GitHub] [flink] flinkbot edited a comment on issue #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters
flinkbot edited a comment on issue #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters
URL: https://github.com/apache/flink/pull/10575#issuecomment-565607933

## CI report:

* caf8475a447e2e685d7d26ba0630094f460bffe8 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/141014521) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3588)
* 83fbaee6acd5ad1d746f95204ada46d35a484455 UNKNOWN
[GitHub] [flink] KarmaGYZ commented on a change in pull request #10538: [FLINK-15135][e2e][Mesos] Adding e2e tests for Flink's Mesos integration
KarmaGYZ commented on a change in pull request #10538: [FLINK-15135][e2e][Mesos] Adding e2e tests for Flink's Mesos integration URL: https://github.com/apache/flink/pull/10538#discussion_r357882638 ## File path: tools/travis/splits/split_container_hadoopfree.sh ## @@ -0,0 +1,53 @@
+#!/usr/bin/env bash
+
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+
+END_TO_END_DIR="`dirname \"$0\"`" # relative
+END_TO_END_DIR="`( cd \"$END_TO_END_DIR\" && pwd -P)`" # absolutized and normalized
+if [ -z "$END_TO_END_DIR" ] ; then
+    # error; for some reason, the path is not accessible
+    # to the script (e.g. permissions re-evaled after suid)
+    exit 1  # fail
+fi
+
+export END_TO_END_DIR
+
+if [ -z "$FLINK_DIR" ] ; then
+    echo "You have to export the Flink distribution directory as FLINK_DIR"
+    exit 1
+fi
+
+source "${END_TO_END_DIR}/test-scripts/test-runner-common.sh"
+
+FLINK_DIR="`( cd \"$FLINK_DIR\" && pwd -P)`" # absolutized and normalized
+
+echo "flink-end-to-end-test directory: $END_TO_END_DIR"
+echo "Flink distribution directory: $FLINK_DIR"
+
+# Template for adding a test:
+# run_test "<test name>" "$END_TO_END_DIR/test-scripts/<script name>"
+
+run_test "Wordcount on Docker test (custom fs plugin)" "$END_TO_END_DIR/test-scripts/test_docker_embedded_job.sh dummy-fs"
+# Disabled because of https://issues.apache.org/jira/browse/FLINK-14834
+# run_test "Running Kerberized YARN on Docker test (default input)" "$END_TO_END_DIR/test-scripts/test_yarn_kerberos_docker.sh"
+# run_test "Running Kerberized YARN on Docker test (custom fs plugin)" "$END_TO_END_DIR/test-scripts/test_yarn_kerberos_docker.sh dummy-fs"
+run_test "Run kubernetes test" "$END_TO_END_DIR/test-scripts/test_kubernetes_embedded_job.sh"
Review comment: Also found hadoop usage in the same code path of release 1.9. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] bowenli86 edited a comment on issue #10577: [FLINK-15256][hive] HiveModuleFactory should take hive-version as required supported property
bowenli86 edited a comment on issue #10577: [FLINK-15256][hive] HiveModuleFactory should take hive-version as required supported property URL: https://github.com/apache/flink/pull/10577#issuecomment-565650334 > The changes look fine, but could you explain why we cannot infer the version when it's missing? We did the similar thing when loading hiveshims. Good question. I'm thinking we should remove the version inference in HiveCatalog too. We support lots of Hive versions now; users may not read the docs carefully and may assume we can magically fit their Hive version somehow. Say they use the default Hive version 2.3.4 but their actual version is 1.2.x: they may run into issues whose errors won't show the root cause. It's also a bit misleading, since the current Hive catalog's default version 2.3.4 is not the most widely used one; most of the time users have to change it to 2.2.x, 2.1.x, or 1.2.x. What do you think? This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
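[Editor's illustration] The "required property" pattern argued for in the comment above (fail fast on a missing hive-version instead of inferring one) can be sketched as follows. The method and class names here are illustrative stand-ins, not the actual HiveModuleFactory API:

```java
import java.util.HashMap;
import java.util.Map;

public class RequiredVersionSketch {
    // Hypothetical stand-in for the factory's validation step: reject
    // a missing 'hive-version' instead of inferring one, so a version
    // mismatch surfaces immediately rather than as an obscure
    // downstream error that hides the root cause.
    static String requireHiveVersion(Map<String, String> properties) {
        String version = properties.get("hive-version");
        if (version == null || version.isEmpty()) {
            throw new IllegalArgumentException(
                    "Required property 'hive-version' is missing");
        }
        return version;
    }

    public static void main(String[] args) {
        Map<String, String> props = new HashMap<>();
        props.put("hive-version", "1.2.1");
        System.out.println(requireHiveVersion(props)); // prints 1.2.1
    }
}
```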
[GitHub] [flink] flinkbot commented on issue #10578: [FLINK-15254][sql cli][module] modules in SQL CLI yaml should preserve order
flinkbot commented on issue #10578: [FLINK-15254][sql cli][module] modules in SQL CLI yaml should preserve order URL: https://github.com/apache/flink/pull/10578#issuecomment-565660409 ## CI report: * c959d8f96d656c8e74dbda3058fed74991f70da8 UNKNOWN Bot commands The @flinkbot bot supports the following commands: - `@flinkbot run travis` re-run the last Travis build - `@flinkbot run azure` re-run the last Azure build This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] flinkbot edited a comment on issue #10577: [FLINK-15256][hive] HiveModuleFactory should take hive-version as required supported property
flinkbot edited a comment on issue #10577: [FLINK-15256][hive] HiveModuleFactory should take hive-version as required supported property URL: https://github.com/apache/flink/pull/10577#issuecomment-565633971 ## CI report: * 61768f2dc8663fa4c062bf52426eda40955e85f1 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/141023436) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3590) Bot commands The @flinkbot bot supports the following commands: - `@flinkbot run travis` re-run the last Travis build - `@flinkbot run azure` re-run the last Azure build This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] flinkbot edited a comment on issue #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters
flinkbot edited a comment on issue #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters URL: https://github.com/apache/flink/pull/10575#issuecomment-565607933 ## CI report: * caf8475a447e2e685d7d26ba0630094f460bffe8 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/141014521) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3588) Bot commands The @flinkbot bot supports the following commands: - `@flinkbot run travis` re-run the last Travis build - `@flinkbot run azure` re-run the last Azure build This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] bowenli86 commented on issue #10578: [FLINK-15254][sql cli][module] modules in SQL CLI yaml should preserve order
bowenli86 commented on issue #10578: [FLINK-15254][sql cli][module] modules in SQL CLI yaml should preserve order URL: https://github.com/apache/flink/pull/10578#issuecomment-565656514 @xuefuz @lirui-apache @JingsongLi @zjuwangg This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] flinkbot commented on issue #10578: [FLINK-15254][sql cli][module] modules in SQL CLI yaml should preserve order
flinkbot commented on issue #10578: [FLINK-15254][sql cli][module] modules in SQL CLI yaml should preserve order URL: https://github.com/apache/flink/pull/10578#issuecomment-565656410 Thanks a lot for your contribution to the Apache Flink project. I'm the @flinkbot. I help the community to review your pull request. We will use this comment to track the progress of the review. ## Automated Checks Last check on commit c959d8f96d656c8e74dbda3058fed74991f70da8 (Sat Dec 14 00:09:59 UTC 2019) **Warnings:** * No documentation files were touched! Remember to keep the Flink docs up to date! Mention the bot in a comment to re-run the automated checks. ## Review Progress * ❓ 1. The [description] looks good. * ❓ 2. There is [consensus] that the contribution should go into Flink. * ❓ 3. Needs [attention] from. * ❓ 4. The change fits into the overall [architecture]. * ❓ 5. Overall code [quality] is good. Please see the [Pull Request Review Guide](https://flink.apache.org/contributing/reviewing-prs.html) for a full explanation of the review process. The Bot is tracking the review progress through labels. Labels are applied according to the order of the review items. For consensus, approval by a Flink committer or PMC member is required. Bot commands The @flinkbot bot supports the following commands: - `@flinkbot approve description` to approve one or more aspects (aspects: `description`, `consensus`, `architecture` and `quality`) - `@flinkbot approve all` to approve all aspects - `@flinkbot approve-until architecture` to approve everything until `architecture` - `@flinkbot attention @username1 [@username2 ..]` to require somebody's attention - `@flinkbot disapprove architecture` to remove an approval you gave earlier This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. 
For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[jira] [Updated] (FLINK-15259) HiveInspector.toInspectors() should convert Flink constant to Hive constant
[ https://issues.apache.org/jira/browse/FLINK-15259?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Bowen Li updated FLINK-15259: - Fix Version/s: 1.9.2 > HiveInspector.toInspectors() should convert Flink constant to Hive constant > > > Key: FLINK-15259 > URL: https://issues.apache.org/jira/browse/FLINK-15259 > Project: Flink > Issue Type: Bug > Components: Connectors / Hive >Affects Versions: 1.9.0, 1.10.0 >Reporter: Bowen Li >Assignee: Rui Li >Priority: Major > Fix For: 1.9.2, 1.10.0, 1.11.0 > > > repro test: > {code:java} > public class HiveModuleITCase { > @Test > public void test() { > TableEnvironment tEnv = > HiveTestUtils.createTableEnvWithBlinkPlannerBatchMode(); > tEnv.unloadModule("core"); > tEnv.loadModule("hive", new HiveModule("2.3.4")); > tEnv.sqlQuery("select concat('an', 'bn')"); > } > } > {code} > seems that currently HiveInspector.toInspectors() didn't convert Flink > constant to Hive constant before calling > hiveShim.getObjectInspectorForConstant > I don't think it's a blocker -- This message was sent by Atlassian Jira (v8.3.4#803005)
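[Editor's illustration] The gist of the missing conversion in FLINK-15259 can be shown without any Hive dependency. The `FlinkBinaryString` type and `toHiveConstant` helper below are hypothetical stand-ins, not Flink or Hive classes: the point is that a planner-internal constant must be unwrapped to the plain Java value the target system expects before anything like `getObjectInspectorForConstant` sees it:

```java
import java.nio.charset.StandardCharsets;

public class ConstantConversionSketch {
    // Hypothetical stand-in for a planner-internal string literal
    // (Flink's planner keeps strings in a binary representation).
    static final class FlinkBinaryString {
        private final byte[] bytes;
        FlinkBinaryString(String s) {
            this.bytes = s.getBytes(StandardCharsets.UTF_8);
        }
        String toJavaString() {
            return new String(bytes, StandardCharsets.UTF_8);
        }
    }

    // Unwrap internal representations to plain Java values before
    // handing a constant to the target system; passing the internal
    // object through unconverted is the kind of bug the issue describes.
    static Object toHiveConstant(Object flinkConstant) {
        if (flinkConstant instanceof FlinkBinaryString) {
            return ((FlinkBinaryString) flinkConstant).toJavaString();
        }
        return flinkConstant; // numeric primitives pass through as-is
    }

    public static void main(String[] args) {
        Object c = toHiveConstant(new FlinkBinaryString("an"));
        System.out.println(c.getClass().getSimpleName() + ": " + c); // String: an
    }
}
```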
[jira] [Updated] (FLINK-15259) HiveInspector.toInspectors() should convert Flink constant to Hive constant
[ https://issues.apache.org/jira/browse/FLINK-15259?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Bowen Li updated FLINK-15259: - Affects Version/s: 1.10.0 > HiveInspector.toInspectors() should convert Flink constant to Hive constant > > > Key: FLINK-15259 > URL: https://issues.apache.org/jira/browse/FLINK-15259 > Project: Flink > Issue Type: Bug > Components: Connectors / Hive >Affects Versions: 1.10.0 >Reporter: Bowen Li >Assignee: Rui Li >Priority: Major > Fix For: 1.10.0, 1.11.0 > > > repro test: > {code:java} > public class HiveModuleITCase { > @Test > public void test() { > TableEnvironment tEnv = > HiveTestUtils.createTableEnvWithBlinkPlannerBatchMode(); > tEnv.unloadModule("core"); > tEnv.loadModule("hive", new HiveModule("2.3.4")); > tEnv.sqlQuery("select concat('an', 'bn')"); > } > } > {code} > seems that currently HiveInspector.toInspectors() didn't convert Flink > constant to Hive constant before calling > hiveShim.getObjectInspectorForConstant > I don't think it's a blocker -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (FLINK-15259) HiveInspector.toInspectors() should convert Flink constant to Hive constant
[ https://issues.apache.org/jira/browse/FLINK-15259?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Bowen Li updated FLINK-15259: - Description: repro test: {code:java} public class HiveModuleITCase { @Test public void test() { TableEnvironment tEnv = HiveTestUtils.createTableEnvWithBlinkPlannerBatchMode(); tEnv.unloadModule("core"); tEnv.loadModule("hive", new HiveModule("2.3.4")); tEnv.sqlQuery("select concat('an', 'bn')"); } } {code} seems that currently HiveInspector.toInspectors() didn't convert Flink constant to Hive constant before calling hiveShim.getObjectInspectorForConstant I don't think it's a blocker was: repro test: {code:java} public class HiveModuleITCase { @Test public void test() { TableEnvironment tEnv = HiveTestUtils.createTableEnvWithBlinkPlannerBatchMode(); tEnv.unloadModule("core"); tEnv.loadModule("hive", new HiveModule("2.3.4")); tEnv.sqlQuery("select concat('an', 'bn')"); } } {code} seems that currently HiveInspector.toInspectors() didn't convert Flink constant to Hive constant before calling hiveShim.getObjectInspectorForConstant > HiveInspector.toInspectors() should convert Flink constant to Hive constant > > > Key: FLINK-15259 > URL: https://issues.apache.org/jira/browse/FLINK-15259 > Project: Flink > Issue Type: Bug > Components: Connectors / Hive >Reporter: Bowen Li >Assignee: Rui Li >Priority: Major > Fix For: 1.10.0, 1.11.0 > > > repro test: > {code:java} > public class HiveModuleITCase { > @Test > public void test() { > TableEnvironment tEnv = > HiveTestUtils.createTableEnvWithBlinkPlannerBatchMode(); > tEnv.unloadModule("core"); > tEnv.loadModule("hive", new HiveModule("2.3.4")); > tEnv.sqlQuery("select concat('an', 'bn')"); > } > } > {code} > seems that currently HiveInspector.toInspectors() didn't convert Flink > constant to Hive constant before calling > hiveShim.getObjectInspectorForConstant > I don't think it's a blocker -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (FLINK-15259) HiveInspector.toInspectors() should convert Flink constant to Hive constant
[ https://issues.apache.org/jira/browse/FLINK-15259?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Bowen Li updated FLINK-15259: - Affects Version/s: 1.9.0 > HiveInspector.toInspectors() should convert Flink constant to Hive constant > > > Key: FLINK-15259 > URL: https://issues.apache.org/jira/browse/FLINK-15259 > Project: Flink > Issue Type: Bug > Components: Connectors / Hive >Affects Versions: 1.9.0, 1.10.0 >Reporter: Bowen Li >Assignee: Rui Li >Priority: Major > Fix For: 1.10.0, 1.11.0 > > > repro test: > {code:java} > public class HiveModuleITCase { > @Test > public void test() { > TableEnvironment tEnv = > HiveTestUtils.createTableEnvWithBlinkPlannerBatchMode(); > tEnv.unloadModule("core"); > tEnv.loadModule("hive", new HiveModule("2.3.4")); > tEnv.sqlQuery("select concat('an', 'bn')"); > } > } > {code} > seems that currently HiveInspector.toInspectors() didn't convert Flink > constant to Hive constant before calling > hiveShim.getObjectInspectorForConstant > I don't think it's a blocker -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (FLINK-15254) modules in SQL CLI yaml should preserve order
[ https://issues.apache.org/jira/browse/FLINK-15254?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] ASF GitHub Bot updated FLINK-15254: --- Labels: pull-request-available (was: ) > modules in SQL CLI yaml should preserve order > - > > Key: FLINK-15254 > URL: https://issues.apache.org/jira/browse/FLINK-15254 > Project: Flink > Issue Type: Bug > Components: Connectors / Hive >Reporter: Bowen Li >Assignee: Bowen Li >Priority: Blocker > Labels: pull-request-available > Fix For: 1.10.0, 1.11.0 > > > there seems to be a problem when a hive module is named "hive" and the module > cannot be loaded/used properly. reported by [~Terry1897] > update: the root cause is that modules in SQL CLI yaml aren't handled in > order right now, which they should be -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Assigned] (FLINK-15259) HiveInspector.toInspectors() should convert Flink constant to Hive constant
[ https://issues.apache.org/jira/browse/FLINK-15259?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Bowen Li reassigned FLINK-15259: Assignee: Rui Li (was: Bowen Li) > HiveInspector.toInspectors() should convert Flink constant to Hive constant > > > Key: FLINK-15259 > URL: https://issues.apache.org/jira/browse/FLINK-15259 > Project: Flink > Issue Type: Bug > Components: Connectors / Hive >Reporter: Bowen Li >Assignee: Rui Li >Priority: Major > Fix For: 1.10.0, 1.11.0 > > > repro test: > {code:java} > public class HiveModuleITCase { > @Test > public void test() { > TableEnvironment tEnv = > HiveTestUtils.createTableEnvWithBlinkPlannerBatchMode(); > tEnv.unloadModule("core"); > tEnv.loadModule("hive", new HiveModule("2.3.4")); > tEnv.sqlQuery("select concat('an', 'bn')"); > } > } > {code} > seems that currently HiveInspector.toInspectors() didn't convert Flink > constant to Hive constant before calling > hiveShim.getObjectInspectorForConstant -- This message was sent by Atlassian Jira (v8.3.4#803005)
[GitHub] [flink] bowenli86 opened a new pull request #10578: [FLINK-15254][sql cli][module] modules in SQL CLI yaml should preserve order
bowenli86 opened a new pull request #10578: [FLINK-15254][sql cli][module] modules in SQL CLI yaml should preserve order URL: https://github.com/apache/flink/pull/10578 ## What is the purpose of the change Currently the module map in the SQL CLI is a plain hash map, which doesn't preserve the module loading order declared in the yaml file. Fix it by always using a linked hash map. ## Brief change log - always use a linked hash map in the SQL CLI to preserve module order - added a unit test ## Verifying this change This change added tests and can be verified as follows: `EnvironmentTest.testModuleOrder` ## Does this pull request potentially affect one of the following parts: n/a ## Documentation n/a This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
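[Editor's illustration] The HashMap-vs-LinkedHashMap distinction this change relies on, in a standalone sketch (class and module names here are illustrative, not actual Flink code):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.Map;

public class ModuleOrderSketch {
    public static void main(String[] args) {
        // A LinkedHashMap iterates in insertion order, which is what
        // the fix relies on to preserve the module order declared in
        // the SQL CLI yaml. A plain HashMap gives no such guarantee.
        Map<String, String> modules = new LinkedHashMap<>();
        modules.put("core", "CoreModule");   // declared first in the yaml
        modules.put("myhive", "HiveModule"); // declared second

        // Insertion order is preserved on iteration:
        System.out.println(new ArrayList<>(modules.keySet())); // [core, myhive]
    }
}
```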
[jira] [Updated] (FLINK-15254) modules in SQL CLI yaml should preserve order
[ https://issues.apache.org/jira/browse/FLINK-15254?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Bowen Li updated FLINK-15254: - Issue Type: Bug (was: Test) > modules in SQL CLI yaml should preserve order > - > > Key: FLINK-15254 > URL: https://issues.apache.org/jira/browse/FLINK-15254 > Project: Flink > Issue Type: Bug > Components: Connectors / Hive >Reporter: Bowen Li >Assignee: Bowen Li >Priority: Blocker > Fix For: 1.10.0, 1.11.0 > > > there seems to be a problem when a hive module is named "hive" and the module > cannot be loaded/used properly. reported by [~Terry1897] > update: the root cause is that modules in SQL CLI yaml aren't handled in > order right now, which they should be -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (FLINK-15254) modules in SQL CLI yaml should preserve order
[ https://issues.apache.org/jira/browse/FLINK-15254?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Bowen Li updated FLINK-15254: - Summary: modules in SQL CLI yaml should preserve order (was: modules in SQL CLI yaml should always be handled in order) > modules in SQL CLI yaml should preserve order > - > > Key: FLINK-15254 > URL: https://issues.apache.org/jira/browse/FLINK-15254 > Project: Flink > Issue Type: Test > Components: Connectors / Hive >Reporter: Bowen Li >Assignee: Bowen Li >Priority: Blocker > Fix For: 1.10.0, 1.11.0 > > > there seems to be a problem when a hive module is named "hive" and the module > cannot be loaded/used properly. reported by [~Terry1897] > update: the root cause is that modules in SQL CLI yaml aren't handled in > order right now, which they should be -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (FLINK-15254) modules in SQL CLI yaml should always be handled in order
[ https://issues.apache.org/jira/browse/FLINK-15254?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Bowen Li updated FLINK-15254: - Description: there seems to be a problem when a hive module is named "hive" and the module cannot be loaded/used properly. reported by [~Terry1897] update: the root cause is that modules in SQL CLI yaml aren't handled in order right now, which they should be was: there seems to be a problem when a hive module is named "hive" and the module cannot be loaded/used properly. reported by [~Terry1897] update: the root cause is that > modules in SQL CLI yaml should always be handled in order > - > > Key: FLINK-15254 > URL: https://issues.apache.org/jira/browse/FLINK-15254 > Project: Flink > Issue Type: Test > Components: Connectors / Hive >Reporter: Bowen Li >Assignee: Bowen Li >Priority: Blocker > Fix For: 1.10.0, 1.11.0 > > > there seems to be a problem when a hive module is named "hive" and the module > cannot be loaded/used properly. reported by [~Terry1897] > update: the root cause is that modules in SQL CLI yaml aren't handled in > order right now, which they should be -- This message was sent by Atlassian Jira (v8.3.4#803005)
[GitHub] [flink] flinkbot edited a comment on issue #10576: [FLINK-15168][table-planner] Fix physical indices computing.
flinkbot edited a comment on issue #10576: [FLINK-15168][table-planner] Fix physical indices computing. URL: https://github.com/apache/flink/pull/10576#issuecomment-565625223 ## CI report: * 2e2ecdaa06d1efa17ceed187ccb310168aadb5e1 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/141020595) Azure: [SUCCESS](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3589) Bot commands The @flinkbot bot supports the following commands: - `@flinkbot run travis` re-run the last Travis build - `@flinkbot run azure` re-run the last Azure build This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] flinkbot edited a comment on issue #10577: [FLINK-15256][hive] HiveModuleFactory should take hive-version as required supported property
flinkbot edited a comment on issue #10577: [FLINK-15256][hive] HiveModuleFactory should take hive-version as required supported property URL: https://github.com/apache/flink/pull/10577#issuecomment-565633971 ## CI report: * 61768f2dc8663fa4c062bf52426eda40955e85f1 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/141023436) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3590) Bot commands The @flinkbot bot supports the following commands: - `@flinkbot run travis` re-run the last Travis build - `@flinkbot run azure` re-run the last Azure build This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] flinkbot edited a comment on issue #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters
flinkbot edited a comment on issue #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters URL: https://github.com/apache/flink/pull/10575#issuecomment-565607933 ## CI report: * caf8475a447e2e685d7d26ba0630094f460bffe8 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/141014521) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3588) Bot commands The @flinkbot bot supports the following commands: - `@flinkbot run travis` re-run the last Travis build - `@flinkbot run azure` re-run the last Azure build This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[jira] [Updated] (FLINK-15254) modules in SQL CLI yaml should always be handled in order
[ https://issues.apache.org/jira/browse/FLINK-15254?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Bowen Li updated FLINK-15254: - Description: there seems to be a problem when a hive module is named "hive" and the module cannot be loaded/used properly. reported by [~Terry1897] update: the root cause is that was: there seems to be a problem when a hive module is named "hive" and the module cannot be loaded/used properly. reported by [~Terry1897] > modules in SQL CLI yaml should always be handled in order > - > > Key: FLINK-15254 > URL: https://issues.apache.org/jira/browse/FLINK-15254 > Project: Flink > Issue Type: Test > Components: Connectors / Hive >Reporter: Bowen Li >Assignee: Bowen Li >Priority: Blocker > Fix For: 1.10.0, 1.11.0 > > > there seems to be a problem when a hive module is named "hive" and the module > cannot be loaded/used properly. reported by [~Terry1897] > update: the root cause is that -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (FLINK-15254) modules in SQL CLI yaml should always be handled in order
[ https://issues.apache.org/jira/browse/FLINK-15254?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Bowen Li updated FLINK-15254: - Summary: modules in SQL CLI yaml should always be handled in order (was: hive module cannot be named "hive") > modules in SQL CLI yaml should always be handled in order > - > > Key: FLINK-15254 > URL: https://issues.apache.org/jira/browse/FLINK-15254 > Project: Flink > Issue Type: Test > Components: Connectors / Hive >Reporter: Bowen Li >Assignee: Bowen Li >Priority: Blocker > Fix For: 1.10.0, 1.11.0 > > > there seems to be a problem when a hive module is named "hive" and the module > cannot be loaded/used properly. reported by [~Terry1897] -- This message was sent by Atlassian Jira (v8.3.4#803005)
[GitHub] [flink] bowenli86 commented on issue #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters
bowenli86 commented on issue #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters URL: https://github.com/apache/flink/pull/10575#issuecomment-565650641 @flinkbot run azure This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] flinkbot edited a comment on issue #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters
flinkbot edited a comment on issue #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters URL: https://github.com/apache/flink/pull/10575#issuecomment-565607933 ## CI report: * caf8475a447e2e685d7d26ba0630094f460bffe8 Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/141014521) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3588)
[GitHub] [flink] flinkbot edited a comment on issue #10576: [FLINK-15168][table-planner] Fix physical indices computing.
flinkbot edited a comment on issue #10576: [FLINK-15168][table-planner] Fix physical indices computing. URL: https://github.com/apache/flink/pull/10576#issuecomment-565625223 ## CI report: * 2e2ecdaa06d1efa17ceed187ccb310168aadb5e1 Travis: [SUCCESS](https://travis-ci.com/flink-ci/flink/builds/141020595) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3589)
[GitHub] [flink] xuefuz commented on issue #10577: [FLINK-15256][hive] HiveModuleFactory should take hive-version as required supported property
xuefuz commented on issue #10577: [FLINK-15256][hive] HiveModuleFactory should take hive-version as required supported property URL: https://github.com/apache/flink/pull/10577#issuecomment-565644081 The changes look fine, but could you explain why we cannot infer the version when it's missing? We did a similar thing when loading hive shims.
[GitHub] [flink] flinkbot edited a comment on issue #10577: [FLINK-15256][hive] HiveModuleFactory should take hive-version as required supported property
flinkbot edited a comment on issue #10577: [FLINK-15256][hive] HiveModuleFactory should take hive-version as required supported property URL: https://github.com/apache/flink/pull/10577#issuecomment-565633971 ## CI report: * 61768f2dc8663fa4c062bf52426eda40955e85f1 Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/141023436) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3590)
[GitHub] [flink] bowenli86 commented on issue #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters
bowenli86 commented on issue #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters URL: https://github.com/apache/flink/pull/10575#issuecomment-565638670 > The negative test case can be just to create a catalog table with null property value and expect an exception somewhere. To remove the possibility of null values, this PR makes the catalog table reject any properties map that contains a null value, so we can't construct such a case anymore.
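[Editor's note] The strict check the comment above refers to — rejecting a properties map containing a null value at construction time — can be sketched roughly as follows. The class and method names are illustrative stand-ins, not Flink's actual AbstractCatalogTable code:

```java
import java.util.Map;

// Illustrative sketch of the strict check described in the PR discussion:
// a catalog table's properties map must contain no null keys or values.
public class PropertiesValidator {
    public static Map<String, String> validate(Map<String, String> properties) {
        for (Map.Entry<String, String> entry : properties.entrySet()) {
            if (entry.getKey() == null || entry.getValue() == null) {
                // Fail at construction, so no null ever reaches the Hive
                // table's parameters downstream.
                throw new IllegalArgumentException(
                        "properties must not contain null keys or values; offending key: "
                                + entry.getKey());
            }
        }
        return properties;
    }
}
```

This explains why the suggested negative test can no longer be built: a catalog table with a null-valued property cannot be constructed in the first place.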
[GitHub] [flink] bowenli86 commented on issue #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters
bowenli86 commented on issue #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters URL: https://github.com/apache/flink/pull/10575#issuecomment-565637937 @flinkbot run travis @flinkbot run azure
[GitHub] [flink] xuefuz edited a comment on issue #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters
xuefuz edited a comment on issue #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters URL: https://github.com/apache/flink/pull/10575#issuecomment-565637732 The negative test case can be just to attempt creating a hive table with null property value and expect an exception somewhere.
[GitHub] [flink] xuefuz commented on issue #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters
xuefuz commented on issue #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters URL: https://github.com/apache/flink/pull/10575#issuecomment-565637732 The negative test case can be just to create a catalog table with null property value and expect an exception somewhere.
[jira] [Created] (FLINK-15259) HiveInspector.toInspectors() should convert Flink constant to Hive constant
Bowen Li created FLINK-15259: Summary: HiveInspector.toInspectors() should convert Flink constant to Hive constant Key: FLINK-15259 URL: https://issues.apache.org/jira/browse/FLINK-15259 Project: Flink Issue Type: Bug Components: Connectors / Hive Reporter: Bowen Li Assignee: Bowen Li Fix For: 1.10.0, 1.11.0 repro test:
{code:java}
public class HiveModuleITCase {
	@Test
	public void test() {
		TableEnvironment tEnv = HiveTestUtils.createTableEnvWithBlinkPlannerBatchMode();
		tEnv.unloadModule("core");
		tEnv.loadModule("hive", new HiveModule("2.3.4"));
		tEnv.sqlQuery("select concat('an', 'bn')");
	}
}
{code}
It seems that currently HiveInspector.toInspectors() doesn't convert Flink constants to Hive constants before calling hiveShim.getObjectInspectorForConstant -- This message was sent by Atlassian Jira (v8.3.4#803005)
[GitHub] [flink] flinkbot commented on issue #10577: [FLINK-15256][hive] HiveModuleFactory should take hive-version as required supported property
flinkbot commented on issue #10577: [FLINK-15256][hive] HiveModuleFactory should take hive-version as required supported property URL: https://github.com/apache/flink/pull/10577#issuecomment-565633971 ## CI report: * 61768f2dc8663fa4c062bf52426eda40955e85f1 UNKNOWN
[GitHub] [flink] flinkbot edited a comment on issue #10576: [FLINK-15168][table-planner] Fix physical indices computing.
flinkbot edited a comment on issue #10576: [FLINK-15168][table-planner] Fix physical indices computing. URL: https://github.com/apache/flink/pull/10576#issuecomment-565625223 ## CI report: * 2e2ecdaa06d1efa17ceed187ccb310168aadb5e1 Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/141020595) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3589)
[GitHub] [flink] flinkbot edited a comment on issue #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters
flinkbot edited a comment on issue #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters URL: https://github.com/apache/flink/pull/10575#issuecomment-565607933 ## CI report: * caf8475a447e2e685d7d26ba0630094f460bffe8 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/141014521) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3588)
[GitHub] [flink] bowenli86 commented on issue #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters
bowenli86 commented on issue #10575: [FLINK-15234] hive table created from flink catalog table shouldn't have null properties in parameters URL: https://github.com/apache/flink/pull/10575#issuecomment-565631320 > PR looks good. Just a couple of minor points: > > 1. Can you change the JIRA title, cannot=>shouldn't > 2. Create a negative test case covering the case if property map has a null valued key. > Thanks. Thanks @xuefuz ! I updated the jira title. For the negative test case: since we now have the strict check in AbstractCatalogTable for null-valued properties and have also fixed the 'comment' bug, the only negative case I can think of is to use Hive's metastore client to send a test catalog table impl that contains a null value, but that doesn't seem to add much value for us. What negative case are you thinking of? I'll merge this PR first to unblock the community from testing HiveCatalog, and will add negative cases later if you have any in mind.
[jira] [Updated] (FLINK-15234) hive table created from flink catalog table shouldn't have null properties in parameters
[ https://issues.apache.org/jira/browse/FLINK-15234?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Bowen Li updated FLINK-15234: - Summary: hive table created from flink catalog table shouldn't have null properties in parameters (was: hive table created from flink catalog table cannot have null properties in parameters) > hive table created from flink catalog table shouldn't have null properties in > parameters > > > Key: FLINK-15234 > URL: https://issues.apache.org/jira/browse/FLINK-15234 > Project: Flink > Issue Type: Bug > Components: Connectors / Hive >Reporter: Bowen Li >Assignee: Bowen Li >Priority: Blocker > Labels: pull-request-available > Fix For: 1.10.0, 1.11.0 > > Time Spent: 20m > Remaining Estimate: 0h > > we store comment of a catalog table in Hive table's parameters. When it's > null, we put a <"comment", null> k-v in the parameters. Hive table doesn't > take null in its params.
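[Editor's note] The issue description above boils down to the comment being copied into the Hive table's parameters unconditionally, producing a <"comment", null> entry that Hive rejects. A minimal sketch of the described fix, with hypothetical names:

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch (hypothetical names, not Flink's actual HiveCatalog code):
// only write the comment into the Hive parameters map when it is non-null,
// so a null value never reaches Hive's table parameters.
public class HiveParamsSketch {
    public static Map<String, String> toHiveParameters(String comment) {
        Map<String, String> parameters = new HashMap<>();
        if (comment != null) {
            parameters.put("comment", comment);
        }
        // When comment is null, the key is omitted entirely rather than
        // stored with a null value.
        return parameters;
    }
}
```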
[jira] [Commented] (FLINK-15253) Accumulators are not checkpointed
[ https://issues.apache.org/jira/browse/FLINK-15253?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16995960#comment-16995960 ] Seth Wiesman commented on FLINK-15253: -- -0 I kind of like accumulators. I think of them as the println debugging of metrics in Flink. I use them when I need to troubleshoot a job and I'm not sure where I'm looking, but agreed that they are not appropriate for production streaming applications. > Accumulators are not checkpointed > - > > Key: FLINK-15253 > URL: https://issues.apache.org/jira/browse/FLINK-15253 > Project: Flink > Issue Type: Bug > Components: Runtime / Checkpointing >Reporter: Maximilian Michels >Priority: Major > > Accumulators are not checkpointed, which makes them relatively useless for > streaming applications. They are also tied to the heartbeat (FLINK-15252), > which makes them fragile. > We could consider deactivating accumulators in streaming mode since they seem > to originally be a batch feature.
[GitHub] [flink] flinkbot commented on issue #10577: [FLINK-15256][hive] HiveModuleFactory should take hive-version as required supported property
flinkbot commented on issue #10577: [FLINK-15256][hive] HiveModuleFactory should take hive-version as required supported property URL: https://github.com/apache/flink/pull/10577#issuecomment-565626033 Thanks a lot for your contribution to the Apache Flink project. I'm the @flinkbot. I help the community to review your pull request. We will use this comment to track the progress of the review. ## Automated Checks Last check on commit f0d3c8c602d97efa0c69c945039248f9e90d407c (Fri Dec 13 22:05:54 UTC 2019) **Warnings:** * No documentation files were touched! Remember to keep the Flink docs up to date! Mention the bot in a comment to re-run the automated checks. ## Review Progress * ❓ 1. The [description] looks good. * ❓ 2. There is [consensus] that the contribution should go into to Flink. * ❓ 3. Needs [attention] from. * ❓ 4. The change fits into the overall [architecture]. * ❓ 5. Overall code [quality] is good. Please see the [Pull Request Review Guide](https://flink.apache.org/contributing/reviewing-prs.html) for a full explanation of the review process. The Bot is tracking the review progress through labels. Labels are applied according to the order of the review items. For consensus, approval by a Flink committer of PMC member is required Bot commands The @flinkbot bot supports the following commands: - `@flinkbot approve description` to approve one or more aspects (aspects: `description`, `consensus`, `architecture` and `quality`) - `@flinkbot approve all` to approve all aspects - `@flinkbot approve-until architecture` to approve everything until `architecture` - `@flinkbot attention @username1 [@username2 ..]` to require somebody's attention - `@flinkbot disapprove architecture` to remove an approval you gave earlier
[jira] [Updated] (FLINK-15256) unable to drop table in HiveCatalogITCase
[ https://issues.apache.org/jira/browse/FLINK-15256?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] ASF GitHub Bot updated FLINK-15256: --- Labels: pull-request-available (was: ) > unable to drop table in HiveCatalogITCase > - > > Key: FLINK-15256 > URL: https://issues.apache.org/jira/browse/FLINK-15256 > Project: Flink > Issue Type: Bug > Components: Connectors / Hive >Reporter: Bowen Li >Assignee: Terry Wang >Priority: Blocker > Labels: pull-request-available > Fix For: 1.10.0, 1.11.0 > > > {code:java} > @Test > public void testCsvTableViaSQL() throws Exception { > EnvironmentSettings settings = > EnvironmentSettings.newInstance().useBlinkPlanner().inBatchMode().build(); > TableEnvironment tableEnv = TableEnvironment.create(settings); > tableEnv.registerCatalog("myhive", hiveCatalog); > tableEnv.useCatalog("myhive"); > String path = > this.getClass().getResource("/csv/test.csv").getPath(); > tableEnv.sqlUpdate("create table test2 (name String, age Int) > with (\n" + > " 'connector.type' = 'filesystem',\n" + > " 'connector.path' = 'file://" + path + "',\n" + > " 'format.type' = 'csv'\n" + > ")"); > Table t = tableEnv.sqlQuery("SELECT * FROM > myhive.`default`.test2"); > List result = TableUtils.collectToList(t); > // assert query result > assertEquals( > new HashSet<>(Arrays.asList( > Row.of("1", 1), > Row.of("2", 2), > Row.of("3", 3))), > new HashSet<>(result) > ); > tableEnv.sqlUpdate("drop table myhive.`default`.tests2"); > } > {code} > The last drop table statement reports error as: > {code:java} > org.apache.flink.table.api.ValidationException: Could not execute DropTable > in path `myhive`.`default`.`tests2` > at > org.apache.flink.table.catalog.CatalogManager.execute(CatalogManager.java:568) > at > org.apache.flink.table.catalog.CatalogManager.dropTable(CatalogManager.java:543) > at > org.apache.flink.table.api.internal.TableEnvironmentImpl.sqlUpdate(TableEnvironmentImpl.java:519) > at > 
org.apache.flink.table.catalog.hive.HiveCatalogITCase.testCsvTableViaSQL(HiveCatalogITCase.java:123) > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) > at > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > at java.lang.reflect.Method.invoke(Method.java:498) > at > org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50) > at > org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12) > at > org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47) > at > org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17) > at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48) > at org.junit.rules.RunRules.evaluate(RunRules.java:20) > at > org.apache.flink.connectors.hive.FlinkStandaloneHiveRunner.runTestMethod(FlinkStandaloneHiveRunner.java:169) > at > org.apache.flink.connectors.hive.FlinkStandaloneHiveRunner.runChild(FlinkStandaloneHiveRunner.java:154) > at > org.apache.flink.connectors.hive.FlinkStandaloneHiveRunner.runChild(FlinkStandaloneHiveRunner.java:92) > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290) > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71) > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288) > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58) > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268) > at > org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26) > at > org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27) > at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48) > at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48) > at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48) > at 
org.junit.rules.RunRules.evaluate(RunRules.java:20) > at org.junit.runners.ParentRunner.run(ParentRunner.java:363) > at org.junit.runner.JUnitCore.run(JUnitCore.java:137) > at > com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTest
[GitHub] [flink] flinkbot commented on issue #10576: [FLINK-15168][table-planner] Fix physical indices computing.
flinkbot commented on issue #10576: [FLINK-15168][table-planner] Fix physical indices computing. URL: https://github.com/apache/flink/pull/10576#issuecomment-565625223 ## CI report: * 2e2ecdaa06d1efa17ceed187ccb310168aadb5e1 UNKNOWN
[GitHub] [flink] bowenli86 commented on issue #10577: [FLINK-15256][hive] HiveModuleFactory should take hive-version as required supported property
bowenli86 commented on issue #10577: [FLINK-15256][hive] HiveModuleFactory should take hive-version as required supported property URL: https://github.com/apache/flink/pull/10577#issuecomment-565625092 thanks @zjuwangg for reporting this bug! cc @xuefuz @lirui-apache @JingsongLi
[GitHub] [flink] flinkbot edited a comment on issue #10575: [FLINK-15234] hive table created from flink catalog table cannot have null properties in parameters
flinkbot edited a comment on issue #10575: [FLINK-15234] hive table created from flink catalog table cannot have null properties in parameters URL: https://github.com/apache/flink/pull/10575#issuecomment-565607933 ## CI report: * caf8475a447e2e685d7d26ba0630094f460bffe8 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/141014521) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3588)
[jira] [Updated] (FLINK-15258) HiveModuleFactory should take hive-version as required supported property
[ https://issues.apache.org/jira/browse/FLINK-15258?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Bowen Li updated FLINK-15258: - Summary: HiveModuleFactory should take hive-version as required supported property (was: HiveModuleFactory doesn't take hive-version) > HiveModuleFactory should take hive-version as required supported property > - > > Key: FLINK-15258 > URL: https://issues.apache.org/jira/browse/FLINK-15258 > Project: Flink > Issue Type: Bug > Components: Connectors / Hive >Affects Versions: 1.10.0 >Reporter: Bowen Li >Assignee: Bowen Li >Priority: Blocker > Fix For: 1.10.0, 1.11.0 > > > HiveModuleFactory should have hive-version as supported property
[GitHub] [flink] bowenli86 opened a new pull request #10577: [FLINK-15256][hive] HiveModuleFactory should take hive-version as required supported property
bowenli86 opened a new pull request #10577: [FLINK-15256][hive] HiveModuleFactory should take hive-version as required supported property URL: https://github.com/apache/flink/pull/10577 ## What is the purpose of the change HiveModuleFactory should have hive-version as a supported property ## Brief change log - added 'hive-version' as a required supported property - added UT ## Verifying this change This change added tests and can be verified by `HiveModuleFactoryTest` ## Does this pull request potentially affect one of the following parts: n/a ## Documentation n/a
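[Editor's note] The shape of this change — a module factory advertising "hive-version" in its list of supported properties so that property validation during factory discovery accepts it — can be sketched as below. The class mirrors, but is not, Flink's actual HiveModuleFactory; the names are illustrative:

```java
import java.util.ArrayList;
import java.util.List;

// Hedged sketch of a factory that lists "hive-version" as a supported
// property. In Flink's table factory discovery, any user-supplied property
// not in this list would be rejected as unsupported; the names here are
// illustrative, not the real API.
public class HiveModuleFactorySketch {
    public static final String HIVE_VERSION = "hive-version";

    public List<String> supportedProperties() {
        List<String> properties = new ArrayList<>();
        properties.add(HIVE_VERSION);
        return properties;
    }
}
```

Without such an entry, loading the module with an explicit hive-version would fail validation even though the version is exactly what the module needs.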
[jira] [Comment Edited] (FLINK-15234) hive table created from flink catalog table cannot have null properties in parameters
[ https://issues.apache.org/jira/browse/FLINK-15234?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16995786#comment-16995786 ] Bowen Li edited comment on FLINK-15234 at 12/13/19 9:55 PM: [~twalthr] sorry, didn't have time since it's too late last night. description added. thanks was (Author: phoenixjiangnan): [~twalthr] didn't have time since it's too late last night. description added. thanks > hive table created from flink catalog table cannot have null properties in > parameters > - > > Key: FLINK-15234 > URL: https://issues.apache.org/jira/browse/FLINK-15234 > Project: Flink > Issue Type: Bug > Components: Connectors / Hive >Reporter: Bowen Li >Assignee: Bowen Li >Priority: Blocker > Labels: pull-request-available > Fix For: 1.10.0, 1.11.0 > > Time Spent: 20m > Remaining Estimate: 0h > > we store comment of a catalog table in Hive table's parameters. When it's > null, we put a <"comment", null> k-v in the parameters. Hive table doesn't > take null in its params.
[jira] [Created] (FLINK-15258) HiveModuleFactory doesn't take hive-version
Bowen Li created FLINK-15258: Summary: HiveModuleFactory doesn't take hive-version Key: FLINK-15258 URL: https://issues.apache.org/jira/browse/FLINK-15258 Project: Flink Issue Type: Bug Components: Connectors / Hive Affects Versions: 1.10.0 Reporter: Bowen Li Assignee: Bowen Li Fix For: 1.10.0, 1.11.0 HiveModuleFactory should have hive-version as supported property
[GitHub] [flink] flinkbot commented on issue #10576: [FLINK-15168][table-planner] Fix physical indices computing.
flinkbot commented on issue #10576: [FLINK-15168][table-planner] Fix physical indices computing. URL: https://github.com/apache/flink/pull/10576#issuecomment-565617576 Thanks a lot for your contribution to the Apache Flink project. I'm the @flinkbot. I help the community to review your pull request. We will use this comment to track the progress of the review. ## Automated Checks Last check on commit 2e2ecdaa06d1efa17ceed187ccb310168aadb5e1 (Fri Dec 13 21:37:30 UTC 2019) **Warnings:** * No documentation files were touched! Remember to keep the Flink docs up to date! Mention the bot in a comment to re-run the automated checks.
[jira] [Updated] (FLINK-15168) Exception is thrown when using kafka source connector with flink planner
[ https://issues.apache.org/jira/browse/FLINK-15168?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] ASF GitHub Bot updated FLINK-15168: --- Labels: pull-request-available (was: )
> Exception is thrown when using kafka source connector with flink planner
>
> Key: FLINK-15168
> URL: https://issues.apache.org/jira/browse/FLINK-15168
> Project: Flink
> Issue Type: Bug
> Components: Table SQL / Legacy Planner
> Affects Versions: 1.10.0
> Reporter: Huang Xingbo
> Assignee: Dawid Wysakowicz
> Priority: Blocker
> Labels: pull-request-available
> Fix For: 1.10.0
>
> When running the following case using kafka as source connector in flink planner, we will get a RuntimeException:
> {code:java}
> StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
> env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime);
> env.setParallelism(1);
>
> StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);
>
> tEnv.connect(new Kafka()
>         .version("0.11")
>         .topic("user")
>         .startFromEarliest()
>         .property("zookeeper.connect", "localhost:2181")
>         .property("bootstrap.servers", "localhost:9092"))
>     .withFormat(new Json()
>         .failOnMissingField(true)
>         .jsonSchema("{" +
>             "  type: 'object'," +
>             "  properties: {" +
>             "    a: {" +
>             "      type: 'string'" +
>             "    }," +
>             "    b: {" +
>             "      type: 'string'" +
>             "    }," +
>             "    c: {" +
>             "      type: 'string'" +
>             "    }," +
>             "    time: {" +
>             "      type: 'string'," +
>             "      format: 'date-time'" +
>             "    }" +
>             "  }" +
>             "}"))
>     .withSchema(new Schema()
>         .field("rowtime", Types.SQL_TIMESTAMP)
>         .rowtime(new Rowtime()
>             .timestampsFromField("time")
>             .watermarksPeriodicBounded(6))
>         .field("a", Types.STRING)
>         .field("b", Types.STRING)
>         .field("c", Types.STRING))
>     .inAppendMode()
>     .registerTableSource("source");
>
> Table t = tEnv.scan("source").select("a");
> tEnv.toAppendStream(t, Row.class).print();
> tEnv.execute("test");
> {code}
> The RuntimeException detail:
> {code:java}
> Exception in thread "main" java.lang.RuntimeException: Error while applying rule PushProjectIntoTableSourceScanRule, args [rel#26:FlinkLogicalCalc.LOGICAL(input=RelSubset#25,expr#0..3={inputs},a=$t1), Scan(table:[default_catalog, default_database, source], fields:(rowtime, a, b, c), source:Kafka011TableSource(rowtime, a, b, c))]
>     at org.apache.calcite.plan.volcano.VolcanoRuleCall.onMatch(VolcanoRuleCall.java:235)
>     at org.apache.calcite.plan.volcano.VolcanoPlanner.findBestExp(VolcanoPlanner.java:631)
>     at org.apache.calcite.tools.Programs$RuleSetProgram.run(Programs.java:327)
>     at org.apache.flink.table.plan.Optimizer.runVolcanoPlanner(Optimizer.scala:280)
>     at org.apache.flink.table.plan.Optimizer.optimizeLogicalPlan(Optimizer.scala:199)
>     at org.apache.flink.table.plan.StreamOptimizer.optimize(StreamOptimizer.scala:66)
>     at org.apache.flink.table.planner.StreamPlanner.translateToType(StreamPlanner.scala:389)
>     at org.apache.flink.table.planner.StreamPlanner.org$apache$flink$table$planner$StreamPlanner$$translate(StreamPlanner.scala:180)
>     at org.apache.flink.table.planner.StreamPlanner$$anonfun$translate$1.apply(StreamPlanner.scala:117)
>     at org.apache.flink.table.planner.StreamPlanner$$anonfun$translate$1.apply(StreamPlanner.scala:117)
>     at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
>     at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
>     at scala.collection.Iterator$class.foreach(Iterator.scala:891)
>     at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
>     at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
>     at scala.collection.AbstractIterable.forea
[GitHub] [flink] bowenli86 commented on issue #10566: [FLINK-15234][hive] hive table created from flink catalog table cannot have null properties in parameters
bowenli86 commented on issue #10566: [FLINK-15234][hive] hive table created from flink catalog table cannot have null properties in parameters URL: https://github.com/apache/flink/pull/10566#issuecomment-565616888 @xuefuz please take a look at #10575 This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] dawidwys commented on issue #10576: [FLINK-15168][table-planner] Fix physical indices computing.
dawidwys commented on issue #10576: [FLINK-15168][table-planner] Fix physical indices computing. URL: https://github.com/apache/flink/pull/10576#issuecomment-565616950 cc @wuchong
[GitHub] [flink] dawidwys opened a new pull request #10576: [FLINK-15168][table-planner] Fix physical indices computing.
dawidwys opened a new pull request #10576: [FLINK-15168][table-planner] Fix physical indices computing. URL: https://github.com/apache/flink/pull/10576 ## What is the purpose of the change Starting from this PR we use the schema that comes from the CatalogTable instead of the schema that comes from the TableSource. Computing physical indices is based on the new type hierarchy instead of TypeInformation. ## Verifying this change This change added tests. ## Does this pull request potentially affect one of the following parts: - Dependencies (does it add or upgrade a dependency): (yes / **no**) - The public API, i.e., is any changed class annotated with `@Public(Evolving)`: (**yes** / no) - The serializers: (yes / **no** / don't know) - The runtime per-record code paths (performance sensitive): (yes / **no** / don't know) - Anything that affects deployment or recovery: JobManager (and its components), Checkpointing, Yarn/Mesos, ZooKeeper: (yes / **no** / don't know) - The S3 file system connector: (yes / **no** / don't know) ## Documentation - Does this pull request introduce a new feature? (yes / **no**) - If yes, how is the feature documented? (**not applicable** / docs / JavaDocs / not documented)
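The index-computing idea described in this PR can be sketched as follows. This is a hedged illustration only, not Flink's actual implementation: the class and method names here are hypothetical, and the real fix maps the CatalogTable schema onto the physical type produced by the TableSource, with time attributes (such as a `rowtime` field) having no physical counterpart.

```java
import java.util.Arrays;
import java.util.List;

public class PhysicalIndicesSketch {
    // For each field of the logical (catalog) schema, find its index in the
    // physical fields the source actually produces. Fields with no physical
    // counterpart (e.g. a generated rowtime attribute) map to -1.
    public static int[] computePhysicalIndices(String[] logicalFields, List<String> physicalFields) {
        int[] indices = new int[logicalFields.length];
        for (int i = 0; i < logicalFields.length; i++) {
            indices[i] = physicalFields.indexOf(logicalFields[i]);
        }
        return indices;
    }

    public static void main(String[] args) {
        // Schema from FLINK-15168: logical (rowtime, a, b, c), physical (a, b, c, time)
        int[] idx = computePhysicalIndices(
                new String[]{"rowtime", "a", "b", "c"},
                Arrays.asList("a", "b", "c", "time"));
        System.out.println(Arrays.toString(idx)); // prints [-1, 0, 1, 2]
    }
}
```

The point is that indices must be computed against the source's physical type, not the logical schema, which is where the legacy-planner bug reported in FLINK-15168 surfaced.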
[GitHub] [flink] flinkbot edited a comment on issue #10575: [FLINK-15234] hive table created from flink catalog table cannot have null properties in parameters
flinkbot edited a comment on issue #10575: [FLINK-15234] hive table created from flink catalog table cannot have null properties in parameters URL: https://github.com/apache/flink/pull/10575#issuecomment-565607933 ## CI report: * caf8475a447e2e685d7d26ba0630094f460bffe8 Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/141014521) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3588) Bot commands The @flinkbot bot supports the following commands: - `@flinkbot run travis` re-run the last Travis build - `@flinkbot run azure` re-run the last Azure build
[GitHub] [flink] flinkbot edited a comment on issue #10566: [FLINK-15234][hive] hive table created from flink catalog table cannot have null properties in parameters
flinkbot edited a comment on issue #10566: [FLINK-15234][hive] hive table created from flink catalog table cannot have null properties in parameters URL: https://github.com/apache/flink/pull/10566#issuecomment-565366031 ## CI report: * efd6f6b0a46a29a231c4d4e9c3f53e84b10e950b Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/140921327) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3559) * 4d0bf50014c0014a617f83d9fdc1b740d2948905 Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/141004371) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3586) * b3de0165b172a21c25349ab2a3baf9dbb9f654b6 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/141008032) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3587)
[jira] [Created] (FLINK-15257) convert HiveCatalogITCase.testCsvTableViaAPI() to use blink planner
Bowen Li created FLINK-15257: Summary: convert HiveCatalogITCase.testCsvTableViaAPI() to use blink planner Key: FLINK-15257 URL: https://issues.apache.org/jira/browse/FLINK-15257 Project: Flink Issue Type: Task Components: Connectors / Hive, Tests Reporter: Bowen Li Assignee: Terry Wang Fix For: 1.10.0, 1.11.0 -- This message was sent by Atlassian Jira (v8.3.4#803005)
[GitHub] [flink] bowenli86 closed pull request #10566: [FLINK-15234][hive] hive table created from flink catalog table cannot have null properties in parameters
bowenli86 closed pull request #10566: [FLINK-15234][hive] hive table created from flink catalog table cannot have null properties in parameters URL: https://github.com/apache/flink/pull/10566
[GitHub] [flink] bowenli86 commented on issue #10566: [FLINK-15234][hive] hive table created from flink catalog table cannot have null properties in parameters
bowenli86 commented on issue #10566: [FLINK-15234][hive] hive table created from flink catalog table cannot have null properties in parameters URL: https://github.com/apache/flink/pull/10566#issuecomment-565614162 close this in favor of https://github.com/apache/flink/pull/10575
[GitHub] [flink] flinkbot commented on issue #10575: [FLINK-15234] hive table created from flink catalog table cannot have null properties in parameters
flinkbot commented on issue #10575: [FLINK-15234] hive table created from flink catalog table cannot have null properties in parameters URL: https://github.com/apache/flink/pull/10575#issuecomment-565607933 ## CI report: * caf8475a447e2e685d7d26ba0630094f460bffe8 UNKNOWN
[GitHub] [flink] flinkbot edited a comment on issue #10566: [FLINK-15234][hive] hive table created from flink catalog table cannot have null properties in parameters
flinkbot edited a comment on issue #10566: [FLINK-15234][hive] hive table created from flink catalog table cannot have null properties in parameters URL: https://github.com/apache/flink/pull/10566#issuecomment-565366031 ## CI report: * efd6f6b0a46a29a231c4d4e9c3f53e84b10e950b Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/140921327) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3559) * 4d0bf50014c0014a617f83d9fdc1b740d2948905 Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/141004371) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3586) * b3de0165b172a21c25349ab2a3baf9dbb9f654b6 Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/141008032) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3587)
[GitHub] [flink] flinkbot edited a comment on issue #10566: [FLINK-15234][hive] hive table created from flink catalog table cannot have null properties in parameters
flinkbot edited a comment on issue #10566: [FLINK-15234][hive] hive table created from flink catalog table cannot have null properties in parameters URL: https://github.com/apache/flink/pull/10566#issuecomment-565366031 ## CI report: * efd6f6b0a46a29a231c4d4e9c3f53e84b10e950b Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/140921327) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3559) * 4d0bf50014c0014a617f83d9fdc1b740d2948905 Travis: [CANCELED](https://travis-ci.com/flink-ci/flink/builds/141004371) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3586) * b3de0165b172a21c25349ab2a3baf9dbb9f654b6 Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/141008032) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3587)
[GitHub] [flink] flinkbot commented on issue #10575: [FLINK-15234] hive table created from flink catalog table cannot have null properties in parameters
flinkbot commented on issue #10575: [FLINK-15234] hive table created from flink catalog table cannot have null properties in parameters URL: https://github.com/apache/flink/pull/10575#issuecomment-565597747 Thanks a lot for your contribution to the Apache Flink project. I'm the @flinkbot. I help the community to review your pull request. We will use this comment to track the progress of the review. ## Automated Checks Last check on commit caf8475a447e2e685d7d26ba0630094f460bffe8 (Fri Dec 13 20:29:59 UTC 2019) **Warnings:** * No documentation files were touched! Remember to keep the Flink docs up to date! * **Invalid pull request title: No valid Jira ID provided** Mention the bot in a comment to re-run the automated checks. ## Review Progress * ❓ 1. The [description] looks good. * ❓ 2. There is [consensus] that the contribution should go into Flink. * ❓ 3. Needs [attention] from. * ❓ 4. The change fits into the overall [architecture]. * ❓ 5. Overall code [quality] is good. Please see the [Pull Request Review Guide](https://flink.apache.org/contributing/reviewing-prs.html) for a full explanation of the review process. The Bot is tracking the review progress through labels. Labels are applied according to the order of the review items. For consensus, approval by a Flink committer or PMC member is required Bot commands The @flinkbot bot supports the following commands: - `@flinkbot approve description` to approve one or more aspects (aspects: `description`, `consensus`, `architecture` and `quality`) - `@flinkbot approve all` to approve all aspects - `@flinkbot approve-until architecture` to approve everything until `architecture` - `@flinkbot attention @username1 [@username2 ..]` to require somebody's attention - `@flinkbot disapprove architecture` to remove an approval you gave earlier
[GitHub] [flink] bowenli86 commented on issue #10575: [FLINK-15234] hive table created from flink catalog table cannot have null properties in parameters
bowenli86 commented on issue #10575: [FLINK-15234] hive table created from flink catalog table cannot have null properties in parameters URL: https://github.com/apache/flink/pull/10575#issuecomment-565597426 @xuefuz @lirui-apache @JingsongLi @zjuwangg
[GitHub] [flink] bowenli86 opened a new pull request #10575: Hive2
bowenli86 opened a new pull request #10575: Hive2 URL: https://github.com/apache/flink/pull/10575 ## What is the purpose of the change Discovered a couple of bugs in HiveCatalog; the related code seems to have been changed in ways that introduced bugs, which we didn't catch due to a lack of test coverage. Fix them now and add more UT and IT. [FLINK-15234] hive table created from flink catalog table cannot have null properties in parameters - this one is when a user-created table doesn't have a comment (null), we still put it into a hive table's params and HMS reports an error. Solution: don't put 'comment' into the hive table params if its value is null. [FLINK-15240] is_generic key is missing for Flink table stored in HiveCatalog - the is_generic key is missing for generic tables stored in HiveCatalog. The expected behavior is: - When creating a table, a hive table needs to explicitly have a key is_generic = false; otherwise, this is a generic table if 1) the key is missing 2) is_generic = true - When retrieving a table, a generic table needs to explicitly have a key is_generic = true; otherwise, this is a Hive table if 1) the key is missing 2) is_generic = false ## Brief change log fix bugs mentioned above, added more UT and IT ## Verifying this change This change added tests and can be verified by `HiveCatalogTest` and `HiveCatalogITCase` ## Does this pull request potentially affect one of the following parts: n/a ## Documentation - Does this pull request introduce a new feature? (no) - If yes, how is the feature documented? (docs / JavaDocs) docs will be updated later
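The two fixes described in this PR can be sketched as follows. This is a hedged illustration, not the actual `HiveCatalog.instantiateHiveTable` code: the `buildHiveTableParameters` helper is hypothetical, the literal keys `"comment"` and `"is_generic"` stand in for `HiveCatalogConfig.COMMENT` and `CatalogConfig.IS_GENERIC`, and the Flink-internal `maskFlinkProperties` step is omitted.

```java
import java.util.HashMap;
import java.util.Map;

public class HiveParamsSketch {
    // Build the parameter map stored on the Hive table. Two rules from the PR:
    // 1) never put a null comment into the params (HMS rejects null values);
    // 2) a table is generic unless it explicitly carries is_generic = false,
    //    and generic tables are stored with an explicit is_generic = true.
    public static Map<String, String> buildHiveTableParameters(String comment, Map<String, String> props) {
        Map<String, String> parameters = new HashMap<>(props);
        if (comment != null) {
            parameters.put("comment", comment);
        }
        String isGeneric = parameters.get("is_generic");
        if (isGeneric == null || Boolean.parseBoolean(isGeneric)) {
            parameters.put("is_generic", String.valueOf(true));
        }
        return parameters;
    }

    public static void main(String[] args) {
        Map<String, String> p = buildHiveTableParameters(null, new HashMap<>());
        System.out.println(p.containsKey("comment")); // prints false
        System.out.println(p.get("is_generic"));      // prints true
    }
}
```

Note the asymmetry the PR calls out: the "missing key means generic" default applies when writing a table, while reading a table applies the opposite default.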
[jira] [Created] (FLINK-15256) unable to drop table in HiveCatalogITCase
Bowen Li created FLINK-15256: Summary: unable to drop table in HiveCatalogITCase Key: FLINK-15256 URL: https://issues.apache.org/jira/browse/FLINK-15256 Project: Flink Issue Type: Bug Components: Connectors / Hive Reporter: Bowen Li Assignee: Terry Wang Fix For: 1.10.0, 1.11.0

{code:java}
@Test
public void testCsvTableViaSQL() throws Exception {
	EnvironmentSettings settings = EnvironmentSettings.newInstance().useBlinkPlanner().inBatchMode().build();
	TableEnvironment tableEnv = TableEnvironment.create(settings);

	tableEnv.registerCatalog("myhive", hiveCatalog);
	tableEnv.useCatalog("myhive");

	String path = this.getClass().getResource("/csv/test.csv").getPath();

	tableEnv.sqlUpdate("create table test2 (name String, age Int) with (\n" +
		" 'connector.type' = 'filesystem',\n" +
		" 'connector.path' = 'file://" + path + "',\n" +
		" 'format.type' = 'csv'\n" +
		")");

	Table t = tableEnv.sqlQuery("SELECT * FROM myhive.`default`.test2");
	List result = TableUtils.collectToList(t);

	// assert query result
	assertEquals(
		new HashSet<>(Arrays.asList(
			Row.of("1", 1),
			Row.of("2", 2),
			Row.of("3", 3))),
		new HashSet<>(result)
	);

	tableEnv.sqlUpdate("drop table myhive.`default`.tests2");
}
{code}

The last drop table statement reports error as:

{code:java}
org.apache.flink.table.api.ValidationException: Could not execute DropTable in path `myhive`.`default`.`tests2`
	at org.apache.flink.table.catalog.CatalogManager.execute(CatalogManager.java:568)
	at org.apache.flink.table.catalog.CatalogManager.dropTable(CatalogManager.java:543)
	at org.apache.flink.table.api.internal.TableEnvironmentImpl.sqlUpdate(TableEnvironmentImpl.java:519)
	at org.apache.flink.table.catalog.hive.HiveCatalogITCase.testCsvTableViaSQL(HiveCatalogITCase.java:123)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
	at org.junit.rules.RunRules.evaluate(RunRules.java:20)
	at org.apache.flink.connectors.hive.FlinkStandaloneHiveRunner.runTestMethod(FlinkStandaloneHiveRunner.java:169)
	at org.apache.flink.connectors.hive.FlinkStandaloneHiveRunner.runChild(FlinkStandaloneHiveRunner.java:154)
	at org.apache.flink.connectors.hive.FlinkStandaloneHiveRunner.runChild(FlinkStandaloneHiveRunner.java:92)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
	at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
	at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
	at org.junit.rules.RunRules.evaluate(RunRules.java:20)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
	at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
	at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
	at com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:47)
	at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
	at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)
Caused by: org.apache.f
[jira] [Created] (FLINK-15255) document how to create Hive table from java API and DDL
Bowen Li created FLINK-15255: Summary: document how to create Hive table from java API and DDL Key: FLINK-15255 URL: https://issues.apache.org/jira/browse/FLINK-15255 Project: Flink Issue Type: Task Components: Documentation Reporter: Bowen Li Assignee: Bowen Li Fix For: 1.10.0, 1.11.0
[jira] [Updated] (FLINK-15240) is_generic key is missing for Flink table stored in HiveCatalog
[ https://issues.apache.org/jira/browse/FLINK-15240?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Bowen Li updated FLINK-15240: - Priority: Blocker (was: Major) > is_generic key is missing for Flink table stored in HiveCatalog > --- > > Key: FLINK-15240 > URL: https://issues.apache.org/jira/browse/FLINK-15240 > Project: Flink > Issue Type: Bug > Components: Connectors / Hive >Reporter: Bowen Li >Assignee: Bowen Li >Priority: Blocker > Fix For: 1.10.0, 1.11.0
[jira] [Updated] (FLINK-15234) hive table created from flink catalog table cannot have null properties in parameters
[ https://issues.apache.org/jira/browse/FLINK-15234?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Bowen Li updated FLINK-15234: - Priority: Blocker (was: Critical) > hive table created from flink catalog table cannot have null properties in > parameters > - > > Key: FLINK-15234 > URL: https://issues.apache.org/jira/browse/FLINK-15234 > Project: Flink > Issue Type: Bug > Components: Connectors / Hive >Reporter: Bowen Li >Assignee: Bowen Li >Priority: Blocker > Labels: pull-request-available > Fix For: 1.10.0, 1.11.0 > > Time Spent: 10m > Remaining Estimate: 0h > > We store the comment of a catalog table in the Hive table's parameters. When it's > null, we put a <"comment", null> k-v pair in the parameters, but a Hive table doesn't > accept null in its params.
[jira] [Created] (FLINK-15254) hive module cannot be named "hive"
Bowen Li created FLINK-15254: Summary: hive module cannot be named "hive" Key: FLINK-15254 URL: https://issues.apache.org/jira/browse/FLINK-15254 Project: Flink Issue Type: Test Components: Connectors / Hive Reporter: Bowen Li Assignee: Bowen Li Fix For: 1.10.0, 1.11.0 there seems to be a problem when a hive module is named "hive" and the module cannot be loaded/used properly. reported by [~Terry1897]
[GitHub] [flink] flinkbot edited a comment on issue #10566: [FLINK-15234][hive] hive table created from flink catalog table cannot have null properties in parameters
flinkbot edited a comment on issue #10566: [FLINK-15234][hive] hive table created from flink catalog table cannot have null properties in parameters URL: https://github.com/apache/flink/pull/10566#issuecomment-565366031 ## CI report: * efd6f6b0a46a29a231c4d4e9c3f53e84b10e950b Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/140921327) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3559) * 4d0bf50014c0014a617f83d9fdc1b740d2948905 Travis: [PENDING](https://travis-ci.com/flink-ci/flink/builds/141004371) Azure: [PENDING](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3586) * b3de0165b172a21c25349ab2a3baf9dbb9f654b6 UNKNOWN
[jira] [Commented] (FLINK-15226) Running job with application parameters fails on a job cluster
[ https://issues.apache.org/jira/browse/FLINK-15226?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16995863#comment-16995863 ] Biju Nair commented on FLINK-15226: --- Passing in the {{-configDir}} before the application parameters is the temporary solution we are using to proceed further. It would be better to do the same in the Flink jobmanager script, i.e. adding it before the application parameters like [here|https://github.com/apache/flink/blob/d43a1dd397dbc7c0559a07b83380fed164114241/flink-dist/src/main/flink-bin/bin/taskmanager.sh#L25] instead of where it is appended [now|https://github.com/apache/flink/blob/d43a1dd397dbc7c0559a07b83380fed164114241/flink-dist/src/main/flink-bin/bin/taskmanager.sh#L73]. Not sure whether the change will have other side effects.
> Running job with application parameters fails on a job cluster
> --
>
> Key: FLINK-15226
> URL: https://issues.apache.org/jira/browse/FLINK-15226
> Project: Flink
> Issue Type: Bug
> Components: Runtime / Task
> Reporter: Biju Nair
> Priority: Minor
>
> Trying to move a job which takes in application parameters running on a session cluster to a job cluster, and it fails with the following error in the task manager:
> {noformat}
> 2019-11-23 01:29:16,498 ERROR org.apache.flink.runtime.taskexecutor.TaskManagerRunner - Could not parse the command line options.
> org.apache.flink.runtime.entrypoint.FlinkParseException: Failed to parse the command line arguments.
> at org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:52)
> at org.apache.flink.runtime.taskexecutor.TaskManagerRunner.loadConfiguration(TaskManagerRunner.java:315)
> at org.apache.flink.runtime.taskexecutor.TaskManagerRunner.main(TaskManagerRunner.java:284)
> Caused by: org.apache.commons.cli.MissingOptionException: Missing required option: c
> at org.apache.commons.cli.DefaultParser.checkRequiredOptions(DefaultParser.java:199)
> at org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:130)
> at org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:81)
> at org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:50)
> ... 2 more
> usage: TaskManagerRunner -c [-D ]
> -c,--configDir Directory which contains the configuration file flink-conf.yml.
> -D use value for given property
> Exception in thread "main" org.apache.flink.runtime.entrypoint.FlinkParseException: Failed to parse the command line arguments.
> at org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:52)
> at org.apache.flink.runtime.taskexecutor.TaskManagerRunner.loadConfiguration(TaskManagerRunner.java:315)
> at org.apache.flink.runtime.taskexecutor.TaskManagerRunner.main(TaskManagerRunner.java:284)
> Caused by: org.apache.commons.cli.MissingOptionException: Missing required option: c
> at org.apache.commons.cli.DefaultParser.checkRequiredOptions(DefaultParser.java:199)
> at org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:130)
> at org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:81)
> at org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:50)
> ...
2 more{noformat}
> Looking at the code, this may be due to adding the {{configDir}} parameter at the end in the \{{taskManagers.sh}} script [here|https://github.com/apache/flink/blob/d43a1dd397dbc7c0559a07b83380fed164114241/flink-dist/src/main/flink-bin/bin/taskmanager.sh#L73]. Based on the commandline parser [logic|https://github.com/apache/flink/blob/6258a4c333ce9dba914621b13eac2f7d91f5cb72/flink-runtime/src/main/java/org/apache/flink/runtime/entrypoint/parser/CommandLineParser.java#L50], parsing will stop once a parameter without {{-D}} or {{-configDir/-c}} is encountered, which in this case is true. If this diagnosis is correct, can the {{configDir}} parameter be added to the beginning instead of the end of the args list in the job manager shell script? Thanks for your input in advance.
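The "stop at the first unrecognized argument" behavior described in the comment above can be sketched as follows. This is a hedged, hypothetical illustration, not Flink's actual CommandLineParser or Commons CLI: it only shows why a required option appended after the application parameters is never seen.

```java
import java.util.ArrayList;
import java.util.List;

public class StopAtPositionalDemo {
    // Consume recognized options (-D..., --configDir/-c with its value) only
    // until the first token that is neither; that token and everything after
    // it are treated as application parameters, so options placed after them
    // are never parsed.
    public static List<String> parseOptions(String[] args) {
        List<String> options = new ArrayList<>();
        for (int i = 0; i < args.length; i++) {
            String a = args[i];
            if (a.startsWith("-D")) {
                options.add(a);
            } else if (a.equals("--configDir") || a.equals("-c")) {
                options.add(a);
                if (i + 1 < args.length) {
                    options.add(args[++i]); // consume the option's value
                }
            } else {
                break; // first application parameter stops option parsing
            }
        }
        return options;
    }

    public static void main(String[] args) {
        // configDir before the app args: the option is picked up
        System.out.println(parseOptions(new String[]{"--configDir", "/conf", "appArg1"}));
        // configDir after the app args: parsing stops immediately, option lost
        System.out.println(parseOptions(new String[]{"appArg1", "--configDir", "/conf"}));
    }
}
```

This matches the reported failure: with the script appending {{configDir}} after the application parameters, the required {{-c}} option is never consumed and the parser raises MissingOptionException.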
[GitHub] [flink] flinkbot edited a comment on issue #10566: [FLINK-15234][hive] hive table created from flink catalog table cannot have null properties in parameters
flinkbot edited a comment on issue #10566: [FLINK-15234][hive] hive table created from flink catalog table cannot have null properties in parameters URL: https://github.com/apache/flink/pull/10566#issuecomment-565366031 ## CI report: * efd6f6b0a46a29a231c4d4e9c3f53e84b10e950b Travis: [FAILURE](https://travis-ci.com/flink-ci/flink/builds/140921327) Azure: [FAILURE](https://dev.azure.com/rmetzger/5bd3ef0a-4359-41af-abca-811b04098d2e/_build/results?buildId=3559) * 4d0bf50014c0014a617f83d9fdc1b740d2948905 UNKNOWN
[GitHub] [flink] xuefuz commented on a change in pull request #10566: [FLINK-15234][hive] hive table created from flink catalog table cannot have null properties in parameters
xuefuz commented on a change in pull request #10566: [FLINK-15234][hive] hive table created from flink catalog table cannot have null properties in parameters URL: https://github.com/apache/flink/pull/10566#discussion_r357791645 ## File path: flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/catalog/hive/HiveCatalog.java ## @@ -559,24 +562,34 @@ private CatalogBaseTable instantiateCatalogTable(Table hiveTable, HiveConf hiveC } } - private static Table instantiateHiveTable(ObjectPath tablePath, CatalogBaseTable table) { + @VisibleForTesting + protected static Table instantiateHiveTable(ObjectPath tablePath, CatalogBaseTable table) { // let Hive set default parameters for us, e.g. serialization.format Table hiveTable = org.apache.hadoop.hive.ql.metadata.Table.getEmptyTable(tablePath.getDatabaseName(), tablePath.getObjectName()); hiveTable.setCreateTime((int) (System.currentTimeMillis() / 1000)); Map properties = new HashMap<>(table.getProperties()); // Table comment - properties.put(HiveCatalogConfig.COMMENT, table.getComment()); - - boolean isGeneric = Boolean.valueOf(properties.get(CatalogConfig.IS_GENERIC)); + if (table.getComment() != null) { + properties.put(HiveCatalogConfig.COMMENT, table.getComment()); + } - if (isGeneric) { + // When creating a table, A hive table needs explicitly have a key is_generic = false + // otherwise, this is a generic table if 1) the key is missing 2) is_generic = true + // this is opposite to reading a table and instantiating a CatalogTable. See instantiateCatalogTable() + String key = properties.get(CatalogConfig.IS_GENERIC); + if (key == null || Boolean.valueOf(key)) { properties = maskFlinkProperties(properties); + properties.put(CatalogConfig.IS_GENERIC, String.valueOf(true)); } + Review comment: Name "key" might be misleading, as it actually is the value of a potential key. "isGeneric" seems to be a better name.
[GitHub] [flink] bowenli86 commented on a change in pull request #10543: [FLINK-15128][hive][doc] Document support for Hive timestamp type
bowenli86 commented on a change in pull request #10543: [FLINK-15128][hive][doc] Document support for Hive timestamp type URL: https://github.com/apache/flink/pull/10543#discussion_r357784169 ## File path: docs/dev/table/hive/index.md ## @@ -303,5 +303,6 @@ Currently `HiveCatalog` supports most Flink data types with the following mappin Note that: * Flink doesn't support Hive's `UNION` type is not supported +* Hive's `TIMESTAMP` always has precision 9 and doesn't support other precisions. As a result, `HiveCatalog` cannot store `TIMESTAMP` columns whose precisions are not 9. Hive UDFs, on the other hand, can process `TIMESTAMP` values with a precision <= 9. Review comment: > As a result, `HiveCatalog` cannot store `TIMESTAMP` columns whose precisions are not 9 sorry if I misunderstand it as the conversation has been quite complex so far - should be ok to store precision less or equal to 9, right? if it's say 6, users don't lose any data, we can just append more units to the original data, no?
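The reviewer's point about widening precision can be illustrated as follows. This is a hedged sketch, not Flink or Hive code; the `padTo9` helper is hypothetical and simply shows that a timestamp string with fractional-second precision <= 9 can be padded with trailing zeros to precision 9 without losing data.

```java
public class TimestampPrecisionDemo {
    // Widen a "yyyy-MM-dd HH:mm:ss[.fraction]" string to fractional
    // precision 9 by appending zeros; the instant it denotes is unchanged.
    public static String padTo9(String timestamp) {
        int dot = timestamp.indexOf('.');
        String base = dot < 0 ? timestamp : timestamp.substring(0, dot);
        String fraction = dot < 0 ? "" : timestamp.substring(dot + 1);
        StringBuilder padded = new StringBuilder(fraction);
        while (padded.length() < 9) {
            padded.append('0'); // appending zeros adds no information
        }
        return base + "." + padded;
    }

    public static void main(String[] args) {
        // precision 6 widened to precision 9
        System.out.println(padTo9("2019-12-13 20:29:59.123456"));
        // prints 2019-12-13 20:29:59.123456000
    }
}
```

The reverse direction is the lossy one: truncating a precision-9 value down to a lower precision would drop fractional digits, which is why storing only precision 9 and widening on write is safe for lower-precision inputs.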