[GitHub] [flink] lsyldliu commented on a diff in pull request #20361: [FLINK-27790][Table SQL / API] Port ADD JAR /SHOW JARS syntax implementation from SqlClient to TableEnvironment side

2022-08-04 Thread GitBox


lsyldliu commented on code in PR #20361:
URL: https://github.com/apache/flink/pull/20361#discussion_r938372682


##
flink-end-to-end-tests/flink-end-to-end-tests-sql/src/test/java/org/apache/flink/table/sql/codegen/SqlITCaseBase.java:
##
@@ -0,0 +1,178 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.sql.codegen;
+
+import org.apache.flink.api.common.time.Deadline;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.tests.util.TestUtils;
+import org.apache.flink.tests.util.flink.ClusterController;
+import org.apache.flink.tests.util.flink.FlinkResource;
+import org.apache.flink.tests.util.flink.FlinkResourceSetup;
+import org.apache.flink.tests.util.flink.LocalStandaloneFlinkResourceFactory;
+import org.apache.flink.util.TestLogger;
+
+import org.junit.Before;
+import org.junit.Rule;
+import org.junit.rules.TemporaryFolder;
+import org.junit.runner.RunWith;
+import org.junit.runners.Parameterized;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.File;
+import java.io.FileNotFoundException;
+import java.io.IOException;
+import java.net.URL;
+import java.nio.file.Files;
+import java.nio.file.Path;
+import java.time.Duration;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collection;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.UUID;
+
+import static org.assertj.core.api.Assertions.assertThat;
+import static org.junit.jupiter.api.Assertions.assertTrue;
+
+/** Base class for sql ITCase. */
+@RunWith(Parameterized.class)
+public abstract class SqlITCaseBase extends TestLogger {
+private static final Logger LOG = LoggerFactory.getLogger(SqlITCaseBase.class);
+
+@Parameterized.Parameters(name = "executionMode")
+public static Collection data() {
+return Arrays.asList("streaming", "batch");
+}
+
+@Rule public final FlinkResource flink;
+
+@Rule public final TemporaryFolder tmp = new TemporaryFolder();
+
+private final String executionMode;
+
+private Path result;
+
+protected static final Path SQL_TOOL_BOX_JAR = TestUtils.getResource(".*SqlToolbox.jar");
+
+public SqlITCaseBase(String executionMode, Configuration configuration) {
+this.executionMode = executionMode;
+this.flink =
+new LocalStandaloneFlinkResourceFactory()
+.create(
+FlinkResourceSetup.builder()
+.addConfiguration(configuration)
+.build());
+}
+
+@Before
+public void before() throws Exception {
+Path tmpPath = tmp.getRoot().toPath();
+LOG.info("The current temporary path: {}", tmpPath);
+this.result = tmpPath.resolve(String.format("result-%s", UUID.randomUUID()));
+}
+
+public void runAndCheckSQL(
+String sqlPath, Map varsMap, int resultSize, List resultItems)
+throws Exception {
+try (ClusterController clusterController = flink.startCluster(1)) {
+List sqlLines = initializeSqlLines(sqlPath, varsMap);
+
+executeSqlStatements(clusterController, sqlLines);
+
+// Wait until all the results flushed to the json file.
+LOG.info("Verify the json result.");
+checkJsonResultFile(resultSize, resultItems);
+LOG.info("The codegen SQL client test run successfully.");

Review Comment:
   ```suggestion
   LOG.info("The SQL client test run successfully.");
   ```



##
flink-end-to-end-tests/flink-end-to-end-tests-sql/src/test/java/org/apache/flink/table/sql/codegen/UsingRemoteJarITCaseBase.java:
##
@@ -0,0 +1,162 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the 

[GitHub] [flink] lsyldliu commented on a diff in pull request #20361: [FLINK-27790][Table SQL / API] Port ADD JAR /SHOW JARS syntax implementation from SqlClient to TableEnvironment side

2022-08-04 Thread GitBox


lsyldliu commented on code in PR #20361:
URL: https://github.com/apache/flink/pull/20361#discussion_r937512920


##
flink-end-to-end-tests/flink-end-to-end-tests-sql/src/test/resources/log4j2-test.properties:
##
@@ -18,7 +18,7 @@
 
 # Set root logger level to OFF to not flood build logs
 # set manually to INFO for debugging purposes
-rootLogger.level = OFF
+rootLogger.level = INFO

Review Comment:
   Please revert this change.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@flink.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [flink] lsyldliu commented on a diff in pull request #20361: [FLINK-27790][Table SQL / API] Port ADD JAR /SHOW JARS syntax implementation from SqlClient to TableEnvironment side

2022-08-04 Thread GitBox


lsyldliu commented on code in PR #20361:
URL: https://github.com/apache/flink/pull/20361#discussion_r937322762


##
flink-end-to-end-tests/flink-end-to-end-tests-sql/src/test/java/org/apache/flink/table/sql/codegen/SqlUdfBaseITCase.java:
##
@@ -0,0 +1,173 @@
+package org.apache.flink.table.sql.codegen;
+
+import org.apache.flink.api.common.time.Deadline;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.tests.util.TestUtils;
+import org.apache.flink.tests.util.flink.ClusterController;
+import org.apache.flink.tests.util.flink.FlinkResource;
+import org.apache.flink.tests.util.flink.FlinkResourceSetup;
+import org.apache.flink.tests.util.flink.LocalStandaloneFlinkResourceFactory;
+import org.apache.flink.util.TestLogger;
+
+import org.junit.Before;
+import org.junit.Rule;
+import org.junit.Test;
+import org.junit.rules.TemporaryFolder;
+import org.junit.runner.RunWith;
+import org.junit.runners.Parameterized;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.File;
+import java.io.FileNotFoundException;
+import java.io.IOException;
+import java.net.URL;
+import java.nio.file.Files;
+import java.nio.file.Path;
+import java.time.Duration;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collection;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+import static org.hamcrest.Matchers.arrayContainingInAnyOrder;
+import static org.junit.Assert.assertThat;
+import static org.junit.Assert.assertTrue;
+
+/** Base class for sql udf. */
+@RunWith(Parameterized.class)
+public abstract class SqlUdfBaseITCase extends TestLogger {
+private static final Logger LOG = LoggerFactory.getLogger(PlannerScalaFreeITCase.class);

Review Comment:
   ```suggestion
   private static final Logger LOG = LoggerFactory.getLogger(SqlITCaseBase.class);
   ```



##
flink-end-to-end-tests/flink-end-to-end-tests-sql/src/test/java/org/apache/flink/table/sql/codegen/SqlUdfBaseITCase.java:
##
@@ -0,0 +1,173 @@
+package org.apache.flink.table.sql.codegen;
+
+import org.apache.flink.api.common.time.Deadline;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.tests.util.TestUtils;
+import org.apache.flink.tests.util.flink.ClusterController;
+import org.apache.flink.tests.util.flink.FlinkResource;
+import org.apache.flink.tests.util.flink.FlinkResourceSetup;
+import org.apache.flink.tests.util.flink.LocalStandaloneFlinkResourceFactory;
+import org.apache.flink.util.TestLogger;
+
+import org.junit.Before;
+import org.junit.Rule;
+import org.junit.Test;
+import org.junit.rules.TemporaryFolder;
+import org.junit.runner.RunWith;
+import org.junit.runners.Parameterized;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.File;
+import java.io.FileNotFoundException;
+import java.io.IOException;
+import java.net.URL;
+import java.nio.file.Files;
+import java.nio.file.Path;
+import java.time.Duration;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collection;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+import static org.hamcrest.Matchers.arrayContainingInAnyOrder;
+import static org.junit.Assert.assertThat;
+import static org.junit.Assert.assertTrue;
+
+/** Base class for sql udf. */
+@RunWith(Parameterized.class)
+public abstract class SqlUdfBaseITCase extends TestLogger {

Review Comment:
   This module is not only for testing SQL UDFs, so this class name is not accurate. `SqlITCaseBase` may be more suitable.



##
flink-end-to-end-tests/flink-end-to-end-tests-sql/src/test/java/org/apache/flink/table/sql/codegen/SqlUdfBaseITCase.java:
##
@@ -0,0 +1,173 @@
+package org.apache.flink.table.sql.codegen;
+
+import org.apache.flink.api.common.time.Deadline;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.tests.util.TestUtils;
+import org.apache.flink.tests.util.flink.ClusterController;
+import org.apache.flink.tests.util.flink.FlinkResource;
+import org.apache.flink.tests.util.flink.FlinkResourceSetup;
+import org.apache.flink.tests.util.flink.LocalStandaloneFlinkResourceFactory;
+import org.apache.flink.util.TestLogger;
+
+import org.junit.Before;
+import org.junit.Rule;
+import org.junit.Test;
+import org.junit.rules.TemporaryFolder;
+import org.junit.runner.RunWith;
+import org.junit.runners.Parameterized;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.File;
+import java.io.FileNotFoundException;
+import java.io.IOException;
+import java.net.URL;
+import java.nio.file.Files;
+import java.nio.file.Path;
+import java.time.Duration;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collection;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+import static org.hamcrest.Matchers.arrayContainingInAnyOrder;
+import static org.junit.Assert.assertThat;
+import static 

[GitHub] [flink] lsyldliu commented on a diff in pull request #20361: [FLINK-27790][Table SQL / API] Port ADD JAR /SHOW JARS syntax implementation from SqlClient to TableEnvironment side

2022-08-03 Thread GitBox


lsyldliu commented on code in PR #20361:
URL: https://github.com/apache/flink/pull/20361#discussion_r937269900


##
flink-end-to-end-tests/flink-end-to-end-tests-sql/src/test/java/org/apache/flink/table/sql/codegen/AddRemoteJarITCase.java:
##
@@ -0,0 +1,280 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.sql.codegen;
+
+import org.apache.flink.api.common.time.Deadline;
+import org.apache.flink.test.util.SQLJobSubmission;
+import org.apache.flink.tests.util.TestUtils;
+import org.apache.flink.tests.util.flink.ClusterController;
+import org.apache.flink.tests.util.flink.FlinkResource;
+import org.apache.flink.tests.util.flink.FlinkResourceSetup;
+import org.apache.flink.tests.util.flink.LocalStandaloneFlinkResourceFactory;
+import org.apache.flink.util.FileUtils;
+import org.apache.flink.util.OperatingSystem;
+import org.apache.flink.util.TestLogger;
+import org.apache.flink.util.UserClassLoaderJarTestUtils;
+
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.fs.FileUtil;
+import org.apache.hadoop.hdfs.MiniDFSCluster;
+import org.junit.After;
+import org.junit.Assert;
+import org.junit.Assume;
+import org.junit.Before;
+import org.junit.BeforeClass;
+import org.junit.ClassRule;
+import org.junit.Rule;
+import org.junit.Test;
+import org.junit.rules.TemporaryFolder;
+import org.junit.runner.RunWith;
+import org.junit.runners.Parameterized;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.File;
+import java.io.FileNotFoundException;
+import java.io.IOException;
+import java.net.URL;
+import java.nio.file.Files;
+import java.nio.file.Path;
+import java.time.Duration;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collection;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.UUID;
+
+import static org.assertj.core.api.Assertions.assertThat;
+import static org.junit.Assert.assertTrue;
+
+/** ITCase for adding remote jar. */
+@RunWith(Parameterized.class)
+public class AddRemoteJarITCase extends TestLogger {
+private static final Logger LOG = LoggerFactory.getLogger(AddRemoteJarITCase.class);
+
+private static final String ADD_REMOTE_JAR_E2E_SQL = "add_remote_jar_e2e.sql";
+
+private static final String EXECUTE_SQL_RESULT =
+"{\"before\":null,\"after\":{\"id\":1,\"content\":\"hello 
world\"},\"op\":\"c\"}";
+
+public static final String GENERATED_LOWER_UDF_CLASS = "LowerUDF";
+
+public static final String GENERATED_LOWER_UDF_CODE =
+"public class "
++ "%s"
++ " extends 
org.apache.flink.table.functions.ScalarFunction {\n"
++ "  public String eval(String str) {\n"
++ "return str.toLowerCase();\n"
++ "  }\n"
++ "}\n";
+
+@ClassRule public static TemporaryFolder tempFolder = new TemporaryFolder();
+
+private static final Path hadoopClasspath = TestUtils.getResource(".*hadoop.classpath");
+
+@Parameterized.Parameters(name = "executionMode")
+public static Collection data() {
+return Arrays.asList("streaming", "batch");
+}
+
+@Rule
+public final FlinkResource flink =
+new LocalStandaloneFlinkResourceFactory().create(FlinkResourceSetup.builder().build());
+
+private final String executionMode;
+private Path result;
+
+private MiniDFSCluster hdfsCluster;

Review Comment:
   OK, I think your concern makes sense to me.






[GitHub] [flink] lsyldliu commented on a diff in pull request #20361: [FLINK-27790][Table SQL / API] Port ADD JAR /SHOW JARS syntax implementation from SqlClient to TableEnvironment side

2022-08-01 Thread GitBox


lsyldliu commented on code in PR #20361:
URL: https://github.com/apache/flink/pull/20361#discussion_r934269500


##
flink-test-utils-parent/flink-test-utils/src/main/java/org/apache/flink/test/util/SQLJobSubmission.java:
##
@@ -29,10 +31,13 @@ public class SQLJobSubmission {
 
 private final List sqlLines;
 private final List jars;
+private final Consumer> envProcessor;

Review Comment:
   Revert these changes?



##
flink-end-to-end-tests/flink-end-to-end-tests-common/src/main/java/org/apache/flink/tests/util/flink/FlinkDistribution.java:
##
@@ -220,6 +220,7 @@ public void submitSQLJob(SQLJobSubmission job, Duration timeout) throws IOExcept
 AutoClosableProcess.create(commands.toArray(new String[0]))
 .setStdInputs(job.getSqlLines().toArray(new String[0]))
.setEnv(job.getEnvProcessor()) // logging the SQL statements and error message
+.setEnv(job.getEnvProcessor())

Review Comment:
   This change is not needed now?



##
flink-table/flink-table-planner/src/test/scala/org/apache/flink/table/api/TableEnvironmentTest.scala:
##
@@ -155,6 +164,23 @@ class TableEnvironmentTest {
 verifyTableEnvironmentExecutionExplain(tEnv)
   }
 
+  @Test
+  def testAddAndShowJar(): Unit = {
+val jarPath = UserClassLoaderJarTestUtils
+  .createJarFile(
+tempFolder.newFolder(String.format("test-jar-%s", UUID.randomUUID)),
+"test-classloader-udf.jar",
+GENERATED_LOWER_UDF_CLASS,
+String.format(GENERATED_LOWER_UDF_CODE, GENERATED_LOWER_UDF_CLASS)
+  )
+  .getPath
+
+tableEnv.executeSql("add JAR '" + jarPath + "'")

Review Comment:
   Please use `String.format` and keep the SQL keywords uniformly upper- or lower-case.
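   For illustration only, a minimal sketch of the requested pattern (shown in Java; the actual test is Scala, and `jarPath` comes from the surrounding test code):

   ```java
   // Hypothetical rewrite of the quoted statement: build it via String.format
   // and keep the SQL keywords uniformly upper-case.
   String addJarStatement = String.format("ADD JAR '%s'", jarPath);
   tableEnv.executeSql(addJarStatement);
   ```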



##
flink-table/flink-sql-client/src/main/java/org/apache/flink/table/client/gateway/context/SessionContext.java:
##
@@ -274,6 +272,7 @@ public void removeJar(String jarPath) {
 classLoader.removeURL(jarURL);
 }
 
+// TODO: Only related to removeJar in test. Remove it once it has no use.

Review Comment:
   Ditto



##
flink-end-to-end-tests/flink-end-to-end-tests-sql/src/test/java/org/apache/flink/table/sql/codegen/AddRemoteJarITCase.java:
##
@@ -0,0 +1,280 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.sql.codegen;
+
+import org.apache.flink.api.common.time.Deadline;
+import org.apache.flink.test.util.SQLJobSubmission;
+import org.apache.flink.tests.util.TestUtils;
+import org.apache.flink.tests.util.flink.ClusterController;
+import org.apache.flink.tests.util.flink.FlinkResource;
+import org.apache.flink.tests.util.flink.FlinkResourceSetup;
+import org.apache.flink.tests.util.flink.LocalStandaloneFlinkResourceFactory;
+import org.apache.flink.util.FileUtils;
+import org.apache.flink.util.OperatingSystem;
+import org.apache.flink.util.TestLogger;
+import org.apache.flink.util.UserClassLoaderJarTestUtils;
+
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.fs.FileUtil;
+import org.apache.hadoop.hdfs.MiniDFSCluster;
+import org.junit.After;
+import org.junit.Assert;
+import org.junit.Assume;
+import org.junit.Before;
+import org.junit.BeforeClass;
+import org.junit.ClassRule;
+import org.junit.Rule;
+import org.junit.Test;
+import org.junit.rules.TemporaryFolder;
+import org.junit.runner.RunWith;
+import org.junit.runners.Parameterized;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.File;
+import java.io.FileNotFoundException;
+import java.io.IOException;
+import java.net.URL;
+import java.nio.file.Files;
+import java.nio.file.Path;
+import java.time.Duration;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collection;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.UUID;
+
+import static org.assertj.core.api.Assertions.assertThat;
+import static org.junit.Assert.assertTrue;
+
+/** ITCase for adding remote jar. */
+@RunWith(Parameterized.class)
+public class AddRemoteJarITCase extends TestLogger {
+private static final Logger LOG = LoggerFactory.getLogger(AddRemoteJarITCase.class);
+
+

[GitHub] [flink] lsyldliu commented on a diff in pull request #20361: [FLINK-27790][Table SQL / API] Port ADD JAR /SHOW JARS syntax implementation from SqlClient to TableEnvironment side

2022-07-26 Thread GitBox


lsyldliu commented on code in PR #20361:
URL: https://github.com/apache/flink/pull/20361#discussion_r929988353


##
flink-table/flink-sql-client/src/test/java/org/apache/flink/table/client/gateway/context/AddRemoteJarITCase.java:
##
@@ -0,0 +1,110 @@
+package org.apache.flink.table.client.gateway.context;
+
+import org.apache.flink.client.cli.DefaultCLI;
+import org.apache.flink.util.OperatingSystem;
+import org.apache.flink.util.UserClassLoaderJarTestUtils;
+
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.fs.FileUtil;
+import org.apache.hadoop.hdfs.MiniDFSCluster;
+import org.junit.After;
+import org.junit.Assert;
+import org.junit.Assume;
+import org.junit.Before;
+import org.junit.BeforeClass;
+import org.junit.ClassRule;
+import org.junit.Test;
+import org.junit.rules.TemporaryFolder;
+
+import java.io.File;
+import java.io.IOException;
+import java.util.Collections;
+
+import static org.apache.flink.configuration.PipelineOptions.MAX_PARALLELISM;
+import static org.apache.flink.configuration.PipelineOptions.OBJECT_REUSE;
+import static org.apache.flink.table.utils.UserDefinedFunctions.GENERATED_LOWER_UDF_CLASS;
+import static org.apache.flink.table.utils.UserDefinedFunctions.GENERATED_LOWER_UDF_CODE;
+import static org.assertj.core.api.Assertions.assertThat;
+
+/** ITCase for adding remote jar. */
+public class AddRemoteJarITCase {
+@ClassRule public static TemporaryFolder tempFolder = new TemporaryFolder();
+

Review Comment:
   We should not introduce this test here; the local jar case is already covered by existing tests. As for the remote jar test, it should be introduced in the e2e module.






[GitHub] [flink] lsyldliu commented on a diff in pull request #20361: [FLINK-27790][Table SQL / API] Port ADD JAR /SHOW JARS syntax implementation from SqlClient to TableEnvironment side

2022-07-26 Thread GitBox


lsyldliu commented on code in PR #20361:
URL: https://github.com/apache/flink/pull/20361#discussion_r92650


##
flink-table/flink-table-api-java/src/main/java/org/apache/flink/table/api/internal/TableEnvironmentImpl.java:
##
@@ -461,6 +463,23 @@ public boolean dropTemporaryFunction(String path) {
 return 
functionCatalog.dropTemporaryCatalogFunction(unresolvedIdentifier, true);
 }
 
+@Override
+public void addJar(String jarPath) {

Review Comment:
   Please add unit tests for `ADD JAR` and `SHOW JARS` in `TableEnvironmentTest`, and an IT case for `ADD JAR` in `TableEnvironmentITCase`; you can refer to the related tests in `FunctionITCase`.
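   For orientation only, a rough sketch of the kind of unit test being asked for; the class name, jar path, and assertions are illustrative and not the PR's actual code:

   ```java
   import org.apache.flink.table.api.EnvironmentSettings;
   import org.apache.flink.table.api.TableEnvironment;

   import org.junit.Test;

   import java.util.ArrayList;
   import java.util.List;

   import static org.assertj.core.api.Assertions.assertThat;

   /** Illustrative sketch of an ADD JAR / SHOW JARS unit test. */
   public class AddJarSketchTest {

       @Test
       public void testAddAndShowJar() {
           TableEnvironment tEnv =
                   TableEnvironment.create(EnvironmentSettings.inStreamingMode());

           // A real test would generate this jar first, e.g. via
           // UserClassLoaderJarTestUtils as in the quoted diff above.
           String jarPath = "/tmp/test-classloader-udf.jar";

           tEnv.executeSql(String.format("ADD JAR '%s'", jarPath));

           // SHOW JARS should list the jar that was just registered.
           List<String> jars = new ArrayList<>();
           tEnv.executeSql("SHOW JARS")
                   .collect()
                   .forEachRemaining(row -> jars.add(String.valueOf(row.getField(0))));
           assertThat(jars).contains(jarPath);
       }
   }
   ```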






[GitHub] [flink] lsyldliu commented on a diff in pull request #20361: [FLINK-27790][Table SQL / API] Port ADD JAR /SHOW JARS syntax implementation from SqlClient to TableEnvironment side

2022-07-26 Thread GitBox


lsyldliu commented on code in PR #20361:
URL: https://github.com/apache/flink/pull/20361#discussion_r929988353


##
flink-table/flink-sql-client/src/test/java/org/apache/flink/table/client/gateway/context/AddRemoteJarITCase.java:
##
@@ -0,0 +1,110 @@
+package org.apache.flink.table.client.gateway.context;
+
+import org.apache.flink.client.cli.DefaultCLI;
+import org.apache.flink.util.OperatingSystem;
+import org.apache.flink.util.UserClassLoaderJarTestUtils;
+
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.fs.FileUtil;
+import org.apache.hadoop.hdfs.MiniDFSCluster;
+import org.junit.After;
+import org.junit.Assert;
+import org.junit.Assume;
+import org.junit.Before;
+import org.junit.BeforeClass;
+import org.junit.ClassRule;
+import org.junit.Test;
+import org.junit.rules.TemporaryFolder;
+
+import java.io.File;
+import java.io.IOException;
+import java.util.Collections;
+
+import static org.apache.flink.configuration.PipelineOptions.MAX_PARALLELISM;
+import static org.apache.flink.configuration.PipelineOptions.OBJECT_REUSE;
+import static org.apache.flink.table.utils.UserDefinedFunctions.GENERATED_LOWER_UDF_CLASS;
+import static org.apache.flink.table.utils.UserDefinedFunctions.GENERATED_LOWER_UDF_CODE;
+import static org.assertj.core.api.Assertions.assertThat;
+
+/** ITCase for adding remote jar. */
+public class AddRemoteJarITCase {
+@ClassRule public static TemporaryFolder tempFolder = new TemporaryFolder();
+

Review Comment:
   We should not introduce this test; the local jar case is already covered by existing tests. As for the remote jar test, it should be introduced in the e2e module.



##
flink-table/flink-sql-client/src/test/java/org/apache/flink/table/client/gateway/context/SessionContextTest.java:
##
@@ -156,13 +156,7 @@ public void testAddJarWithRelativePath() throws IOException {
 
 @Test
 public void testAddIllegalJar() {
-validateAddJarWithException("/path/to/illegal.jar", "JAR file does not exist");
-}
-
-@Test
-public void testAddRemoteJar() {
-validateAddJarWithException(

Review Comment:
   This test should not be removed.



##
flink-table/flink-sql-client/pom.xml:
##
@@ -515,6 +522,26 @@ under the License.
provided

 
+   
+   org.apache.hadoop
+   hadoop-hdfs
+   test
+   
+
+   
+   org.apache.hadoop
+   hadoop-hdfs

Review Comment:
   I think introducing an HDFS dependency here is a little heavy; we should test remote jars in the e2e module, so these related tests should be placed in the flink-end-to-end-tests-sql module.
   
   Regarding how to use an HDFS cluster in an e2e test, you can refer to the flink-end-to-end-tests-hbase module.
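   For reference, a rough sketch of the MiniDFSCluster lifecycle such an e2e test typically needs (assuming the hadoop-hdfs test dependency is on the module's test classpath; class and field names are illustrative):

   ```java
   import org.apache.hadoop.conf.Configuration;
   import org.apache.hadoop.hdfs.MiniDFSCluster;

   import org.junit.After;
   import org.junit.Before;
   import org.junit.ClassRule;
   import org.junit.rules.TemporaryFolder;

   /** Illustrative sketch of starting and stopping an HDFS mini cluster in a test. */
   public abstract class RemoteJarTestSketchBase {

       @ClassRule public static TemporaryFolder tempFolder = new TemporaryFolder();

       private MiniDFSCluster hdfsCluster;

       @Before
       public void startHdfs() throws Exception {
           Configuration conf = new Configuration();
           // Keep NameNode/DataNode data under the JUnit temporary folder.
           conf.set(MiniDFSCluster.HDFS_MINIDFS_BASEDIR, tempFolder.newFolder().getAbsolutePath());
           hdfsCluster = new MiniDFSCluster.Builder(conf).build();
           hdfsCluster.waitClusterUp();
       }

       @After
       public void stopHdfs() {
           if (hdfsCluster != null) {
               hdfsCluster.shutdown();
           }
       }
   }
   ```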



##
flink-table/flink-table-api-java/src/main/java/org/apache/flink/table/api/internal/TableEnvironmentImpl.java:
##
@@ -461,6 +463,23 @@ public boolean dropTemporaryFunction(String path) {
 return 
functionCatalog.dropTemporaryCatalogFunction(unresolvedIdentifier, true);
 }
 
+@Override
+public void addJar(String jarPath) {

Review Comment:
   Please add unit tests for `ADD JAR` and `SHOW JARS` in `TableEnvironmentITCase`, and an IT case for `ADD JAR` in `TableEnvironmentITCase`; you can refer to the related tests in `FunctionITCase`.



##
flink-table/flink-table-api-java/src/main/java/org/apache/flink/table/api/TableEnvironment.java:
##
@@ -575,6 +576,20 @@ void createFunction(
  */
 boolean dropTemporaryFunction(String path);
 
+/**
+ * Add jar to the classLoader for use.
+ *
+ * @param jarPath The jar path to be added.
+ */
+void addJar(String jarPath);

Review Comment:
   We should not introduce these two methods; they are public API, and we cannot introduce public API arbitrarily. That needs to be discussed in the community first. We should reuse the existing `AddJarOperation` and `ShowJarsOperation`.
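   To illustrate the point: once the syntax is ported to `TableEnvironment` (the goal of this PR), both statements stay reachable through `executeSql` and are translated to these operations internally, so no new public methods are needed. A minimal usage sketch (the jar path is illustrative):

   ```java
   import org.apache.flink.table.api.EnvironmentSettings;
   import org.apache.flink.table.api.TableEnvironment;

   /** Illustrative sketch: ADD JAR / SHOW JARS via executeSql, no new public API. */
   public class AddJarUsageSketch {
       public static void main(String[] args) {
           TableEnvironment tEnv =
                   TableEnvironment.create(EnvironmentSettings.inStreamingMode());

           // Parsed to AddJarOperation and ShowJarsOperation respectively.
           tEnv.executeSql("ADD JAR '/path/to/my-udf.jar'");
           tEnv.executeSql("SHOW JARS").print();
       }
   }
   ```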


