Re: Weird Flink SQL error

2022-11-23 Thread Shuiqiang Chen
Hi Dan,

Which Flink version are you using? I wrote a test case based on the code
snippet you provided, and it works normally in Flink 1.17-SNAPSHOT.

Best,
Shuiqiang

Dan Hill wrote on Wed, Nov 23, 2022 at 13:55:

> Hi.  I'm hitting an obfuscated Flink SQL parser error.  Is there a way to
> get better errors for Flink SQL?  I'm hitting it when I wrap some of the
> fields on an inner Row.
>
>
> *Works*
>
> CREATE TEMPORARY VIEW `test_content_metrics_view` AS
> SELECT
> DATE_FORMAT(TUMBLE_ROWTIME(rowtime, INTERVAL '1' DAY), '-MM-dd'),
> platform_id,
> content_id
> FROM content_event
> GROUP BY
> platform_id,
> content_id,
> TUMBLE(rowtime, INTERVAL '1' DAY)
>
> CREATE TABLE test_content_metrics (
>dt STRING NOT NULL,
>`platform_id` BIGINT,
>`content_id` STRING
> ) PARTITIONED BY (dt) WITH (
>'connector' = 'filesystem',
>'path' = 'etl/test_content_metrics',
>'format' = 'json',
> )
>
> INSERT INTO `test_content_metrics`
> SELECT * FROM `test_content_metrics_view`
>
>
> *Fails*
>
> Wrapping a couple parameters in a Row causes the following exception.
>
>  Caused by: org.apache.flink.sql.parser.impl.ParseException: Encountered 
> "." at line 1, column 119.
> Was expecting one of:
> ")" ...
> "," ...
>
>
> org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.generateParseException(FlinkSqlParserImpl.java:40981)
>
> org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.jj_consume_token(FlinkSqlParserImpl.java:40792)
>
> org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.ParenthesizedSimpleIdentifierList(FlinkSqlParserImpl.java:25220)
>
> org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.Expression3(FlinkSqlParserImpl.java:19925)
>
> org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.Expression2b(FlinkSqlParserImpl.java:19581)
>[...]
>
>
> CREATE TEMPORARY VIEW `test_content_metrics_view` AS
> SELECT
> DATE_FORMAT(TUMBLE_ROWTIME(rowtime, INTERVAL '1' DAY), '-MM-dd'),
> ROW(
> platform_id,
> content_id
> )
> FROM content_event
> GROUP BY
> platform_id,
> content_id,
> TUMBLE(rowtime, INTERVAL '1' DAY)
>
> CREATE TABLE test_content_metrics (
>dt STRING NOT NULL,
>`body` ROW(
>`platform_id` BIGINT,
>`content_id` STRING
>)
> ) PARTITIONED BY (dt) WITH (
>'connector' = 'filesystem',
>'path' = 'etl/test_content_metrics',
>'format' = 'json',
> )
>
> INSERT INTO `test_content_metrics`
> SELECT * FROM `test_content_metrics_view`
>
>


Re: Weird Flink SQL error

2022-11-23 Thread yuxia
Hi, Dan. 
I'm wondering what kind of error you expect. IMO, most engines report parse 
errors this way, telling you that an unexpected token was encountered. 

Best regards, 
Yuxia 


From: "Dan Hill" 
To: "User" 
Sent: Wednesday, November 23, 2022, 1:55:20 PM 
Subject: Weird Flink SQL error 

Hi. I'm hitting an obfuscated Flink SQL parser error. Is there a way to get 
better errors for Flink SQL? I'm hitting it when I wrap some of the fields on 
an inner Row. 


Works

CREATE TEMPORARY VIEW `test_content_metrics_view` AS
SELECT
DATE_FORMAT(TUMBLE_ROWTIME(rowtime, INTERVAL '1' DAY), '-MM-dd'),
platform_id,
content_id
FROM content_event
GROUP BY
platform_id,
content_id,
TUMBLE(rowtime, INTERVAL '1' DAY)

CREATE TABLE test_content_metrics (
   dt STRING NOT NULL,
   `platform_id` BIGINT,
   `content_id` STRING
) PARTITIONED BY (dt) WITH (
   'connector' = 'filesystem',
   'path' = 'etl/test_content_metrics',
   'format' = 'json',
)

INSERT INTO `test_content_metrics`
SELECT * FROM `test_content_metrics_view`

Fails

Wrapping a couple parameters in a Row causes the following exception.

Caused by: org.apache.flink.sql.parser.impl.ParseException: Encountered "." at line 1, column 119.
Was expecting one of:
    ")" ...
    "," ...

org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.generateParseException(FlinkSqlParserImpl.java:40981)
org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.jj_consume_token(FlinkSqlParserImpl.java:40792)
org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.ParenthesizedSimpleIdentifierList(FlinkSqlParserImpl.java:25220)
org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.Expression3(FlinkSqlParserImpl.java:19925)
org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.Expression2b(FlinkSqlParserImpl.java:19581)
[...]

CREATE TEMPORARY VIEW `test_content_metrics_view` AS
SELECT
DATE_FORMAT(TUMBLE_ROWTIME(rowtime, INTERVAL '1' DAY), '-MM-dd'),
ROW(
platform_id,
content_id
)
FROM content_event
GROUP BY
platform_id,
content_id,
TUMBLE(rowtime, INTERVAL '1' DAY)

CREATE TABLE test_content_metrics (
   dt STRING NOT NULL,
   `body` ROW(
   `platform_id` BIGINT,
   `content_id` STRING
   )
) PARTITIONED BY (dt) WITH (
   'connector' = 'filesystem',
   'path' = 'etl/test_content_metrics',
   'format' = 'json',
)

INSERT INTO `test_content_metrics`
SELECT * FROM `test_content_metrics_view`



Unable to write checkpoint metadata to s3 using pyflink (1.16.0)

2022-11-23 Thread Mujahid Niaz
Hi team,
Hope Everyone is doing good,

We have an issue writing checkpoint metadata to S3 using the PyFlink
DataStream API. We are using apache-flink==1.16.0. We are able to sink our
stream to S3, but when it comes to writing checkpoint data, we get the
following error.
We tried paths with *s3://, s3a://, and s3p:// prefixes*, but it failed with
all of them.

*(Code is attached after the error message)*

Looking forward to hearing from you. Thank you

Traceback (most recent call last):
  File "app.py", line 103, in <module>
real_time_data_analytics()
  File "app.py", line 98, in real_time_data_analytics
env.execute('bot_detection_app_local2')
  File 
"/test/lib/python3.8/site-packages/pyflink/datastream/stream_execution_environment.py",
line 764, in execute
return 
JobExecutionResult(self._j_stream_execution_environment.execute(j_stream_graph))
  File "/test/lib/python3.8/site-packages/py4j/java_gateway.py", line
1321, in __call__
return_value = get_return_value(
  File "/test/lib/python3.8/site-packages/pyflink/util/exceptions.py",
line 146, in deco
return f(*a, **kw)
  File "/test/lib/python3.8/site-packages/py4j/protocol.py", line 326,
in get_return_value
raise Py4JJavaError(
py4j.protocol.Py4JJavaError: An error occurred while calling o49.execute.
: org.apache.flink.util.FlinkException: Failed to execute job
'bot_detection_app_local2'.
at 
org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.executeAsync(StreamExecutionEnvironment.java:2203)
at 
org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:2075)
at 
org.apache.flink.streaming.api.environment.LocalStreamEnvironment.execute(LocalStreamEnvironment.java:68)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native
Method)
at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at 
org.apache.flink.api.python.shaded.py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at 
org.apache.flink.api.python.shaded.py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at 
org.apache.flink.api.python.shaded.py4j.Gateway.invoke(Gateway.java:282)
at 
org.apache.flink.api.python.shaded.py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at 
org.apache.flink.api.python.shaded.py4j.commands.CallCommand.execute(CallCommand.java:79)
at 
org.apache.flink.api.python.shaded.py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: java.lang.RuntimeException:
org.apache.flink.runtime.client.JobInitializationException: Could not
start the JobMaster.
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:321)
at 
org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:75)
at 
java.base/java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:642)
at 
java.base/java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:479)
at 
java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at 
java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at 
java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at 
java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at 
java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: org.apache.flink.runtime.client.JobInitializationException:
Could not start the JobMaster.
at 
org.apache.flink.runtime.jobmaster.DefaultJobMasterServiceProcess.lambda$new$0(DefaultJobMasterServiceProcess.java:97)
at 
java.base/java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:859)
at 
java.base/java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:837)
at 
java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)
at 
java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1705)
at 
java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at 
java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: java.util.concurrent.CompletionException:
org.apache.flink.util.FlinkRuntimeException: Failed to create
checkpoint storage at checkpoint coordinator side.
at 
java.base/java.util.concurrent.CompletableFuture.encodeThrowable(C

Re: Unable to write checkpoint metadata to s3 using pyflink (1.16.0)

2022-11-23 Thread Martijn Visser
Hi,

Like the error mentions "The scheme is directly supported by Flink through
the following plugin(s): flink-s3-fs-presto. Please ensure that each plugin
resides within its own subfolder within the plugins directory."

You can't add the plugins via code; you need to follow the instructions in
the documentation.
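For illustration only (not from the thread; the jar version and FLINK_HOME
layout below are assumptions), the layout the error message asks for looks
roughly like this:

```shell
# Sketch: Flink discovers filesystem plugins from $FLINK_HOME/plugins/<name>/,
# one subfolder per plugin. In a real distribution the presto S3 jar ships
# in $FLINK_HOME/opt/; the version in the jar name is an assumption.
FLINK_HOME="${FLINK_HOME:-/tmp/flink-home-demo}"
mkdir -p "$FLINK_HOME/opt"
# Stand-in for the jar that ships with the distribution:
touch "$FLINK_HOME/opt/flink-s3-fs-presto-1.16.0.jar"
# Each plugin must sit in its own subfolder under plugins/:
mkdir -p "$FLINK_HOME/plugins/s3-fs-presto"
cp "$FLINK_HOME/opt/flink-s3-fs-presto-1.16.0.jar" \
   "$FLINK_HOME/plugins/s3-fs-presto/"
ls "$FLINK_HOME/plugins/s3-fs-presto"
```

After restarting the cluster, the s3:// (or s3p://) scheme should be picked up
from the plugin classloader rather than from user code.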

Best regards,

Martijn

On Wed, Nov 23, 2022 at 1:53 PM Mujahid Niaz 
wrote:

> Hi team,
> Hope Everyone is doing good,
>
> We have an issue writing checkpoint metadata to S3 using the PyFlink
> DataStream API. We are using apache-flink==1.16.0. We are able to
> sink our stream to S3, but when it comes to writing checkpoint data, we
> get the following error.
> We tried paths with *s3://, s3a://, and s3p:// prefixes*, but it failed
> with all of them.
>
> *(Code is attached after the error message)*
>
> Looking forward to hearing from you. Thank you
>
> Traceback (most recent call last):
>   File "app.py", line 103, in <module>
> real_time_data_analytics()
>   File "app.py", line 98, in real_time_data_analytics
> env.execute('bot_detection_app_local2')
>   File 
> "/test/lib/python3.8/site-packages/pyflink/datastream/stream_execution_environment.py",
>  line 764, in execute
> return 
> JobExecutionResult(self._j_stream_execution_environment.execute(j_stream_graph))
>   File "/test/lib/python3.8/site-packages/py4j/java_gateway.py", line 1321, 
> in __call__
> return_value = get_return_value(
>   File "/test/lib/python3.8/site-packages/pyflink/util/exceptions.py", line 
> 146, in deco
> return f(*a, **kw)
>   File "/test/lib/python3.8/site-packages/py4j/protocol.py", line 326, in 
> get_return_value
> raise Py4JJavaError(
> py4j.protocol.Py4JJavaError: An error occurred while calling o49.execute.
> : org.apache.flink.util.FlinkException: Failed to execute job 
> 'bot_detection_app_local2'.
> at 
> org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.executeAsync(StreamExecutionEnvironment.java:2203)
> at 
> org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:2075)
> at 
> org.apache.flink.streaming.api.environment.LocalStreamEnvironment.execute(LocalStreamEnvironment.java:68)
> at 
> java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at 
> java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at 
> java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.base/java.lang.reflect.Method.invoke(Method.java:566)
> at 
> org.apache.flink.api.python.shaded.py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
> at 
> org.apache.flink.api.python.shaded.py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
> at 
> org.apache.flink.api.python.shaded.py4j.Gateway.invoke(Gateway.java:282)
> at 
> org.apache.flink.api.python.shaded.py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
> at 
> org.apache.flink.api.python.shaded.py4j.commands.CallCommand.execute(CallCommand.java:79)
> at 
> org.apache.flink.api.python.shaded.py4j.GatewayConnection.run(GatewayConnection.java:238)
> at java.base/java.lang.Thread.run(Thread.java:829)
> Caused by: java.lang.RuntimeException: 
> org.apache.flink.runtime.client.JobInitializationException: Could not start 
> the JobMaster.
> at 
> org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:321)
> at 
> org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:75)
> at 
> java.base/java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:642)
> at 
> java.base/java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:479)
> at 
> java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
> at 
> java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
> at 
> java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
> at 
> java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
> at 
> java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
> Caused by: org.apache.flink.runtime.client.JobInitializationException: Could 
> not start the JobMaster.
> at 
> org.apache.flink.runtime.jobmaster.DefaultJobMasterServiceProcess.lambda$new$0(DefaultJobMasterServiceProcess.java:97)
> at 
> java.base/java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:859)
> at 
> java.base/java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:837)
> at 
> java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)
> at 
> java.base/ja

Re: Weird Flink SQL error

2022-11-23 Thread Dan Hill
I'm using Flink 1.14.4

On Wed, Nov 23, 2022, 02:28 yuxia  wrote:

> Hi, Dan.
> I'm wondering what type of error you expect. IMO, I think most engines
> throw parse error in such way which tell you encounter an unexpected token.
>
> Best regards,
> Yuxia
>
> --
> *From: *"Dan Hill" 
> *To: *"User" 
> *Sent: *Wednesday, November 23, 2022, 1:55:20 PM
> *Subject: *Weird Flink SQL error
>
> Hi.  I'm hitting an obfuscated Flink SQL parser error.  Is there a way to
> get better errors for Flink SQL?  I'm hitting it when I wrap some of the
> fields on an inner Row.
>
>
> *Works*
>
> CREATE TEMPORARY VIEW `test_content_metrics_view` AS
> SELECT
> DATE_FORMAT(TUMBLE_ROWTIME(rowtime, INTERVAL '1' DAY), '-MM-dd'),
> platform_id,
> content_id
> FROM content_event
> GROUP BY
> platform_id,
> content_id,
> TUMBLE(rowtime, INTERVAL '1' DAY)
>
> CREATE TABLE test_content_metrics (
>dt STRING NOT NULL,
>`platform_id` BIGINT,
>`content_id` STRING
> ) PARTITIONED BY (dt) WITH (
>'connector' = 'filesystem',
>'path' = 'etl/test_content_metrics',
>'format' = 'json',
> )
>
> INSERT INTO `test_content_metrics`
> SELECT * FROM `test_content_metrics_view`
>
>
> *Fails*
>
> Wrapping a couple parameters in a Row causes the following exception.
>
>  Caused by: org.apache.flink.sql.parser.impl.ParseException: Encountered 
> "." at line 1, column 119.
> Was expecting one of:
> ")" ...
> "," ...
>
>
> org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.generateParseException(FlinkSqlParserImpl.java:40981)
>
> org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.jj_consume_token(FlinkSqlParserImpl.java:40792)
>
> org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.ParenthesizedSimpleIdentifierList(FlinkSqlParserImpl.java:25220)
>
> org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.Expression3(FlinkSqlParserImpl.java:19925)
>
> org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.Expression2b(FlinkSqlParserImpl.java:19581)
>[...]
>
>
> CREATE TEMPORARY VIEW `test_content_metrics_view` AS
> SELECT
> DATE_FORMAT(TUMBLE_ROWTIME(rowtime, INTERVAL '1' DAY), '-MM-dd'),
> ROW(
> platform_id,
> content_id
> )
> FROM content_event
> GROUP BY
> platform_id,
> content_id,
> TUMBLE(rowtime, INTERVAL '1' DAY)
>
> CREATE TABLE test_content_metrics (
>dt STRING NOT NULL,
>`body` ROW(
>`platform_id` BIGINT,
>`content_id` STRING
>)
> ) PARTITIONED BY (dt) WITH (
>'connector' = 'filesystem',
>'path' = 'etl/test_content_metrics',
>'format' = 'json',
> )
>
> INSERT INTO `test_content_metrics`
> SELECT * FROM `test_content_metrics_view`
>
>
>


Query about flink job manager dashboard

2022-11-23 Thread naga sudhakar
Hi Team,
Greetings!!!
I am a software developer using Apache Flink and deploying Flink jobs
with it. I have two queries about the Flink JobManager dashboard. Can you
please help with the below?

1) Is it possible to add a login mechanism for the Flink JobManager
dashboard, with role-based access for viewing running jobs, cancelling
jobs, and adding jobs?
2) Is it possible to disable the dashboard display but still perform the
same operations through the API?


Thanks,
Nagasudhakar.


Re: Weird Flink SQL error

2022-11-23 Thread Dan Hill
For the error `Encountered "." at line 1, column 119.`, here are the
confusing parts:

1. The error happens when I execute the last part of the SQL query:

INSERT INTO `test_content_metrics`
SELECT * FROM `test_content_metrics_view`

2. Line 1 column 119 doesn't exist in that SQL statement.
3. None of the SQL that I've written has a period "." in it.



On Wed, Nov 23, 2022 at 8:32 AM Dan Hill  wrote:

> I'm using Flink 1.14.4
>
> On Wed, Nov 23, 2022, 02:28 yuxia  wrote:
>
>> Hi, Dan.
>> I'm wondering what type of error you expect. IMO, I think most engines
>> throw parse error in such way which tell you encounter an unexpected token.
>>
>> Best regards,
>> Yuxia
>>
>> --
>> *From: *"Dan Hill" 
>> *To: *"User" 
>> *Sent: *Wednesday, November 23, 2022, 1:55:20 PM
>> *Subject: *Weird Flink SQL error
>>
>> Hi.  I'm hitting an obfuscated Flink SQL parser error.  Is there a way to
>> get better errors for Flink SQL?  I'm hitting it when I wrap some of the
>> fields on an inner Row.
>>
>>
>> *Works*
>>
>> CREATE TEMPORARY VIEW `test_content_metrics_view` AS
>> SELECT
>> DATE_FORMAT(TUMBLE_ROWTIME(rowtime, INTERVAL '1' DAY), '-MM-dd'),
>> platform_id,
>> content_id
>> FROM content_event
>> GROUP BY
>> platform_id,
>> content_id,
>> TUMBLE(rowtime, INTERVAL '1' DAY)
>>
>> CREATE TABLE test_content_metrics (
>>dt STRING NOT NULL,
>>`platform_id` BIGINT,
>>`content_id` STRING
>> ) PARTITIONED BY (dt) WITH (
>>'connector' = 'filesystem',
>>'path' = 'etl/test_content_metrics',
>>'format' = 'json',
>> )
>>
>> INSERT INTO `test_content_metrics`
>> SELECT * FROM `test_content_metrics_view`
>>
>>
>> *Fails*
>>
>> Wrapping a couple parameters in a Row causes the following exception.
>>
>>  Caused by: org.apache.flink.sql.parser.impl.ParseException: Encountered 
>> "." at line 1, column 119.
>> Was expecting one of:
>> ")" ...
>> "," ...
>>
>>
>> org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.generateParseException(FlinkSqlParserImpl.java:40981)
>>
>> org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.jj_consume_token(FlinkSqlParserImpl.java:40792)
>>
>> org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.ParenthesizedSimpleIdentifierList(FlinkSqlParserImpl.java:25220)
>>
>> org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.Expression3(FlinkSqlParserImpl.java:19925)
>>
>> org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.Expression2b(FlinkSqlParserImpl.java:19581)
>>[...]
>>
>>
>> CREATE TEMPORARY VIEW `test_content_metrics_view` AS
>> SELECT
>> DATE_FORMAT(TUMBLE_ROWTIME(rowtime, INTERVAL '1' DAY), '-MM-dd'),
>> ROW(
>> platform_id,
>> content_id
>> )
>> FROM content_event
>> GROUP BY
>> platform_id,
>> content_id,
>> TUMBLE(rowtime, INTERVAL '1' DAY)
>>
>> CREATE TABLE test_content_metrics (
>>dt STRING NOT NULL,
>>`body` ROW(
>>`platform_id` BIGINT,
>>`content_id` STRING
>>)
>> ) PARTITIONED BY (dt) WITH (
>>'connector' = 'filesystem',
>>'path' = 'etl/test_content_metrics',
>>'format' = 'json',
>> )
>>
>> INSERT INTO `test_content_metrics`
>> SELECT * FROM `test_content_metrics_view`
>>
>>
>>


Physical memory and Auto Memory settings in TM

2022-11-23 Thread ramkrishna vasudevan
Hi All,

In Flink cluster installations where we govern the memory configs by
specifying 'taskmanager.memory.process.size', the TM memory is
auto-calculated.

But this calculation generally doesn't check the actual physical memory
available. So in the case of a wrong configuration, where the calculated
JVM heap size fits within physical memory but the direct memory goes
beyond it, we don't fail process creation; we allow the process to start.

This is a naive question, but I just wanted to know whether this is by
design. The reason I ask is that it misleads the user: the UI shows the
calculated value, but in reality the memory crunch will lead to an OOME at
some point. Should we fail process startup in such cases?

Regards
Ram


Re: Weird Flink SQL error

2022-11-23 Thread Dan Hill
I upgraded to Flink v1.16.0 and I get the same error.

On Wed, Nov 23, 2022 at 9:47 AM Dan Hill  wrote:

> For the error `Encountered "." at line 1, column 119.`, here are the
> confusing parts:
>
> 1. The error happens when I executed the last part of the sql query:
>
> INSERT INTO `test_content_metrics`
> SELECT * FROM `test_content_metrics_view`
>
> 2. Line 1 column 119 doesn't exist in that SQL statement.
> 3. None of the SQL that I've written has a period "." in it.
>
>
>
> On Wed, Nov 23, 2022 at 8:32 AM Dan Hill  wrote:
>
>> I'm using Flink 1.14.4
>>
>> On Wed, Nov 23, 2022, 02:28 yuxia  wrote:
>>
>>> Hi, Dan.
>>> I'm wondering what type of error you expect. IMO, I think most engines
>>> throw parse error in such way which tell you encounter an unexpected token.
>>>
>>> Best regards,
>>> Yuxia
>>>
>>> --
>>> *From: *"Dan Hill" 
>>> *To: *"User" 
>>> *Sent: *Wednesday, November 23, 2022, 1:55:20 PM
>>> *Subject: *Weird Flink SQL error
>>>
>>> Hi.  I'm hitting an obfuscated Flink SQL parser error.  Is there a way
>>> to get better errors for Flink SQL?  I'm hitting it when I wrap some of the
>>> fields on an inner Row.
>>>
>>>
>>> *Works*
>>>
>>> CREATE TEMPORARY VIEW `test_content_metrics_view` AS
>>> SELECT
>>> DATE_FORMAT(TUMBLE_ROWTIME(rowtime, INTERVAL '1' DAY), '-MM-dd'),
>>> platform_id,
>>> content_id
>>> FROM content_event
>>> GROUP BY
>>> platform_id,
>>> content_id,
>>> TUMBLE(rowtime, INTERVAL '1' DAY)
>>>
>>> CREATE TABLE test_content_metrics (
>>>dt STRING NOT NULL,
>>>`platform_id` BIGINT,
>>>`content_id` STRING
>>> ) PARTITIONED BY (dt) WITH (
>>>'connector' = 'filesystem',
>>>'path' = 'etl/test_content_metrics',
>>>'format' = 'json',
>>> )
>>>
>>> INSERT INTO `test_content_metrics`
>>> SELECT * FROM `test_content_metrics_view`
>>>
>>>
>>> *Fails*
>>>
>>> Wrapping a couple parameters in a Row causes the following exception.
>>>
>>>  Caused by: org.apache.flink.sql.parser.impl.ParseException: 
>>> Encountered "." at line 1, column 119.
>>> Was expecting one of:
>>> ")" ...
>>> "," ...
>>>
>>>
>>> org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.generateParseException(FlinkSqlParserImpl.java:40981)
>>>
>>> org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.jj_consume_token(FlinkSqlParserImpl.java:40792)
>>>
>>> org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.ParenthesizedSimpleIdentifierList(FlinkSqlParserImpl.java:25220)
>>>
>>> org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.Expression3(FlinkSqlParserImpl.java:19925)
>>>
>>> org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.Expression2b(FlinkSqlParserImpl.java:19581)
>>>[...]
>>>
>>>
>>> CREATE TEMPORARY VIEW `test_content_metrics_view` AS
>>> SELECT
>>> DATE_FORMAT(TUMBLE_ROWTIME(rowtime, INTERVAL '1' DAY), '-MM-dd'),
>>> ROW(
>>> platform_id,
>>> content_id
>>> )
>>> FROM content_event
>>> GROUP BY
>>> platform_id,
>>> content_id,
>>> TUMBLE(rowtime, INTERVAL '1' DAY)
>>>
>>> CREATE TABLE test_content_metrics (
>>>dt STRING NOT NULL,
>>>`body` ROW(
>>>`platform_id` BIGINT,
>>>`content_id` STRING
>>>)
>>> ) PARTITIONED BY (dt) WITH (
>>>'connector' = 'filesystem',
>>>'path' = 'etl/test_content_metrics',
>>>'format' = 'json',
>>> )
>>>
>>> INSERT INTO `test_content_metrics`
>>> SELECT * FROM `test_content_metrics_view`
>>>
>>>
>>>


Re: Weird Flink SQL error

2022-11-23 Thread Dan Hill
Looks related to this issue.
https://lists.apache.org/thread/1sb5bos6tjv39fh0wjkvmvht0824r4my

In my case, it doesn't seem like a sink issue.  Even if I change my
minicluster test to a plain SELECT * on the view, it fails the same way.

CREATE TEMPORARY VIEW `test_content_metrics_view` AS
SELECT
DATE_FORMAT(TUMBLE_ROWTIME(rowtime, INTERVAL '1' DAY), '-MM-dd'),
ROW(
platform_id,
content_id
)
FROM content_event
GROUP BY
platform_id,
content_id,
TUMBLE(rowtime, INTERVAL '1' DAY)

SELECT * FROM test_content_metrics_view



On Wed, Nov 23, 2022 at 1:19 PM Dan Hill  wrote:

> I upgraded to Flink v1.16.0 and I get the same error.
>
> On Wed, Nov 23, 2022 at 9:47 AM Dan Hill  wrote:
>
>> For the error `Encountered "." at line 1, column 119.`, here are the
>> confusing parts:
>>
>> 1. The error happens when I executed the last part of the sql query:
>>
>> INSERT INTO `test_content_metrics`
>> SELECT * FROM `test_content_metrics_view`
>>
>> 2. Line 1 column 119 doesn't exist in that SQL statement.
>> 3. None of the SQL that I've written has a period "." in it.
>>
>>
>>
>> On Wed, Nov 23, 2022 at 8:32 AM Dan Hill  wrote:
>>
>>> I'm using Flink 1.14.4
>>>
>>> On Wed, Nov 23, 2022, 02:28 yuxia  wrote:
>>>
 Hi, Dan.
 I'm wondering what type of error you expect. IMO, I think most engines
 throw parse error in such way which tell you encounter an unexpected token.

 Best regards,
 Yuxia

 --
 *From: *"Dan Hill" 
 *To: *"User" 
 *Sent: *Wednesday, November 23, 2022, 1:55:20 PM
 *Subject: *Weird Flink SQL error

 Hi.  I'm hitting an obfuscated Flink SQL parser error.  Is there a way
 to get better errors for Flink SQL?  I'm hitting it when I wrap some of the
 fields on an inner Row.


 *Works*

 CREATE TEMPORARY VIEW `test_content_metrics_view` AS
 SELECT
 DATE_FORMAT(TUMBLE_ROWTIME(rowtime, INTERVAL '1' DAY), '-MM-dd'),
 platform_id,
 content_id
 FROM content_event
 GROUP BY
 platform_id,
 content_id,
 TUMBLE(rowtime, INTERVAL '1' DAY)

 CREATE TABLE test_content_metrics (
dt STRING NOT NULL,
`platform_id` BIGINT,
`content_id` STRING
 ) PARTITIONED BY (dt) WITH (
'connector' = 'filesystem',
'path' = 'etl/test_content_metrics',
'format' = 'json',
 )

 INSERT INTO `test_content_metrics`
 SELECT * FROM `test_content_metrics_view`


 *Fails*

 Wrapping a couple parameters in a Row causes the following exception.

  Caused by: org.apache.flink.sql.parser.impl.ParseException: 
 Encountered "." at line 1, column 119.
 Was expecting one of:
 ")" ...
 "," ...


 org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.generateParseException(FlinkSqlParserImpl.java:40981)

 org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.jj_consume_token(FlinkSqlParserImpl.java:40792)

 org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.ParenthesizedSimpleIdentifierList(FlinkSqlParserImpl.java:25220)

 org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.Expression3(FlinkSqlParserImpl.java:19925)

 org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.Expression2b(FlinkSqlParserImpl.java:19581)
[...]


 CREATE TEMPORARY VIEW `test_content_metrics_view` AS
 SELECT
 DATE_FORMAT(TUMBLE_ROWTIME(rowtime, INTERVAL '1' DAY), '-MM-dd'),
 ROW(
 platform_id,
 content_id
 )
 FROM content_event
 GROUP BY
 platform_id,
 content_id,
 TUMBLE(rowtime, INTERVAL '1' DAY)

 CREATE TABLE test_content_metrics (
dt STRING NOT NULL,
`body` ROW(
`platform_id` BIGINT,
`content_id` STRING
)
 ) PARTITIONED BY (dt) WITH (
'connector' = 'filesystem',
'path' = 'etl/test_content_metrics',
'format' = 'json',
 )

 INSERT INTO `test_content_metrics`
 SELECT * FROM `test_content_metrics_view`





Re: Weird Flink SQL error

2022-11-23 Thread Dan Hill
If I remove the "TEMPORARY VIEW" and just inline the SQL, this works fine.
This seems like a bug with temporary views.
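Not from the thread itself, but one workaround sketch that keeps the
temporary view is to alias every column explicitly and CAST the row
constructor to a named ROW type, so that the `SELECT *` expansion sees plain
identifiers. The `yyyy-MM-dd` format string and the `AS dt` / `AS body`
aliases below are assumptions, not verified against a specific Flink version:

```sql
-- Hedged workaround sketch: name the columns so star expansion has
-- simple identifiers to work with.
CREATE TEMPORARY VIEW `test_content_metrics_view` AS
SELECT
  DATE_FORMAT(TUMBLE_ROWTIME(rowtime, INTERVAL '1' DAY), 'yyyy-MM-dd') AS dt,
  CAST(ROW(platform_id, content_id)
       AS ROW<platform_id BIGINT, content_id STRING>) AS body
FROM content_event
GROUP BY
  platform_id,
  content_id,
  TUMBLE(rowtime, INTERVAL '1' DAY);

INSERT INTO `test_content_metrics`
SELECT * FROM `test_content_metrics_view`;
```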

On Wed, Nov 23, 2022 at 1:38 PM Dan Hill  wrote:

> Looks related to this issue.
> https://lists.apache.org/thread/1sb5bos6tjv39fh0wjkvmvht0824r4my
>
> In my case, it doesn't seem like it's a sink issue.  Even if I change my
> minicluster test to SELECT * it, it fails the same way.
>
> CREATE TEMPORARY VIEW `test_content_metrics_view` AS
> SELECT
> DATE_FORMAT(TUMBLE_ROWTIME(rowtime, INTERVAL '1' DAY), '-MM-dd'),
> ROW(
> platform_id,
> content_id
> )
> FROM content_event
> GROUP BY
> platform_id,
> content_id,
> TUMBLE(rowtime, INTERVAL '1' DAY)
>
> SELECT * FROM test_content_metrics_view
>
>
>
> On Wed, Nov 23, 2022 at 1:19 PM Dan Hill  wrote:
>
>> I upgraded to Flink v1.16.0 and I get the same error.
>>
>> On Wed, Nov 23, 2022 at 9:47 AM Dan Hill  wrote:
>>
>>> For the error `Encountered "." at line 1, column 119.`, here are the
>>> confusing parts:
>>>
>>> 1. The error happens when I executed the last part of the sql query:
>>>
>>> INSERT INTO `test_content_metrics`
>>> SELECT * FROM `test_content_metrics_view`
>>>
>>> 2. Line 1 column 119 doesn't exist in that SQL statement.
>>> 3. None of the SQL that I've written has a period "." in it.
>>>
>>>
>>>
>>> On Wed, Nov 23, 2022 at 8:32 AM Dan Hill  wrote:
>>>
 I'm using Flink 1.14.4

 On Wed, Nov 23, 2022, 02:28 yuxia  wrote:

> Hi, Dan.
> I'm wondering what type of error you expect. IMO, I think most engines
> throw parse error in such way which tell you encounter an unexpected 
> token.
>
> Best regards,
> Yuxia
>
> --
> *From: *"Dan Hill" 
> *To: *"User" 
> *Sent: *Wednesday, November 23, 2022, 1:55:20 PM
> *Subject: *Weird Flink SQL error
>
> Hi.  I'm hitting an obfuscated Flink SQL parser error.  Is there a way
> to get better errors for Flink SQL?  I'm hitting it when I wrap some of 
> the
> fields on an inner Row.
>
>
> *Works*
>
> CREATE TEMPORARY VIEW `test_content_metrics_view` AS
> SELECT
> DATE_FORMAT(TUMBLE_ROWTIME(rowtime, INTERVAL '1' DAY), '-MM-dd'),
> platform_id,
> content_id
> FROM content_event
> GROUP BY
> platform_id,
> content_id,
> TUMBLE(rowtime, INTERVAL '1' DAY)
>
> CREATE TABLE test_content_metrics (
>dt STRING NOT NULL,
>`platform_id` BIGINT,
>`content_id` STRING
> ) PARTITIONED BY (dt) WITH (
>'connector' = 'filesystem',
>'path' = 'etl/test_content_metrics',
>'format' = 'json',
> )
>
> INSERT INTO `test_content_metrics`
> SELECT * FROM `test_content_metrics_view`
>
>
> *Fails*
>
> Wrapping a couple parameters in a Row causes the following exception.
>
>  Caused by: org.apache.flink.sql.parser.impl.ParseException: 
> Encountered "." at line 1, column 119.
> Was expecting one of:
> ")" ...
> "," ...
>
>
> org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.generateParseException(FlinkSqlParserImpl.java:40981)
>
> org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.jj_consume_token(FlinkSqlParserImpl.java:40792)
>
> org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.ParenthesizedSimpleIdentifierList(FlinkSqlParserImpl.java:25220)
>
> org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.Expression3(FlinkSqlParserImpl.java:19925)
>
> org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.Expression2b(FlinkSqlParserImpl.java:19581)
>[...]
>
>
> CREATE TEMPORARY VIEW `test_content_metrics_view` AS
> SELECT
> DATE_FORMAT(TUMBLE_ROWTIME(rowtime, INTERVAL '1' DAY), '-MM-dd'),
> ROW(
> platform_id,
> content_id
> )
> FROM content_event
> GROUP BY
> platform_id,
> content_id,
> TUMBLE(rowtime, INTERVAL '1' DAY)
>
> CREATE TABLE test_content_metrics (
>dt STRING NOT NULL,
>`body` ROW(
>`platform_id` BIGINT,
>`content_id` STRING
>)
> ) PARTITIONED BY (dt) WITH (
>'connector' = 'filesystem',
>'path' = 'etl/test_content_metrics',
>'format' = 'json'
> )
>
> INSERT INTO `test_content_metrics`
> SELECT * FROM `test_content_metrics_view`
>
>
>
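
A hedged workaround sketch (not verified against Dan's exact schema): giving the
ROW expression an explicit alias and naming the view's columns often sidesteps
parser trouble with unnamed constructed select items. The column names `dt` and
`body` below are assumptions chosen to match the sink table in the thread:

```sql
-- Sketch only: assumes the sink columns are named dt and body as in the
-- failing CREATE TABLE above. Naming both the select items and the view
-- columns gives the parser and the INSERT an unambiguous schema.
CREATE TEMPORARY VIEW `test_content_metrics_view` (dt, body) AS
SELECT
    DATE_FORMAT(TUMBLE_ROWTIME(rowtime, INTERVAL '1' DAY), 'yyyy-MM-dd') AS dt,
    ROW(platform_id, content_id) AS body
FROM content_event
GROUP BY
    platform_id,
    content_id,
    TUMBLE(rowtime, INTERVAL '1' DAY)
```

If the parse error persists, listing the sink columns explicitly in the INSERT
(`INSERT INTO test_content_metrics (dt, body) SELECT dt, body FROM ...`) is
another variant worth trying.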


Flink 1.16.0: java.lang.NoSuchMethodException: org.apache.flink.metrics.prometheus.PrometheusReporter.<init>()

2022-11-23 Thread Clayton Wohl
When upgrading an application from Flink 1.14.6 to Flink 1.16.0, I get the
following exception:

ERROR org.apache.flink.runtime.metrics.ReporterSetup - Could not
instantiate metrics reporter prom. Metrics might not be exposed/reported.

java.lang.InstantiationException:
org.apache.flink.metrics.prometheus.PrometheusReporter

at java.lang.Class.newInstance(Unknown Source) ~[?:?]

at
org.apache.flink.runtime.metrics.ReporterSetup.loadViaReflection(ReporterSetup.java:467)
~[flink-runtime-1.16.0.jar:1.16.0]

at
org.apache.flink.runtime.metrics.ReporterSetup.loadReporter(ReporterSetup.java:409)
~[flink-runtime-1.16.0.jar:1.16.0]

at
org.apache.flink.runtime.metrics.ReporterSetup.setupReporters(ReporterSetup.java:328)
~[flink-runtime-1.16.0.jar:1.16.0]

at
org.apache.flink.runtime.metrics.ReporterSetup.fromConfiguration(ReporterSetup.java:209)
~[flink-runtime-1.16.0.jar:1.16.0]

at
org.apache.flink.runtime.taskexecutor.TaskManagerRunner.startTaskManagerRunnerServices(TaskManagerRunner.java:223)
~[flink-runtime-1.16.0.jar:1.16.0]

at
org.apache.flink.runtime.taskexecutor.TaskManagerRunner.start(TaskManagerRunner.java:288)
~[flink-runtime-1.16.0.jar:1.16.0]

at
org.apache.flink.runtime.taskexecutor.TaskManagerRunner.runTaskManager(TaskManagerRunner.java:481)
~[flink-runtime-1.16.0.jar:1.16.0]

at
org.apache.flink.runtime.taskexecutor.TaskManagerRunner.lambda$runTaskManagerProcessSecurely$5(TaskManagerRunner.java:525)
~[flink-runtime-1.16.0.jar:1.16.0]

at
org.apache.flink.runtime.security.contexts.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:28)
~[flink-runtime-1.16.0.jar:1.16.0]

at
org.apache.flink.runtime.taskexecutor.TaskManagerRunner.runTaskManagerProcessSecurely(TaskManagerRunner.java:525)
~[flink-runtime-1.16.0.jar:1.16.0]

at
org.apache.flink.runtime.taskexecutor.TaskManagerRunner.runTaskManagerProcessSecurely(TaskManagerRunner.java:505)
~[flink-runtime-1.16.0.jar:1.16.0]

at
org.apache.flink.runtime.taskexecutor.TaskManagerRunner.main(TaskManagerRunner.java:463)
~[flink-runtime-1.16.0.jar:1.16.0]

Caused by: java.lang.NoSuchMethodException:
org.apache.flink.metrics.prometheus.PrometheusReporter.<init>()

at java.lang.Class.getConstructor0(Unknown Source) ~[?:?]

... 13 more



Has the method mentioned been removed or changed in 1.16.0?


If it matters, I'm running this on Kubernetes with the Spotify Flink
Operator.


Apache vs Spotify Flink Operator?

2022-11-23 Thread Clayton Wohl
At my job, we are using the Spotify Flink Operator in production. Are there
any pros/cons of this Spotify Flink Operator versus the Apache Flink
Operator? We are particularly interested in the forthcoming autoscaling
functionality, but I understand that functionality isn't ready yet. Are
there any other advantages in the current version?

Apache Flink Operator 1.2.0 says it adds support for standalone deployment
mode, but still recommends native deployment mode. Which deployment mode
does the Spotify Flink Operator use? Is there any way I can see the
deployment mode from the GUI or the logs?


Re: Apache vs Spotify Flink Operator?

2022-11-23 Thread Őrhidi Mátyás
Hi Clayton,

We recently discussed these topics with Robert and Gyula in this podcast.

Cheers,
Matyas

On Wed, Nov 23, 2022 at 2:45 PM Clayton Wohl  wrote:

> At my job, we are using the Spotify Flink Operator in production. Are
> there any pros/cons of this Spotify Flink Operator versus the Apache Flink
> Operator? We are particularly interested in the forthcoming autoscaling
> functionality, but I understand that functionality isn't ready yet. Are
> there any other advantages in the current version?
>
> Apache Flink Operator 1.2.0 says it adds support for standalone deployment
> mode, but still recommends native deployment mode. Which deployment mode
> does the Spotify Flink Operator use? Is there any way I can see the
> deployment mode from the GUI or the logs?
>


Query about flink job manager dashboard

2022-11-23 Thread naga sudhakar
>
> Hi Team,
> Greetings!!!
> I am a software developer using Apache Flink and deploying Flink jobs
> with it. I have two queries about the Flink job manager dashboard. Can
> you please help with the below?
>
> 1) Is it possible to add a login mechanism for the Flink job manager
> dashboard, with role-based access for viewing running jobs, cancelling
> jobs, and submitting jobs?
> 2) Is it possible to disable the dashboard display but still perform the
> same operations via the API?
>
>
> Thanks,
> Nagasudhakar.
>
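
For what it's worth, a common pattern (a sketch, not a full answer): Flink
itself ships no authentication or role model for the web UI, so access control
is usually layered in front of it (an authenticating reverse proxy or ingress,
which is assumed here and not shown), while the REST API — which the dashboard
itself uses — exposes the same operations. A flink-conf.yaml sketch:

```yaml
# Sketch: reduce what the built-in dashboard can do, then put an
# authenticating proxy in front of the remaining UI/REST endpoint.
web.submit.enable: false      # disable job submission from the web UI
web.cancel.enable: false      # disable job cancellation from the web UI
rest.bind-address: 127.0.0.1  # only a local proxy can reach the REST port
```

With this in place, job listing, submission, and cancellation can still be done
against the REST API through the proxy, which is where authentication and
role-based rules would be enforced.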


Re: Flink 1.16.0: java.lang.NoSuchMethodException: org.apache.flink.metrics.prometheus.PrometheusReporter.<init>()

2022-11-23 Thread Clayton Wohl
I had to change this configuration:

metrics.reporter.prom.class:
"org.apache.flink.metrics.prometheus.PrometheusReporter"

to this for Flink 1.16:

metrics.reporter.prom.factory.class:
"org.apache.flink.metrics.prometheus.PrometheusReporterFactory"

Someone emailed me this fix directly. It works! Thank you :)
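
For completeness, a fuller flink-conf.yaml sketch of the factory-based
configuration (the port range below is an assumed example, not from the
thread):

```yaml
# Flink 1.16: metric reporters are instantiated via factories, so
# configure the factory class rather than the reporter class itself.
metrics.reporter.prom.factory.class: org.apache.flink.metrics.prometheus.PrometheusReporterFactory
metrics.reporter.prom.port: 9250-9260  # assumed example port range
```

The old `metrics.reporter.prom.class` style relied on reflectively calling a
no-arg constructor, which is why 1.16 failed with the
`NoSuchMethodException` above.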

On Wed, Nov 23, 2022 at 4:39 PM Clayton Wohl  wrote:

> When upgrading an application from Flink 1.14.6 to Flink 1.16.0, I get the
> following exception:
>
> ERROR org.apache.flink.runtime.metrics.ReporterSetup - Could not
> instantiate metrics reporter prom. Metrics might not be exposed/reported.
>
> java.lang.InstantiationException:
> org.apache.flink.metrics.prometheus.PrometheusReporter
>
> at java.lang.Class.newInstance(Unknown Source) ~[?:?]
>
> at
> org.apache.flink.runtime.metrics.ReporterSetup.loadViaReflection(ReporterSetup.java:467)
> ~[flink-runtime-1.16.0.jar:1.16.0]
>
> at
> org.apache.flink.runtime.metrics.ReporterSetup.loadReporter(ReporterSetup.java:409)
> ~[flink-runtime-1.16.0.jar:1.16.0]
>
> at
> org.apache.flink.runtime.metrics.ReporterSetup.setupReporters(ReporterSetup.java:328)
> ~[flink-runtime-1.16.0.jar:1.16.0]
>
> at
> org.apache.flink.runtime.metrics.ReporterSetup.fromConfiguration(ReporterSetup.java:209)
> ~[flink-runtime-1.16.0.jar:1.16.0]
>
> at
> org.apache.flink.runtime.taskexecutor.TaskManagerRunner.startTaskManagerRunnerServices(TaskManagerRunner.java:223)
> ~[flink-runtime-1.16.0.jar:1.16.0]
>
> at
> org.apache.flink.runtime.taskexecutor.TaskManagerRunner.start(TaskManagerRunner.java:288)
> ~[flink-runtime-1.16.0.jar:1.16.0]
>
> at
> org.apache.flink.runtime.taskexecutor.TaskManagerRunner.runTaskManager(TaskManagerRunner.java:481)
> ~[flink-runtime-1.16.0.jar:1.16.0]
>
> at
> org.apache.flink.runtime.taskexecutor.TaskManagerRunner.lambda$runTaskManagerProcessSecurely$5(TaskManagerRunner.java:525)
> ~[flink-runtime-1.16.0.jar:1.16.0]
>
> at
> org.apache.flink.runtime.security.contexts.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:28)
> ~[flink-runtime-1.16.0.jar:1.16.0]
>
> at
> org.apache.flink.runtime.taskexecutor.TaskManagerRunner.runTaskManagerProcessSecurely(TaskManagerRunner.java:525)
> ~[flink-runtime-1.16.0.jar:1.16.0]
>
> at
> org.apache.flink.runtime.taskexecutor.TaskManagerRunner.runTaskManagerProcessSecurely(TaskManagerRunner.java:505)
> ~[flink-runtime-1.16.0.jar:1.16.0]
>
> at
> org.apache.flink.runtime.taskexecutor.TaskManagerRunner.main(TaskManagerRunner.java:463)
> ~[flink-runtime-1.16.0.jar:1.16.0]
>
> Caused by: java.lang.NoSuchMethodException:
> org.apache.flink.metrics.prometheus.PrometheusReporter.<init>()
>
> at java.lang.Class.getConstructor0(Unknown Source) ~[?:?]
>
> ... 13 more
>
>
>
> Has the method mentioned been removed or changed in 1.16.0?
>
>
> If it matters, I'm running this on Kubernetes with the Spotify Flink
> Operator.
>