[jira] [Created] (FLINK-35374) Flink 1.14 Kafka connector demo error

2024-05-16 Thread hongxu han (Jira)
hongxu han created FLINK-35374:
--

 Summary: Flink 1.14 Kafka connector demo error
 Key: FLINK-35374
 URL: https://issues.apache.org/jira/browse/FLINK-35374
 Project: Flink
  Issue Type: Bug
  Components: Documentation
Affects Versions: 1.14.4
Reporter: hongxu han
 Attachments: image-2024-05-16-16-14-52-414.png, 
image-2024-05-16-16-16-01-621.png

!image-2024-05-16-16-14-52-414.png|width=249,height=139!

It should be:

!image-2024-05-16-16-16-01-621.png|width=486,height=100!



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Created] (FLINK-35260) Translate "Watermark alignment" page into Chinese

2024-04-28 Thread hongxu han (Jira)
hongxu han created FLINK-35260:
--

 Summary: Translate "Watermark alignment" page into Chinese
 Key: FLINK-35260
 URL: https://issues.apache.org/jira/browse/FLINK-35260
 Project: Flink
  Issue Type: Improvement
  Components: chinese-translation, Documentation
Affects Versions: 1.19.0
Reporter: hongxu han


The Watermark alignment page lacks a Chinese translation.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Created] (FLINK-35252) Update the operators marked as deprecated in the example programs on the official website

2024-04-27 Thread hongxu han (Jira)
hongxu han created FLINK-35252:
--

 Summary: Update the operators marked as deprecated in the example programs on the official website
 Key: FLINK-35252
 URL: https://issues.apache.org/jira/browse/FLINK-35252
 Project: Flink
  Issue Type: Improvement
  Components: Documentation / Training / Exercises
Reporter: hongxu han
 Attachments: image-2024-04-28-10-05-37-671.png, 
image-2024-04-28-10-07-11-736.png, image-2024-04-28-10-07-36-248.png, 
image-2024-04-28-10-08-14-928.png, image-2024-04-28-10-09-32-184.png

Update the operators marked as deprecated in the example programs on the official website.

!image-2024-04-28-10-07-36-248.png|width=386,height=199!

!image-2024-04-28-10-08-14-928.png|width=448,height=82!

The recommended usage is now Duration.ofSeconds(5).
!image-2024-04-28-10-09-32-184.png|width=474,height=78!
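As one illustration of the Time → Duration migration (a minimal sketch; the event type MyEvent and its getTimestamp() accessor are hypothetical and may not match the snippets in the screenshots above):
{code:java}
import java.time.Duration;
import org.apache.flink.api.common.eventtime.WatermarkStrategy;

// Old, deprecated style (roughly): new BoundedOutOfOrdernessTimestampExtractor<MyEvent>(Time.seconds(5)) { ... }
// Recommended style: pass a java.time.Duration directly.
WatermarkStrategy<MyEvent> strategy =
        WatermarkStrategy.<MyEvent>forBoundedOutOfOrderness(Duration.ofSeconds(5))
                .withTimestampAssigner((event, recordTimestamp) -> event.getTimestamp());
{code}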



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Created] (FLINK-35236) Flink 1.19 Translation error on the Chinese official website

2024-04-25 Thread hongxu han (Jira)
hongxu han created FLINK-35236:
--

 Summary: Flink 1.19 Translation error on the Chinese official 
website
 Key: FLINK-35236
 URL: https://issues.apache.org/jira/browse/FLINK-35236
 Project: Flink
  Issue Type: Bug
  Components: chinese-translation
Affects Versions: 1.19.0
Reporter: hongxu han


The translation error is on the Chinese version of the following page (the link anchors to the "Order of Processing" section):

[https://nightlies.apache.org/flink/flink-docs-release-1.19/docs/dev/datastream/execution_mode/#order-of-processing]



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Created] (FLINK-32958) Support VIEW as a source table in CREATE TABLE ... LIKE statement

2023-08-25 Thread Han (Jira)
Han created FLINK-32958:
---

 Summary: Support VIEW as a source table in CREATE TABLE ... LIKE statement
 Key: FLINK-32958
 URL: https://issues.apache.org/jira/browse/FLINK-32958
 Project: Flink
  Issue Type: Improvement
  Components: Table SQL / Planner
Affects Versions: 1.17.1
Reporter: Han


We can't create a table from a view through the CREATE TABLE ... LIKE statement.

 

Case 1:
{code:sql}
create view source_view as select id,val from source;
create table sink with ('connector' = 'print') like source_view (excluding all);
insert into sink select * from source_view;{code}
Case 2:
{code:java}
DataStreamSource source = ...;
tEnv.createTemporaryView("source", source);
tEnv.executeSql("create table sink with ('connector' = 'print') like source (excluding all)");
tEnv.executeSql("insert into sink select * from source");{code}
 

The above cases will throw an exception:
{code:java}
Source table '`default_catalog`.`default_database`.`source`' of the LIKE clause can not be a VIEW{code}
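
Until a VIEW is accepted in the LIKE clause, a possible workaround (a minimal sketch against the view from case 1; the column names and types id INT and val STRING are assumptions, since the schema of `source` is not shown here) is to declare the sink schema explicitly:
{code:java}
// Declare the sink schema by hand instead of deriving it from the view via LIKE.
tEnv.executeSql("create table sink (id INT, val STRING) with ('connector' = 'print')");
tEnv.executeSql("insert into sink select id, val from source_view");
{code}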



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Created] (FLINK-30708) Typo in the doc `Determinism In Continuous Queries`

2023-01-16 Thread Jie Han (Jira)
Jie Han created FLINK-30708:
---

 Summary: Typo in the doc `Determinism In Continuous Queries`
 Key: FLINK-30708
 URL: https://issues.apache.org/jira/browse/FLINK-30708
 Project: Flink
  Issue Type: Bug
  Components: Documentation
Reporter: Jie Han
 Attachments: 截屏2023-01-17 11.04.12.png

The URL of the link labeled `continuous query on dynamic tables` is wrong.

!截屏2023-01-17 11.04.12.png!



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Created] (FLINK-26039) Incorrect value getter in map unnest table function

2022-02-08 Thread Han (Jira)
Han created FLINK-26039:
---

 Summary: Incorrect value getter in map unnest table function
 Key: FLINK-26039
 URL: https://issues.apache.org/jira/browse/FLINK-26039
 Project: Flink
  Issue Type: Bug
  Components: Table SQL / Runtime
Affects Versions: 1.14.3
Reporter: Han
 Fix For: 1.15.0


Suppose we have a map field that needs to be expanded.

 
{code:java}
CREATE TABLE t (
    id INT,
    map_field MAP
) WITH (
    -- ...
);

SELECT id, k, v FROM t, unnest(map_field) as A(k, v);{code}

We will get the following runtime exception:
{code:java}
Caused by: java.lang.ClassCastException: org.apache.flink.table.data.binary.BinaryStringData cannot be cast to java.lang.Integer
    at org.apache.flink.table.data.GenericRowData.getInt(GenericRowData.java:149)
    at org.apache.flink.table.data.utils.JoinedRowData.getInt(JoinedRowData.java:149)
    at org.apache.flink.table.data.RowData.lambda$createFieldGetter$245ca7d1$6(RowData.java:245)
    at org.apache.flink.table.data.RowData.lambda$createFieldGetter$25774257$1(RowData.java:296)
    at org.apache.flink.table.runtime.typeutils.RowDataSerializer.copyRowData(RowDataSerializer.java:170)
    at org.apache.flink.table.runtime.typeutils.RowDataSerializer.copy(RowDataSerializer.java:131)
    at org.apache.flink.table.runtime.typeutils.RowDataSerializer.copy(RowDataSerializer.java:48)
    at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.pushToOperator(CopyingChainingOutput.java:80)
    at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:57)
    at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:29)
    at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:56)
    at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:29)
    at org.apache.flink.table.runtime.util.StreamRecordCollector.collect(StreamRecordCollector.java:44)
    at org.apache.flink.table.runtime.collector.TableFunctionCollector.outputResult(TableFunctionCollector.java:68)
    at StreamExecCorrelate$10$TableFunctionCollector$4.collect(Unknown Source)
    at org.apache.flink.table.runtime.collector.WrappingCollector.outputResult(WrappingCollector.java:39)
    at StreamExecCorrelate$10$TableFunctionResultConverterCollector$8.collect(Unknown Source)
    at org.apache.flink.table.functions.TableFunction.collect(TableFunction.java:197)
    at org.apache.flink.table.runtime.functions.SqlUnnestUtils$MapUnnestTableFunction.eval(SqlUnnestUtils.java:169)
    at StreamExecCorrelate$10.processElement(Unknown Source)
    at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.pushToOperator(CopyingChainingOutput.java:82)
    at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:57)
    at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:29)
    at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:56)
    at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:29)
{code}
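
Until the value getter is fixed, a possible workaround (a minimal sketch; it assumes, for illustration, that the map type is MAP<STRING, INT>, and the function name map_explode is made up) is to expand the map with a user-defined table function instead of UNNEST:
{code:java}
import java.util.Map;
import org.apache.flink.table.annotation.DataTypeHint;
import org.apache.flink.table.annotation.FunctionHint;
import org.apache.flink.table.functions.TableFunction;
import org.apache.flink.types.Row;

// Emits one (k, v) row per map entry.
@FunctionHint(output = @DataTypeHint("ROW<k STRING, v INT>"))
public class MapExplode extends TableFunction<Row> {
    public void eval(Map<String, Integer> map) {
        if (map == null) {
            return;
        }
        for (Map.Entry<String, Integer> entry : map.entrySet()) {
            collect(Row.of(entry.getKey(), entry.getValue()));
        }
    }
}

// Registration and usage:
// tEnv.createTemporarySystemFunction("map_explode", MapExplode.class);
// SELECT id, k, v FROM t, LATERAL TABLE(map_explode(map_field)) AS A(k, v)
{code}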



--
This message was sent by Atlassian Jira
(v8.20.1#820001)


[jira] [Created] (FLINK-25489) Failed at task 'node and npm install' when building Flink from source on Mac M1

2021-12-30 Thread jie han (Jira)
jie han created FLINK-25489:
---

 Summary: Failed at task 'node and npm install' when building Flink from source on Mac M1
 Key: FLINK-25489
 URL: https://issues.apache.org/jira/browse/FLINK-25489
 Project: Flink
  Issue Type: Bug
  Components: Build System
Affects Versions: 1.14.2
 Environment: macbook pro, apple m1 pro
Reporter: jie han


When I build Flink from source on my M1 Mac, I hit the following problem:

Failed to execute goal com.github.eirslett:frontend-maven-plugin with version 1.9.1

The error is: _Could not download Node.js: Got error code 404 from the server_

I found the solution on GitHub:

[https://github.com/eirslett/frontend-maven-plugin/issues/952]

Upgrading frontend-maven-plugin to version 1.11.0 fixes it.



--
This message was sent by Atlassian Jira
(v8.20.1#820001)


[jira] [Created] (FLINK-21017) Fix missing backquote in table connectors docs

2021-01-18 Thread Han (Jira)
Han created FLINK-21017:
---

 Summary: Fix missing backquote in table connectors docs
 Key: FLINK-21017
 URL: https://issues.apache.org/jira/browse/FLINK-21017
 Project: Flink
  Issue Type: Improvement
  Components: Documentation
Affects Versions: 1.12.0
Reporter: Han
 Fix For: 1.13.0






--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Created] (FLINK-20550) Wrong savepoint config in some docs

2020-12-09 Thread Han (Jira)
Han created FLINK-20550:
---

 Summary: Wrong savepoint config in some docs
 Key: FLINK-20550
 URL: https://issues.apache.org/jira/browse/FLINK-20550
 Project: Flink
  Issue Type: Improvement
  Components: Documentation
Affects Versions: 1.12.0, 1.13.0
Reporter: Han
 Fix For: 1.13.0


Fix the config key 'state.savepoint.dir' to 'state.savepoints.dir' in the docs.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Created] (FLINK-11595) Gelly addEdge in certain circumstances still includes duplicate vertices

2019-02-13 Thread Calvin Han (JIRA)
Calvin Han created FLINK-11595:
--

 Summary: Gelly addEdge in certain circumstances still includes duplicate vertices
 Key: FLINK-11595
 URL: https://issues.apache.org/jira/browse/FLINK-11595
 Project: Flink
  Issue Type: Bug
  Components: Gelly
Affects Versions: 1.7.1
 Environment: MacOS, intelliJ
Reporter: Calvin Han


Assuming a base graph constructed by:

```

public class GraphCorn {

    public static Graph<String, VertexLabel, EdgeLabel> gc;

    public GraphCorn(String filename) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        DataSet<Tuple6<String, String, String, String, String, String>> csvInput =
                env.readCsvFile(filename)
                        .types(String.class, String.class, String.class, String.class, String.class, String.class);

        DataSet<Vertex<String, VertexLabel>> srcTuples = csvInput.project(0, 2)
                .map(new MapFunction<Tuple, Vertex<String, VertexLabel>>() {
                    @Override
                    public Vertex<String, VertexLabel> map(Tuple tuple) throws Exception {
                        VertexLabel lb = new VertexLabel(Util.hash(tuple.getField(1)));
                        return new Vertex<>(tuple.getField(0), lb);
                    }
                }).returns(new TypeHint<Vertex<String, VertexLabel>>() {});

        DataSet<Vertex<String, VertexLabel>> dstTuples = csvInput.project(1, 3)
                .map(new MapFunction<Tuple, Vertex<String, VertexLabel>>() {
                    @Override
                    public Vertex<String, VertexLabel> map(Tuple tuple) throws Exception {
                        VertexLabel lb = new VertexLabel(Util.hash(tuple.getField(1)));
                        return new Vertex<>(tuple.getField(0), lb);
                    }
                }).returns(new TypeHint<Vertex<String, VertexLabel>>() {});

        DataSet<Vertex<String, VertexLabel>> vertexTuples =
                srcTuples.union(dstTuples).distinct(0);

        DataSet<Edge<String, EdgeLabel>> edgeTuples = csvInput.project(0, 1, 4, 5)
                .map(new MapFunction<Tuple, Edge<String, EdgeLabel>>() {
                    @Override
                    public Edge<String, EdgeLabel> map(Tuple tuple) throws Exception {
                        EdgeLabel lb = new EdgeLabel(Util.hash(tuple.getField(2)),
                                Long.parseLong(tuple.getField(3)));
                        return new Edge<>(tuple.getField(0), tuple.getField(1), lb);
                    }
                }).returns(new TypeHint<Edge<String, EdgeLabel>>() {});

        this.gc = Graph.fromDataSet(vertexTuples, edgeTuples, env);
    }
}

```

Base graph CSV:

```

0,1,a,b,c,0
0,2,a,d,e,1
1,2,b,d,f,2

```

Attempt to add edges using the following function:

```

try (BufferedReader br = new BufferedReader(new FileReader(this.fileName))) {
    for (String line; (line = br.readLine()) != null; ) {
        String[] attributes = line.split(",");
        assert (attributes.length == 6);
        String srcID = attributes[0];
        String dstID = attributes[1];
        String srcLb = attributes[2];
        String dstLb = attributes[3];
        String edgeLb = attributes[4];
        String ts = attributes[5];

        Vertex<String, VertexLabel> src = new Vertex<>(srcID, new VertexLabel(Util.hash(srcLb)));
        Vertex<String, VertexLabel> dst = new Vertex<>(dstID, new VertexLabel(Util.hash(dstLb)));
        EdgeLabel edge = new EdgeLabel(Util.hash(edgeLb), Long.parseLong(ts));

        GraphCorn.gc = GraphCorn.gc.addEdge(src, dst, edge);
    }
} catch (Exception e) {
    System.err.println(e.getMessage());
}

```

The graph components to add are:

```

0,4,a,d,k,3
1,3,b,a,g,3
2,3,d,a,h,4

```

GraphCorn.gc will contain duplicate nodes 0, 1, and 2 (those that already exist in the base graph), which should not be the case according to the documentation.
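
A possible workaround until this is resolved (a minimal sketch; it assumes that collapsing vertices by ID, i.e. field 0 of the Vertex tuple, is acceptable) is to rebuild the graph with distinct vertices after the edges have been added:

```

// env is the same ExecutionEnvironment that was used to build the base graph.
GraphCorn.gc = Graph.fromDataSet(
        GraphCorn.gc.getVertices().distinct(0),
        GraphCorn.gc.getEdges(),
        env);

```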


--
This message was sent by Atlassian JIRA
(v7.6.3#76005)