[ https://issues.apache.org/jira/browse/FLINK-19486?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
appleyuchi updated FLINK-19486:
-------------------------------
Description:

*The reason why I reopened it is that:*

*I can {color:#de350b}not{color} {color:#de350b}resume from a checkpoint manually{color} when I stop the following program by force (maybe this is NOT supported officially?).*

*① I {color:#de350b}have subscribed to the mailing list{color}, and David Anderson answered this several days ago, but he did not tell me how to fix it. I guess he is very busy.*
*② My {color:#de350b}Stack Overflow account{color} is not allowed to ask questions.*
*③ No one in the Alibaba {color:#de350b}Flink DingTalk group{color} can answer this question.*
*④ I have read the [documentation|https://ci.apache.org/projects/flink/flink-docs-stable/ops/state/checkpoints.html] carefully, but I still cannot make it work.*
*⑤ {color:#de350b}Apologies for posting it here{color}, but how can I make sure this question is a support request rather than a bug, or due to my own mistake? There is no successful example in Google/Baidu, am I right?*
*⑥ {color:#de350b}Fewer than 1000 people have visited my [failure record|https://yuchi.blog.csdn.net/article/details/108896441]{color} about this experiment.*

*Please help, thanks~*

*The key point is:*
*① Of course I know checkpoints are for automatic recovery; that is basic.*
*② I want to stop the program by force (to simulate a production environment) with checkpointing enabled, and then resume from the checkpoint manually.*

*I want to run an experiment with "incremental checkpoints".*

my code is: [https://paste.ubuntu.com/p/DpTyQKq6Vk/]

pom.xml is:

{code:xml}
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>example</groupId>
    <artifactId>datastream_api</artifactId>
    <version>1.0-SNAPSHOT</version>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.1</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.scala-tools</groupId>
                <artifactId>maven-scala-plugin</artifactId>
                <version>2.15.2</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                            <goal>testCompile</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>

    <dependencies>
        <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-streaming-scala -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-scala_2.11</artifactId>
            <version>1.11.1</version>
            <!--<scope>provided</scope>-->
        </dependency>
        <!--
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-java_2.12</artifactId>
            <version>1.11.1</version>
        </dependency>
        -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-clients_2.11</artifactId>
            <version>1.11.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-statebackend-rocksdb_2.11</artifactId>
            <version>1.11.2</version>
            <!--<scope>test</scope>-->
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>3.3.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-core</artifactId>
            <version>1.11.1</version>
        </dependency>
        <!--
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-simple</artifactId>
            <version>1.7.25</version>
        </dependency>
        -->
        <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-cep -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-cep_2.11</artifactId>
            <version>1.11.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-cep-scala_2.11</artifactId>
            <version>1.11.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-scala_2.11</artifactId>
            <version>1.11.1</version>
        </dependency>
        <dependency>
            <groupId>org.projectlombok</groupId>
            <artifactId>lombok</artifactId>
            <version>1.18.4</version>
            <!--<scope>provided</scope>-->
        </dependency>
    </dependencies>
</project>
{code}

the error I got is: [https://paste.ubuntu.com/p/49HRYXFzR2/]

*Part of the error is:*
*Caused by: java.lang.IllegalStateException: Unexpected state handle type, expected: class org.apache.flink.runtime.state.KeyGroupsStateHandle, but found: class org.apache.flink.runtime.state.IncrementalRemoteKeyedStateHandle*

The steps are:
1. mvn clean scala:compile compile package
2. nc -lk 9999
3. flink run -c wordcount_increstate datastream_api-1.0-SNAPSHOT.jar
   Job has been submitted with JobID df6d62a43620f258155b8538ef0ddf1b
4. enter the following lines in the nc -lk 9999 session:
   before
   error
   error
   error
   error
5. flink run -s hdfs://Desktop:9000/tmp/flinkck/df6d62a43620f258155b8538ef0ddf1b/chk-22 -c StateWordCount datastream_api-1.0-SNAPSHOT.jar

Then the above error happens.

Please help, thanks~!
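The steps above can be sketched as one shell session. Note a hedged assumption in this sketch, not a confirmed diagnosis: it resumes with the *same* entry class that produced the checkpoint (`wordcount_increstate`), because restoring an incremental RocksDB checkpoint into a job that does not configure the RocksDB state backend (or a different job graph, as with `StateWordCount` in step 5) is a plausible way to hit exactly this kind of state-handle mismatch.

```shell
# 1. Build the job jar
mvn clean scala:compile compile package

# 2. Feed input on port 9999 (run in a separate terminal and type the test words there)
nc -lk 9999

# 3. Submit the job that creates incremental RocksDB checkpoints
flink run -c wordcount_increstate datastream_api-1.0-SNAPSHOT.jar
# -> Job has been submitted with JobID df6d62a43620f258155b8538ef0ddf1b

# 5. Resume from the retained checkpoint. This sketch deliberately uses the
#    SAME entry class that produced the checkpoint; whether a different class
#    can legally restore this state is exactly what is in question here.
flink run -s hdfs://Desktop:9000/tmp/flinkck/df6d62a43620f258155b8538ef0ddf1b/chk-22 \
    -c wordcount_increstate datastream_api-1.0-SNAPSHOT.jar
```

These commands require a running Flink 1.11 cluster with HDFS reachable at hdfs://Desktop:9000, so they are a recipe rather than a standalone script.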
> expected: class org.apache.flink.runtime.state.KeyGroupsStateHandle, but
> found: class org.apache.flink.runtime.state.IncrementalRemoteKeyedStateHandle
> ------------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: FLINK-19486
>                 URL: https://issues.apache.org/jira/browse/FLINK-19486
>             Project: Flink
>          Issue Type: Bug
>    Affects Versions: 1.11.1
>            Reporter: appleyuchi
>            Priority: Minor
>

-- 
This message was sent by Atlassian Jira
(v8.3.4#803005)