[jira] [Assigned] (FLINK-11968) Fix runtime SingleElementIterator.iterator and remove table.SingleElementIterator

2019-03-20 Thread Alexander Chermenin (JIRA)


 [ 
https://issues.apache.org/jira/browse/FLINK-11968?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Alexander Chermenin reassigned FLINK-11968:
---

Assignee: Alexander Chermenin

> Fix runtime SingleElementIterator.iterator and remove 
> table.SingleElementIterator
> -
>
> Key: FLINK-11968
> URL: https://issues.apache.org/jira/browse/FLINK-11968
> Project: Flink
>  Issue Type: Bug
>  Components: Runtime / Operators
>Reporter: Jingsong Lee
>Assignee: Alexander Chermenin
>Priority: Major
>
> {code:java}
> @Override
> public Iterator<E> iterator() {
>return this;
> }
> {code}
> In iterator() we need to set available to true, otherwise we can iterate only once.
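
For reference, a minimal sketch of what the fix could look like, assuming the class keeps its element in a {{current}} field and tracks exhaustion with an {{available}} flag (names inferred from the description, not necessarily the actual implementation):

{code:java}
import java.util.Iterator;
import java.util.NoSuchElementException;

public class SingleElementIterator<E> implements Iterator<E>, Iterable<E> {

    private E current;
    private boolean available;

    public void set(E element) {
        this.current = element;
        this.available = true;
    }

    @Override
    public boolean hasNext() {
        return available;
    }

    @Override
    public E next() {
        if (!available) {
            throw new NoSuchElementException();
        }
        available = false;
        return current;
    }

    @Override
    public Iterator<E> iterator() {
        // Reset the flag so each call to iterator() yields a fresh
        // single-element traversal instead of an already exhausted one.
        available = true;
        return this;
    }
}
{code}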



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (FLINK-4641) Support branching CEP patterns

2017-06-15 Thread Alexander Chermenin (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-4641?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16050551#comment-16050551
 ] 

Alexander Chermenin commented on FLINK-4641:


Hi [~dian.fu]! Yes of course, you're welcome! Unfortunately I don't have enough 
time to do it.

> Support branching CEP patterns 
> ---
>
> Key: FLINK-4641
> URL: https://issues.apache.org/jira/browse/FLINK-4641
> Project: Flink
>  Issue Type: Improvement
>  Components: CEP
>Reporter: Till Rohrmann
>Assignee: Alexander Chermenin
>
> We should add support for branching CEP patterns to the Pattern API. 
> {code}
>      |--> B --|
>      |        |
> A --            --> D
>      |        |
>      |--> C --|
> {code}
> This feature will require changes to the {{Pattern}} class and the 
> {{NFACompiler}}.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Assigned] (FLINK-4565) Support for SQL IN operator

2017-02-03 Thread Alexander Chermenin (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-4565?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Alexander Chermenin reassigned FLINK-4565:
--

Assignee: (was: Alexander Chermenin)

> Support for SQL IN operator
> ---
>
> Key: FLINK-4565
> URL: https://issues.apache.org/jira/browse/FLINK-4565
> Project: Flink
>  Issue Type: Improvement
>  Components: Table API & SQL
>Reporter: Timo Walther
>
> It seems that Flink SQL supports the uncorrelated sub-query IN operator. But 
> it should also be available in the Table API and tested.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Assigned] (FLINK-4303) Add CEP examples

2017-01-24 Thread Alexander Chermenin (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-4303?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Alexander Chermenin reassigned FLINK-4303:
--

Assignee: Alexander Chermenin

> Add CEP examples
> 
>
> Key: FLINK-4303
> URL: https://issues.apache.org/jira/browse/FLINK-4303
> Project: Flink
>  Issue Type: Improvement
>  Components: CEP
>Affects Versions: 1.1.0
>Reporter: Timo Walther
>Assignee: Alexander Chermenin
>
> Neither CEP Java nor CEP Scala contain a runnable example. The example on the 
> website is also not runnable without adding some additional code.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Assigned] (FLINK-3617) NPE from CaseClassSerializer when dealing with null Option field

2017-01-10 Thread Alexander Chermenin (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-3617?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Alexander Chermenin reassigned FLINK-3617:
--

Assignee: Alexander Chermenin

> NPE from CaseClassSerializer when dealing with null Option field
> 
>
> Key: FLINK-3617
> URL: https://issues.apache.org/jira/browse/FLINK-3617
> Project: Flink
>  Issue Type: Bug
>  Components: Core
>Affects Versions: 1.0.0
>Reporter: Jamie Grier
>Assignee: Alexander Chermenin
>
> This error occurs when serializing a Scala case class with a field of 
> Option[] type where the value is not Some or None, but null.
> If this is not supported, we should have a good error message.
> java.lang.RuntimeException: ConsumerThread threw an exception: null
>   at 
> org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09.run(FlinkKafkaConsumer09.java:336)
>   at 
> org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:78)
>   at 
> org.apache.flink.streaming.runtime.tasks.SourceStreamTask.run(SourceStreamTask.java:56)
>   at 
> org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:224)
>   at org.apache.flink.runtime.taskmanager.Task.run(Task.java:559)
>   at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.RuntimeException
>   at 
> org.apache.flink.streaming.runtime.io.RecordWriterOutput.collect(RecordWriterOutput.java:81)
>   at 
> org.apache.flink.streaming.runtime.io.RecordWriterOutput.collect(RecordWriterOutput.java:39)
>   at 
> org.apache.flink.streaming.api.operators.StreamSource$NonTimestampContext.collect(StreamSource.java:158)
>   at 
> org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09$ConsumerThread.run(FlinkKafkaConsumer09.java:473)
> Caused by: java.lang.NullPointerException
>   at 
> org.apache.flink.api.scala.typeutils.CaseClassSerializer.serialize(CaseClassSerializer.scala:100)
>   at 
> org.apache.flink.api.scala.typeutils.CaseClassSerializer.serialize(CaseClassSerializer.scala:30)
>   at 
> org.apache.flink.api.scala.typeutils.CaseClassSerializer.serialize(CaseClassSerializer.scala:100)
>   at 
> org.apache.flink.api.scala.typeutils.CaseClassSerializer.serialize(CaseClassSerializer.scala:30)
>   at 
> org.apache.flink.streaming.runtime.streamrecord.StreamRecordSerializer.serialize(StreamRecordSerializer.java:107)
>   at 
> org.apache.flink.streaming.runtime.streamrecord.StreamRecordSerializer.serialize(StreamRecordSerializer.java:42)
>   at 
> org.apache.flink.runtime.plugable.SerializationDelegate.write(SerializationDelegate.java:56)
>   at 
> org.apache.flink.runtime.io.network.api.serialization.SpanningRecordSerializer.addRecord(SpanningRecordSerializer.java:79)
>   at 
> org.apache.flink.runtime.io.network.api.writer.RecordWriter.emit(RecordWriter.java:84)
>   at 
> org.apache.flink.streaming.runtime.io.StreamRecordWriter.emit(StreamRecordWriter.java:86)
>   at 
> org.apache.flink.streaming.runtime.io.RecordWriterOutput.collect(RecordWriterOutput.java:78)
>   ... 3 more



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Assigned] (FLINK-4641) Support branching CEP patterns

2016-12-28 Thread Alexander Chermenin (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-4641?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Alexander Chermenin reassigned FLINK-4641:
--

Assignee: Alexander Chermenin

> Support branching CEP patterns 
> ---
>
> Key: FLINK-4641
> URL: https://issues.apache.org/jira/browse/FLINK-4641
> Project: Flink
>  Issue Type: Improvement
>  Components: CEP
>Reporter: Till Rohrmann
>Assignee: Alexander Chermenin
>
> We should add support for branching CEP patterns to the Pattern API. 
> {code}
>      |--> B --|
>      |        |
> A --            --> D
>      |        |
>      |--> C --|
> {code}
> This feature will require changes to the {{Pattern}} class and the 
> {{NFACompiler}}.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (FLINK-3615) Add support for non-native SQL types

2016-12-28 Thread Alexander Chermenin (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3615?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15782410#comment-15782410
 ] 

Alexander Chermenin commented on FLINK-3615:


Hi all. Is this issue still relevant, or has it been solved in FLINK-3916?

> Add support for non-native SQL types
> 
>
> Key: FLINK-3615
> URL: https://issues.apache.org/jira/browse/FLINK-3615
> Project: Flink
>  Issue Type: Improvement
>  Components: Table API & SQL
>Reporter: Vasia Kalavri
>
> The TypeConverter of the Table API currently only supports basic types. We 
> should maybe re-design the way {{sqlTypeToTypeInfo}} works. It is used in the 
> {{CodeGenerator}} for visiting literals, in {{DataSetAggregate}} to create 
> the {{RowTypeInfo}} and in {{determineReturnType}}. We could maybe provide a 
> custom implementation per operator to determine the return type, based on the 
> input fields.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (FLINK-5319) ClassCastException when reusing an inherited method reference as KeySelector for different classes

2016-12-14 Thread Alexander Chermenin (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-5319?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15748508#comment-15748508
 ] 

Alexander Chermenin commented on FLINK-5319:


Well, we could change the signature of the {{map}} methods (and others) from 
{code}public <R> MapOperator<T, R> map(MapFunction<T, R> mapper){code} to 
{code}public <R> MapOperator<T, R> map(MapFunction<? super T, R> mapper){code} 
to make code like the following possible:
{code}DataSet<Integer> intDataSet = env.fromElements(1, 2, 3);
DataSet<Long> longDataSet = env.fromElements(1L, 2L, 3L);

MapFunction<Number, Double> function = Number::doubleValue;

List<Double> intToDoubles = intDataSet.map(function).collect();
List<Double> longToDoubles = longDataSet.map(function).collect();{code}
What do you think about it?
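
To illustrate the generics mechanics behind the proposal in isolation, here is a standalone sketch; the simplified {{Mapper}} interface and the {{mapAll}} helper below are hypothetical stand-ins, not Flink's actual API:

{code:java}
import java.io.Serializable;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class WildcardMapperSketch {

    // Simplified stand-in for a MapFunction<T, O>-style interface.
    interface Mapper<T, O> extends Serializable {
        O map(T value);
    }

    // The lower-bounded wildcard on the input type is the key point:
    // a Mapper<Number, Double> is accepted for both Integer and Long inputs.
    static <T, R> List<R> mapAll(List<T> input, Mapper<? super T, R> mapper) {
        List<R> out = new ArrayList<>();
        for (T value : input) {
            out.add(mapper.map(value));
        }
        return out;
    }

    public static void main(String[] args) {
        Mapper<Number, Double> toDouble = Number::doubleValue;

        // One mapper instance reused for different element types that extend Number.
        System.out.println(mapAll(Arrays.asList(1, 2, 3), toDouble));
        System.out.println(mapAll(Arrays.asList(1L, 2L, 3L), toDouble));
    }
}
{code}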

> ClassCastException when reusing an inherited method reference as KeySelector 
> for different classes
> --
>
> Key: FLINK-5319
> URL: https://issues.apache.org/jira/browse/FLINK-5319
> Project: Flink
>  Issue Type: Bug
>  Components: Core
>Affects Versions: 1.2.0
>Reporter: Alexander Chermenin
>Assignee: Timo Walther
>
> Code sample:
> {code}static abstract class A {
> int id;
> A(int id) {this.id = id; }
> int getId() { return id; }
> }
> static class B extends A { B(int id) { super(id % 3); } }
> static class C extends A { C(int id) { super(id % 2); } }
> private static B b(int id) { return new B(id); }
> private static C c(int id) { return new C(id); }
> /**
>  * Main method.
>  */
> public static void main(String[] args) throws Exception {
> StreamExecutionEnvironment environment =
> StreamExecutionEnvironment.getExecutionEnvironment();
> B[] bs = IntStream.range(0, 10).mapToObj(Test::b).toArray(B[]::new);
> C[] cs = IntStream.range(0, 10).mapToObj(Test::c).toArray(C[]::new);
> DataStreamSource<B> bStream = environment.fromElements(bs);
> DataStreamSource<C> cStream = environment.fromElements(cs);
> bStream.keyBy((KeySelector<B, Integer>) A::getId).print();
> cStream.keyBy((KeySelector<C, Integer>) A::getId).print();
> environment.execute();
> }
> {code}
> This code throws the following exception:
> {code}Exception in thread "main" 
> org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
>   at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply$mcV$sp(JobManager.scala:901)
>   at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply(JobManager.scala:844)
>   at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply(JobManager.scala:844)
>   at 
> scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
>   at 
> scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
>   at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
>   at 
> akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:401)
>   at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>   at 
> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.pollAndExecAll(ForkJoinPool.java:1253)
>   at 
> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1346)
>   at 
> scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>   at 
> scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
> Caused by: java.lang.RuntimeException: Could not extract key from 
> org.sample.flink.examples.Test$C@5e1a8111
>   at 
> org.apache.flink.streaming.runtime.io.RecordWriterOutput.collect(RecordWriterOutput.java:75)
>   at 
> org.apache.flink.streaming.runtime.io.RecordWriterOutput.collect(RecordWriterOutput.java:39)
>   at 
> org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:746)
>   at 
> org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:724)
>   at 
> org.apache.flink.streaming.api.operators.StreamSourceContexts$NonTimestampContext.collect(StreamSourceContexts.java:84)
>   at 
> org.apache.flink.streaming.api.functions.source.FromElementsFunction.run(FromElementsFunction.java:127)
>   at 
> org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:75)
>   at 
> org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:56)
>   at 
> org.apache.flink.streaming.runtime.tasks.SourceStreamTask.run(SourceStreamTask.java:56)
>   at 
> org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:269)
>   at 

[jira] [Commented] (FLINK-5319) ClassCastException when reusing an inherited method reference as KeySelector for different classes

2016-12-14 Thread Alexander Chermenin (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-5319?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15748212#comment-15748212
 ] 

Alexander Chermenin commented on FLINK-5319:


Here is a link to the bug: https://bugs.openjdk.java.net/browse/JDK-8154236

> ClassCastException when reusing an inherited method reference as KeySelector 
> for different classes
> --
>
> Key: FLINK-5319
> URL: https://issues.apache.org/jira/browse/FLINK-5319
> Project: Flink
>  Issue Type: Bug
>  Components: Core
>Affects Versions: 1.2.0
>Reporter: Alexander Chermenin
>Assignee: Timo Walther
>
> Code sample:
> {code}static abstract class A {
> int id;
> A(int id) {this.id = id; }
> int getId() { return id; }
> }
> static class B extends A { B(int id) { super(id % 3); } }
> static class C extends A { C(int id) { super(id % 2); } }
> private static B b(int id) { return new B(id); }
> private static C c(int id) { return new C(id); }
> /**
>  * Main method.
>  */
> public static void main(String[] args) throws Exception {
> StreamExecutionEnvironment environment =
> StreamExecutionEnvironment.getExecutionEnvironment();
> B[] bs = IntStream.range(0, 10).mapToObj(Test::b).toArray(B[]::new);
> C[] cs = IntStream.range(0, 10).mapToObj(Test::c).toArray(C[]::new);
> DataStreamSource<B> bStream = environment.fromElements(bs);
> DataStreamSource<C> cStream = environment.fromElements(cs);
> bStream.keyBy((KeySelector<B, Integer>) A::getId).print();
> cStream.keyBy((KeySelector<C, Integer>) A::getId).print();
> environment.execute();
> }
> {code}
> This code throws the following exception:
> {code}Exception in thread "main" 
> org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
>   at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply$mcV$sp(JobManager.scala:901)
>   at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply(JobManager.scala:844)
>   at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply(JobManager.scala:844)
>   at 
> scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
>   at 
> scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
>   at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
>   at 
> akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:401)
>   at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>   at 
> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.pollAndExecAll(ForkJoinPool.java:1253)
>   at 
> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1346)
>   at 
> scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>   at 
> scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
> Caused by: java.lang.RuntimeException: Could not extract key from 
> org.sample.flink.examples.Test$C@5e1a8111
>   at 
> org.apache.flink.streaming.runtime.io.RecordWriterOutput.collect(RecordWriterOutput.java:75)
>   at 
> org.apache.flink.streaming.runtime.io.RecordWriterOutput.collect(RecordWriterOutput.java:39)
>   at 
> org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:746)
>   at 
> org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:724)
>   at 
> org.apache.flink.streaming.api.operators.StreamSourceContexts$NonTimestampContext.collect(StreamSourceContexts.java:84)
>   at 
> org.apache.flink.streaming.api.functions.source.FromElementsFunction.run(FromElementsFunction.java:127)
>   at 
> org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:75)
>   at 
> org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:56)
>   at 
> org.apache.flink.streaming.runtime.tasks.SourceStreamTask.run(SourceStreamTask.java:56)
>   at 
> org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:269)
>   at org.apache.flink.runtime.taskmanager.Task.run(Task.java:655)
>   at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.RuntimeException: Could not extract key from 
> org.sample.flink.examples.Test$C@5e1a8111
>   at 
> org.apache.flink.streaming.runtime.partitioner.KeyGroupStreamPartitioner.selectChannels(KeyGroupStreamPartitioner.java:61)
>   at 
> org.apache.flink.streaming.runtime.partitioner.KeyGroupStreamPartitioner.selectChannels(KeyGroupStreamPartitioner.java:32)
>   at 
> 

[jira] [Commented] (FLINK-5319) ClassCastException when reusing an inherited method reference as KeySelector for different classes

2016-12-14 Thread Alexander Chermenin (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-5319?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15747820#comment-15747820
 ] 

Alexander Chermenin commented on FLINK-5319:


It seems to be a Java bug. I used the following piece of code and got the same 
result:
{code}package org.sample.flink.examples.mappers;

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

class Task<IN, OUT> {

    interface Mapper<IN, OUT> extends Serializable {
        OUT map(IN value);
    }

    private byte[] bytes;
    private IN input;

    private Task(Mapper<IN, OUT> mapper, IN input) {
        this.bytes = serializeObject(mapper);
        this.input = input;
    }

    public static void main(String[] args) {
        Task<Long, Double> longTask = new Task<>(Number::doubleValue, 1L);
        Task<Integer, Double> intTask = new Task<>(Number::doubleValue, 1);
        System.out.println(longTask.exec());
        System.out.println(intTask.exec());
    }

    private static Object deserializeObject(byte[] bytes) {
        try (ObjectInputStream oois = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return oois.readObject();
        } catch (IOException | ClassNotFoundException e) {
            e.printStackTrace();
            return null;
        }
    }

    private static byte[] serializeObject(Object o) {
        try (ByteArrayOutputStream baos = new ByteArrayOutputStream();
             ObjectOutputStream oos = new ObjectOutputStream(baos)) {
            oos.writeObject(o);
            oos.flush();
            return baos.toByteArray();
        } catch (IOException e) {
            e.printStackTrace();
            return new byte[0];
        }
    }

    @SuppressWarnings("unchecked")
    private OUT exec() {
        Mapper<IN, OUT> mapper = (Mapper<IN, OUT>) deserializeObject(bytes);
        return (OUT) mapper.map(input);
    }
}{code}

Exception:
{code}Exception in thread "main" java.lang.ClassCastException: 
java.lang.Integer cannot be cast to java.lang.Long
at org.sample.flink.examples.mappers.Task.exec(Task.java:55)
at org.sample.flink.examples.mappers.Task.main(Task.java:28){code}

> ClassCastException when reusing an inherited method reference as KeySelector 
> for different classes
> --
>
> Key: FLINK-5319
> URL: https://issues.apache.org/jira/browse/FLINK-5319
> Project: Flink
>  Issue Type: Bug
>  Components: Core
>Affects Versions: 1.2.0
>Reporter: Alexander Chermenin
>Assignee: Timo Walther
>
> Code sample:
> {code}static abstract class A {
> int id;
> A(int id) {this.id = id; }
> int getId() { return id; }
> }
> static class B extends A { B(int id) { super(id % 3); } }
> static class C extends A { C(int id) { super(id % 2); } }
> private static B b(int id) { return new B(id); }
> private static C c(int id) { return new C(id); }
> /**
>  * Main method.
>  */
> public static void main(String[] args) throws Exception {
> StreamExecutionEnvironment environment =
> StreamExecutionEnvironment.getExecutionEnvironment();
> B[] bs = IntStream.range(0, 10).mapToObj(Test::b).toArray(B[]::new);
> C[] cs = IntStream.range(0, 10).mapToObj(Test::c).toArray(C[]::new);
> DataStreamSource<B> bStream = environment.fromElements(bs);
> DataStreamSource<C> cStream = environment.fromElements(cs);
> bStream.keyBy((KeySelector<B, Integer>) A::getId).print();
> cStream.keyBy((KeySelector<C, Integer>) A::getId).print();
> environment.execute();
> }
> {code}
> This code throws the following exception:
> {code}Exception in thread "main" 
> org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
>   at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply$mcV$sp(JobManager.scala:901)
>   at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply(JobManager.scala:844)
>   at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply(JobManager.scala:844)
>   at 
> scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
>   at 
> scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
>   at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
>   at 
> akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:401)
>   at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>   at 
> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.pollAndExecAll(ForkJoinPool.java:1253)
>   at 

[jira] [Commented] (FLINK-5319) ClassCastException when reusing an inherited method reference as KeySelector for different classes

2016-12-12 Thread Alexander Chermenin (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-5319?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15742901#comment-15742901
 ] 

Alexander Chermenin commented on FLINK-5319:


A simpler test with the same result:
{code}ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
DataSet<Long> longDataSet = env.fromCollection(Arrays.asList(1L, 2L, 3L, 4L, 5L));
DataSet<Integer> intDataSet = env.fromCollection(Arrays.asList(1, 2, 3, 4, 5));
longDataSet.map(Number::doubleValue).print();
intDataSet.map(Number::doubleValue).print();{code}


> ClassCastException when reusing an inherited method reference as KeySelector 
> for different classes
> --
>
> Key: FLINK-5319
> URL: https://issues.apache.org/jira/browse/FLINK-5319
> Project: Flink
>  Issue Type: Bug
>  Components: Core
>Affects Versions: 1.2.0
>Reporter: Alexander Chermenin
>
> Code sample:
> {code}static abstract class A {
> int id;
> A(int id) {this.id = id; }
> int getId() { return id; }
> }
> static class B extends A { B(int id) { super(id % 3); } }
> static class C extends A { C(int id) { super(id % 2); } }
> private static B b(int id) { return new B(id); }
> private static C c(int id) { return new C(id); }
> /**
>  * Main method.
>  */
> public static void main(String[] args) throws Exception {
> StreamExecutionEnvironment environment =
> StreamExecutionEnvironment.getExecutionEnvironment();
> B[] bs = IntStream.range(0, 10).mapToObj(Test::b).toArray(B[]::new);
> C[] cs = IntStream.range(0, 10).mapToObj(Test::c).toArray(C[]::new);
> DataStreamSource<B> bStream = environment.fromElements(bs);
> DataStreamSource<C> cStream = environment.fromElements(cs);
> bStream.keyBy((KeySelector<B, Integer>) A::getId).print();
> cStream.keyBy((KeySelector<C, Integer>) A::getId).print();
> environment.execute();
> }
> {code}
> This code throws the following exception:
> {code}Exception in thread "main" 
> org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
>   at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply$mcV$sp(JobManager.scala:901)
>   at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply(JobManager.scala:844)
>   at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply(JobManager.scala:844)
>   at 
> scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
>   at 
> scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
>   at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
>   at 
> akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:401)
>   at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>   at 
> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.pollAndExecAll(ForkJoinPool.java:1253)
>   at 
> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1346)
>   at 
> scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>   at 
> scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
> Caused by: java.lang.RuntimeException: Could not extract key from 
> org.sample.flink.examples.Test$C@5e1a8111
>   at 
> org.apache.flink.streaming.runtime.io.RecordWriterOutput.collect(RecordWriterOutput.java:75)
>   at 
> org.apache.flink.streaming.runtime.io.RecordWriterOutput.collect(RecordWriterOutput.java:39)
>   at 
> org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:746)
>   at 
> org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:724)
>   at 
> org.apache.flink.streaming.api.operators.StreamSourceContexts$NonTimestampContext.collect(StreamSourceContexts.java:84)
>   at 
> org.apache.flink.streaming.api.functions.source.FromElementsFunction.run(FromElementsFunction.java:127)
>   at 
> org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:75)
>   at 
> org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:56)
>   at 
> org.apache.flink.streaming.runtime.tasks.SourceStreamTask.run(SourceStreamTask.java:56)
>   at 
> org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:269)
>   at org.apache.flink.runtime.taskmanager.Task.run(Task.java:655)
>   at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.RuntimeException: Could not extract key from 
> org.sample.flink.examples.Test$C@5e1a8111
>   at 
> 

[jira] [Commented] (FLINK-5319) ClassCastException when reusing an inherited method reference as KeySelector for different classes

2016-12-12 Thread Alexander Chermenin (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-5319?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15742827#comment-15742827
 ] 

Alexander Chermenin commented on FLINK-5319:


There are no problems if the code is used with only B or only C.
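
For what it's worth, a sketch of one possible workaround along those lines: give each stream its own {{KeySelector}} instance (here as anonymous classes) instead of casting the same shared method reference twice. The fragment below is meant as a drop-in replacement for the two keyBy lines in the code sample quoted below and reuses the {{B}} and {{C}} classes from it; it is an assumed workaround, not a fix for the underlying issue:

{code:java}
bStream.keyBy(new KeySelector<B, Integer>() {
    @Override
    public Integer getKey(B value) {
        return value.getId();
    }
}).print();

cStream.keyBy(new KeySelector<C, Integer>() {
    @Override
    public Integer getKey(C value) {
        return value.getId();
    }
}).print();
{code}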

> ClassCastException when reusing an inherited method reference as KeySelector 
> for different classes
> --
>
> Key: FLINK-5319
> URL: https://issues.apache.org/jira/browse/FLINK-5319
> Project: Flink
>  Issue Type: Bug
>  Components: Core
>Affects Versions: 1.2.0
>Reporter: Alexander Chermenin
>
> Code sample:
> {code}static abstract class A {
> int id;
> A(int id) {this.id = id; }
> int getId() { return id; }
> }
> static class B extends A { B(int id) { super(id % 3); } }
> static class C extends A { C(int id) { super(id % 2); } }
> private static B b(int id) { return new B(id); }
> private static C c(int id) { return new C(id); }
> /**
>  * Main method.
>  */
> public static void main(String[] args) throws Exception {
> StreamExecutionEnvironment environment =
> StreamExecutionEnvironment.getExecutionEnvironment();
> B[] bs = IntStream.range(0, 10).mapToObj(Test::b).toArray(B[]::new);
> C[] cs = IntStream.range(0, 10).mapToObj(Test::c).toArray(C[]::new);
> DataStreamSource<B> bStream = environment.fromElements(bs);
> DataStreamSource<C> cStream = environment.fromElements(cs);
> bStream.keyBy((KeySelector<B, Integer>) A::getId).print();
> cStream.keyBy((KeySelector<C, Integer>) A::getId).print();
> environment.execute();
> }
> {code}
> This code throws the following exception:
> {code}Exception in thread "main" 
> org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
>   at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply$mcV$sp(JobManager.scala:901)
>   at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply(JobManager.scala:844)
>   at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply(JobManager.scala:844)
>   at 
> scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
>   at 
> scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
>   at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
>   at 
> akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:401)
>   at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>   at 
> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.pollAndExecAll(ForkJoinPool.java:1253)
>   at 
> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1346)
>   at 
> scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>   at 
> scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
> Caused by: java.lang.RuntimeException: Could not extract key from 
> org.sample.flink.examples.Test$C@5e1a8111
>   at 
> org.apache.flink.streaming.runtime.io.RecordWriterOutput.collect(RecordWriterOutput.java:75)
>   at 
> org.apache.flink.streaming.runtime.io.RecordWriterOutput.collect(RecordWriterOutput.java:39)
>   at 
> org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:746)
>   at 
> org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:724)
>   at 
> org.apache.flink.streaming.api.operators.StreamSourceContexts$NonTimestampContext.collect(StreamSourceContexts.java:84)
>   at 
> org.apache.flink.streaming.api.functions.source.FromElementsFunction.run(FromElementsFunction.java:127)
>   at 
> org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:75)
>   at 
> org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:56)
>   at 
> org.apache.flink.streaming.runtime.tasks.SourceStreamTask.run(SourceStreamTask.java:56)
>   at 
> org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:269)
>   at org.apache.flink.runtime.taskmanager.Task.run(Task.java:655)
>   at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.RuntimeException: Could not extract key from 
> org.sample.flink.examples.Test$C@5e1a8111
>   at 
> org.apache.flink.streaming.runtime.partitioner.KeyGroupStreamPartitioner.selectChannels(KeyGroupStreamPartitioner.java:61)
>   at 
> org.apache.flink.streaming.runtime.partitioner.KeyGroupStreamPartitioner.selectChannels(KeyGroupStreamPartitioner.java:32)
>   at 
> 

[jira] [Updated] (FLINK-5319) ClassCastException when reusing an inherited method as KeySelector for different classes

2016-12-12 Thread Alexander Chermenin (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-5319?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Alexander Chermenin updated FLINK-5319:
---
Description: 
Code sample:
{code}static abstract class A {
int id;
A(int id) {this.id = id; }
int getId() { return id; }
}

static class B extends A { B(int id) { super(id % 3); } }
static class C extends A { C(int id) { super(id % 2); } }

private static B b(int id) { return new B(id); }
private static C c(int id) { return new C(id); }

/**
 * Main method.
 */
public static void main(String[] args) throws Exception {
StreamExecutionEnvironment environment =
StreamExecutionEnvironment.getExecutionEnvironment();

B[] bs = IntStream.range(0, 10).mapToObj(Test::b).toArray(B[]::new);
C[] cs = IntStream.range(0, 10).mapToObj(Test::c).toArray(C[]::new);

DataStreamSource<B> bStream = environment.fromElements(bs);
DataStreamSource<C> cStream = environment.fromElements(cs);

bStream.keyBy((KeySelector<B, Integer>) A::getId).print();
cStream.keyBy((KeySelector<C, Integer>) A::getId).print();

environment.execute();
}
{code}

This code throws the following exception:
{code}Exception in thread "main" 
org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at 
org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply$mcV$sp(JobManager.scala:901)
at 
org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply(JobManager.scala:844)
at 
org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply(JobManager.scala:844)
at 
scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
at 
scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
at 
akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:401)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at 
scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.pollAndExecAll(ForkJoinPool.java:1253)
at 
scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1346)
at 
scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at 
scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.RuntimeException: Could not extract key from 
org.sample.flink.examples.Test$C@5e1a8111
at 
org.apache.flink.streaming.runtime.io.RecordWriterOutput.collect(RecordWriterOutput.java:75)
at 
org.apache.flink.streaming.runtime.io.RecordWriterOutput.collect(RecordWriterOutput.java:39)
at 
org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:746)
at 
org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:724)
at 
org.apache.flink.streaming.api.operators.StreamSourceContexts$NonTimestampContext.collect(StreamSourceContexts.java:84)
at 
org.apache.flink.streaming.api.functions.source.FromElementsFunction.run(FromElementsFunction.java:127)
at 
org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:75)
at 
org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:56)
at 
org.apache.flink.streaming.runtime.tasks.SourceStreamTask.run(SourceStreamTask.java:56)
at 
org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:269)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:655)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Could not extract key from 
org.sample.flink.examples.Test$C@5e1a8111
at 
org.apache.flink.streaming.runtime.partitioner.KeyGroupStreamPartitioner.selectChannels(KeyGroupStreamPartitioner.java:61)
at 
org.apache.flink.streaming.runtime.partitioner.KeyGroupStreamPartitioner.selectChannels(KeyGroupStreamPartitioner.java:32)
at 
org.apache.flink.runtime.io.network.api.writer.RecordWriter.emit(RecordWriter.java:83)
at 
org.apache.flink.streaming.runtime.io.StreamRecordWriter.emit(StreamRecordWriter.java:86)
at 
org.apache.flink.streaming.runtime.io.RecordWriterOutput.collect(RecordWriterOutput.java:72)
... 11 more
Caused by: java.lang.ClassCastException: org.sample.flink.examples.Test$C 
cannot be cast to org.sample.flink.examples.Test$B
at 
org.apache.flink.streaming.runtime.partitioner.KeyGroupStreamPartitioner.selectChannels(KeyGroupStreamPartitioner.java:59)
... 15 more{code}

This problem occurs when we use a method reference as the KeySelector. And there are 
no problems when we use anonymous 

[jira] [Updated] (FLINK-5319) ClassCastException when reusing an inherited method reference as KeySelector for different classes

2016-12-12 Thread Alexander Chermenin (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-5319?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Alexander Chermenin updated FLINK-5319:
---
Summary: ClassCastException when reusing an inherited method reference as 
KeySelector for different classes  (was: ClassCastException when reusing an 
inherited method as KeySelector for different classes)

> ClassCastException when reusing an inherited method reference as KeySelector 
> for different classes
> --
>
> Key: FLINK-5319
> URL: https://issues.apache.org/jira/browse/FLINK-5319
> Project: Flink
>  Issue Type: Bug
>  Components: Core
>Affects Versions: 1.2.0
>Reporter: Alexander Chermenin
>
> Code sample:
> {code}static abstract class A {
> int id;
> A(int id) {this.id = id; }
> int getId() { return id; }
> }
> static class B extends A { B(int id) { super(id % 3); } }
> static class C extends A { C(int id) { super(id % 2); } }
> private static B b(int id) { return new B(id); }
> private static C c(int id) { return new C(id); }
> /**
>  * Main method.
>  */
> public static void main(String[] args) throws Exception {
> StreamExecutionEnvironment environment =
> StreamExecutionEnvironment.getExecutionEnvironment();
> B[] bs = IntStream.range(0, 10).mapToObj(Test::b).toArray(B[]::new);
> C[] cs = IntStream.range(0, 10).mapToObj(Test::c).toArray(C[]::new);
> DataStreamSource<B> bStream = environment.fromElements(bs);
> DataStreamSource<C> cStream = environment.fromElements(cs);
> bStream.keyBy((KeySelector<B, Integer>) A::getId).print();
> cStream.keyBy((KeySelector<C, Integer>) A::getId).print();
> environment.execute();
> }
> {code}
> This code throws the following exception:
> {code}Exception in thread "main" 
> org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
>   at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply$mcV$sp(JobManager.scala:901)
>   at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply(JobManager.scala:844)
>   at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply(JobManager.scala:844)
>   at 
> scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
>   at 
> scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
>   at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
>   at 
> akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:401)
>   at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>   at 
> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.pollAndExecAll(ForkJoinPool.java:1253)
>   at 
> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1346)
>   at 
> scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>   at 
> scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
> Caused by: java.lang.RuntimeException: Could not extract key from 
> org.sample.flink.examples.Test$C@5e1a8111
>   at 
> org.apache.flink.streaming.runtime.io.RecordWriterOutput.collect(RecordWriterOutput.java:75)
>   at 
> org.apache.flink.streaming.runtime.io.RecordWriterOutput.collect(RecordWriterOutput.java:39)
>   at 
> org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:746)
>   at 
> org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:724)
>   at 
> org.apache.flink.streaming.api.operators.StreamSourceContexts$NonTimestampContext.collect(StreamSourceContexts.java:84)
>   at 
> org.apache.flink.streaming.api.functions.source.FromElementsFunction.run(FromElementsFunction.java:127)
>   at 
> org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:75)
>   at 
> org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:56)
>   at 
> org.apache.flink.streaming.runtime.tasks.SourceStreamTask.run(SourceStreamTask.java:56)
>   at 
> org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:269)
>   at org.apache.flink.runtime.taskmanager.Task.run(Task.java:655)
>   at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.RuntimeException: Could not extract key from 
> org.sample.flink.examples.Test$C@5e1a8111
>   at 
> org.apache.flink.streaming.runtime.partitioner.KeyGroupStreamPartitioner.selectChannels(KeyGroupStreamPartitioner.java:61)
>   at 
> 

[jira] [Created] (FLINK-5319) ClassCastException when reusing an inherited method as KeySelector for different classes

2016-12-12 Thread Alexander Chermenin (JIRA)
Alexander Chermenin created FLINK-5319:
--

 Summary: ClassCastException when reusing an inherited method as 
KeySelector for different classes
 Key: FLINK-5319
 URL: https://issues.apache.org/jira/browse/FLINK-5319
 Project: Flink
  Issue Type: Bug
  Components: Core
Affects Versions: 1.2.0
Reporter: Alexander Chermenin


Code sample:
{code}static abstract class A {
int id;
A(int id) {this.id = id; }
int getId() { return id; }
}

static class B extends A { B(int id) { super(id % 3); } }
static class C extends A { C(int id) { super(id % 2); } }

private static B b(int id) { return new B(id); }
private static C c(int id) { return new C(id); }

/**
 * Main method.
 */
public static void main(String[] args) throws Exception {
StreamExecutionEnvironment environment =
StreamExecutionEnvironment.getExecutionEnvironment();

B[] bs = IntStream.range(0, 10).mapToObj(Test::b).toArray(B[]::new);
C[] cs = IntStream.range(0, 10).mapToObj(Test::c).toArray(C[]::new);

DataStreamSource<B> bStream = environment.fromElements(bs);
DataStreamSource<C> cStream = environment.fromElements(cs);

bStream.keyBy((KeySelector<B, Integer>) A::getId).print();
cStream.keyBy((KeySelector<C, Integer>) A::getId).print();

environment.execute();
}
{code}

This code throws the following exception:
{code}Exception in thread "main" 
org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at 
org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply$mcV$sp(JobManager.scala:901)
at 
org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply(JobManager.scala:844)
at 
org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply(JobManager.scala:844)
at 
scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
at 
scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
at 
akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:401)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at 
scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.pollAndExecAll(ForkJoinPool.java:1253)
at 
scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1346)
at 
scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at 
scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.RuntimeException: Could not extract key from 
org.sample.flink.examples.Test$C@5e1a8111
at 
org.apache.flink.streaming.runtime.io.RecordWriterOutput.collect(RecordWriterOutput.java:75)
at 
org.apache.flink.streaming.runtime.io.RecordWriterOutput.collect(RecordWriterOutput.java:39)
at 
org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:746)
at 
org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:724)
at 
org.apache.flink.streaming.api.operators.StreamSourceContexts$NonTimestampContext.collect(StreamSourceContexts.java:84)
at 
org.apache.flink.streaming.api.functions.source.FromElementsFunction.run(FromElementsFunction.java:127)
at 
org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:75)
at 
org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:56)
at 
org.apache.flink.streaming.runtime.tasks.SourceStreamTask.run(SourceStreamTask.java:56)
at 
org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:269)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:655)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Could not extract key from 
org.sample.flink.examples.Test$C@5e1a8111
at 
org.apache.flink.streaming.runtime.partitioner.KeyGroupStreamPartitioner.selectChannels(KeyGroupStreamPartitioner.java:61)
at 
org.apache.flink.streaming.runtime.partitioner.KeyGroupStreamPartitioner.selectChannels(KeyGroupStreamPartitioner.java:32)
at 
org.apache.flink.runtime.io.network.api.writer.RecordWriter.emit(RecordWriter.java:83)
at 
org.apache.flink.streaming.runtime.io.StreamRecordWriter.emit(StreamRecordWriter.java:86)
at 
org.apache.flink.streaming.runtime.io.RecordWriterOutput.collect(RecordWriterOutput.java:72)
... 11 more
Caused by: java.lang.ClassCastException: org.sample.flink.examples.Test$C 
cannot be cast to org.sample.flink.examples.Test$B
at 

[jira] [Created] (FLINK-5303) Add CUBE/ROLLUP/GROUPING SETS operator in SQL

2016-12-08 Thread Alexander Chermenin (JIRA)
Alexander Chermenin created FLINK-5303:
--

 Summary: Add CUBE/ROLLUP/GROUPING SETS operator in SQL
 Key: FLINK-5303
 URL: https://issues.apache.org/jira/browse/FLINK-5303
 Project: Flink
  Issue Type: New Feature
  Components: Documentation, Table API & SQL
Reporter: Alexander Chermenin
Assignee: Alexander Chermenin


Add support for such operators as CUBE, ROLLUP and GROUPING SETS in SQL.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Assigned] (FLINK-2980) Add CUBE/ROLLUP/GROUPING SETS operator in Table API.

2016-12-06 Thread Alexander Chermenin (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-2980?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Alexander Chermenin reassigned FLINK-2980:
--

Assignee: Alexander Chermenin

> Add CUBE/ROLLUP/GROUPING SETS operator in Table API.
> 
>
> Key: FLINK-2980
> URL: https://issues.apache.org/jira/browse/FLINK-2980
> Project: Flink
>  Issue Type: New Feature
>  Components: Documentation, Table API & SQL
>Reporter: Chengxiang Li
>Assignee: Alexander Chermenin
> Attachments: Cube-Rollup-GroupSet design doc in Flink.pdf
>
>
> Computing aggregates over a cube, rollup, or grouping sets of several dimensions 
> is a common operation in data warehousing. It would be nice to have these 
> operators in the Table API.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (FLINK-4303) Add CEP examples

2016-12-02 Thread Alexander Chermenin (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-4303?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15714936#comment-15714936
 ] 

Alexander Chermenin commented on FLINK-4303:


Hi all. What about using this example by [~till.rohrmann]: 
https://github.com/tillrohrmann/cep-monitoring ? It could be adapted a bit, and I 
can also rewrite it in Scala as an example.

> Add CEP examples
> 
>
> Key: FLINK-4303
> URL: https://issues.apache.org/jira/browse/FLINK-4303
> Project: Flink
>  Issue Type: Improvement
>  Components: CEP
>Affects Versions: 1.1.0
>Reporter: Timo Walther
>
> Neither CEP Java nor CEP Scala contain a runnable example. The example on the 
> website is also not runnable without adding some additional code.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (FLINK-4631) NullPointerException during stream task cleanup

2016-11-11 Thread Alexander Chermenin (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-4631?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15656726#comment-15656726
 ] 

Alexander Chermenin commented on FLINK-4631:


I think this ticket should be reopened, because only one of many cases was 
solved.
If possible, I can check the other elements of the streaming job in the same PR.

> NullPointerException during stream task cleanup
> ---
>
> Key: FLINK-4631
> URL: https://issues.apache.org/jira/browse/FLINK-4631
> Project: Flink
>  Issue Type: Bug
>Affects Versions: 1.1.2
> Environment: Ubuntu server 12.04.5 64 bit
> java version "1.8.0_40"
> Java(TM) SE Runtime Environment (build 1.8.0_40-b26)
> Java HotSpot(TM) 64-Bit Server VM (build 25.40-b25, mixed mode)
>Reporter: Avihai Berkovitz
> Fix For: 1.2.0
>
>
> If a streaming job fails during startup (in my case, due to a lack of network 
> buffers), all the tasks are cancelled before they have started. This causes 
> many instances of the following exception:
> {noformat}
> 2016-09-18 14:17:12,177 ERROR 
> org.apache.flink.streaming.runtime.tasks.StreamTask   - Error during 
> cleanup of stream task
> java.lang.NullPointerException
>   at 
> org.apache.flink.streaming.runtime.tasks.OneInputStreamTask.cleanup(OneInputStreamTask.java:73)
>   at 
> org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:323)
>   at org.apache.flink.runtime.taskmanager.Task.run(Task.java:584)
>   at java.lang.Thread.run(Thread.java:745)
> {noformat}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (FLINK-2608) Arrays.asList(..) does not work with CollectionInputFormat

2016-08-25 Thread Alexander Chermenin (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-2608?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15437141#comment-15437141
 ] 

Alexander Chermenin commented on FLINK-2608:


This is a bug in ArraysAsListSerializer from Twitter Chill 
(https://github.com/twitter/chill/issues/255).
It will be fixed once the next Chill release is built and the dependency 
version is updated accordingly.
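
In the meantime, a sketch of the user-side workaround already hinted at in the issue description: copy the fixed-size list returned by {{Arrays.asList(..)}} into a regular {{ArrayList}} before handing it to Flink (class and variable names below are illustrative only):

{code:java}
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class ArraysAsListWorkaround {
    public static void main(String[] args) {
        // Arrays.asList(..) returns the fixed-size java.util.Arrays$ArrayList,
        // which is the type the Chill ArraysAsListSerializer mishandles.
        // Copying it into a plain java.util.ArrayList avoids that serializer.
        List<Integer> elements = new ArrayList<>(Arrays.asList(0, 0, 0));
        System.out.println(elements);
    }
}
{code}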

> Arrays.asList(..) does not work with CollectionInputFormat
> --
>
> Key: FLINK-2608
> URL: https://issues.apache.org/jira/browse/FLINK-2608
> Project: Flink
>  Issue Type: Bug
>  Components: Type Serialization System
>Affects Versions: 0.9, 0.10.0
>Reporter: Maximilian Michels
>Priority: Minor
> Fix For: 1.0.0
>
>
> When using Arrays.asList(..) as input for a CollectionInputFormat, the 
> serialization/deserialization fails when deploying the task.
> See the following program:
> {code:java}
> public class WordCountExample {
>     public static void main(String[] args) throws Exception {
>         final ExecutionEnvironment env =
>                 ExecutionEnvironment.getExecutionEnvironment();
>         DataSet<String> text = env.fromElements(
>                 "Who's there?",
>                 "I think I hear them. Stand, ho! Who's there?");
>         // DOES NOT WORK
>         List<Integer> elements = Arrays.asList(0, 0, 0);
>         // The following works:
>         //List<Integer> elements = new ArrayList<>(new int[] {0,0,0});
>         DataSet<TestClass> set = env.fromElements(new TestClass(elements));
>         DataSet<Tuple2<String, Integer>> wordCounts = text
>                 .flatMap(new LineSplitter())
>                 .withBroadcastSet(set, "set")
>                 .groupBy(0)
>                 .sum(1);
>         wordCounts.print();
>     }
>     public static class LineSplitter implements FlatMapFunction<String, Tuple2<String, Integer>> {
>         @Override
>         public void flatMap(String line, Collector<Tuple2<String, Integer>> out) {
>             for (String word : line.split(" ")) {
>                 out.collect(new Tuple2<String, Integer>(word, 1));
>             }
>         }
>     }
>     public static class TestClass implements Serializable {
>         private static final long serialVersionUID = -2932037991574118651L;
>         List<Integer> integerList;
>         public TestClass(List<Integer> integerList){
>             this.integerList = integerList;
>         }
>     }
> }
> {code}
> {noformat}
> Exception in thread "main" 
> org.apache.flink.runtime.client.JobExecutionException: Cannot initialize task 
> 'DataSource (at main(Test.java:32) 
> (org.apache.flink.api.java.io.CollectionInputFormat))': Deserializing the 
> InputFormat ([mytests.Test$TestClass@4d6025c5]) failed: unread block data
> at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$org$apache$flink$runtime$jobmanager$JobManager$$submitJob$4.apply(JobManager.scala:523)
> at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$org$apache$flink$runtime$jobmanager$JobManager$$submitJob$4.apply(JobManager.scala:507)
> at scala.collection.Iterator$class.foreach(Iterator.scala:727)
> at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
> at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
> at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
> at 
> org.apache.flink.runtime.jobmanager.JobManager.org$apache$flink$runtime$jobmanager$JobManager$$submitJob(JobManager.scala:507)
> at 
> org.apache.flink.runtime.jobmanager.JobManager$$anonfun$receiveWithLogMessages$1.applyOrElse(JobManager.scala:190)
> at 
> scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
> at 
> scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
> at 
> scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
> at 
> org.apache.flink.runtime.ActorLogMessages$$anon$1.apply(ActorLogMessages.scala:43)
> at 
> org.apache.flink.runtime.ActorLogMessages$$anon$1.apply(ActorLogMessages.scala:29)
> at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
> at 
> org.apache.flink.runtime.ActorLogMessages$$anon$1.applyOrElse(ActorLogMessages.scala:29)
> at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
> at 
> org.apache.flink.runtime.jobmanager.JobManager.aroundReceive(JobManager.scala:92)
> at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
> at akka.actor.ActorCell.invoke(ActorCell.scala:487)
> at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:254)
> at akka.dispatch.Mailbox.run(Mailbox.scala:221)
> at akka.dispatch.Mailbox.exec(Mailbox.scala:231)
> at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
> at 
>