[jira] [Commented] (FLINK-8256) Cannot use Scala functions to filter in 1.4 - java.lang.ClassCastException

2017-12-14 Thread Ryan Brideau (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-8256?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16290978#comment-16290978
 ] 

Ryan Brideau commented on FLINK-8256:
-------------------------------------

That works for me! Thanks again.


[jira] [Comment Edited] (FLINK-8256) Cannot use Scala functions to filter in 1.4 - java.lang.ClassCastException

2017-12-14 Thread Ryan Brideau (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-8256?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16290846#comment-16290846
 ] 

Ryan Brideau edited comment on FLINK-8256 at 12/14/17 1:45 PM:
-------------------------------------

Thanks for looking into this so quickly. I managed to track down the root of 
the issue on my end. I had built my project previously using the snapshot 
archetype, and not the newest 1.4.0 one:

{code:java}
mvn archetype:generate   \
  -DarchetypeGroupId=org.apache.flink  \
  -DarchetypeArtifactId=flink-quickstart-scala \
  -DarchetypeVersion=1.4-SNAPSHOT
{code}

To fix the problem I just built a new empty project using the 1.4.0 archetype 
version and did a diff of the pom.xml of the two, updating my existing one to 
match the new one, and now everything works perfectly. I suspect that anybody 
who made a project recently might find themselves in the same situation.
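
For reference, the equivalent command against the released archetype is the same invocation with only -DarchetypeVersion changed:

{code:java}
mvn archetype:generate   \
  -DarchetypeGroupId=org.apache.flink  \
  -DarchetypeArtifactId=flink-quickstart-scala \
  -DarchetypeVersion=1.4.0
{code}

In a pom.xml generated this way, the main delta from a snapshot-generated one is typically the Flink version property (e.g. <flink.version>1.4.0</flink.version> rather than 1.4-SNAPSHOT), which is presumably what the diff described above surfaced.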



[jira] [Commented] (FLINK-8256) Cannot use Scala functions to filter in 1.4 - java.lang.ClassCastException

2017-12-14 Thread Ryan Brideau (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-8256?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16290846#comment-16290846
 ] 

Ryan Brideau commented on FLINK-8256:
-------------------------------------

Thanks for looking into this so quickly. I managed to track down the root of 
the issue on my end. I had built my project previously using the snapshot 
archetype, and not the newest 1.4.0 one:

{code:java}
mvn archetype:generate   \
  -DarchetypeGroupId=org.apache.flink  \
  -DarchetypeArtifactId=flink-quickstart-scala \
  -DarchetypeVersion=1.4-SNAPSHOT
{code}

To fix the problem I just built a new empty project using the 1.4.0 archetype 
version and did a diff of the pom.xml of the two, updating my existing one to 
match the new one, and now everything works perfectly. I suspect that anybody 
who made a project recently might find themselves in the same situation.


[jira] [Created] (FLINK-8256) Cannot use Scala functions to filter in 1.4 - java.lang.ClassCastException

2017-12-13 Thread Ryan Brideau (JIRA)
Ryan Brideau created FLINK-8256:
-------------------------------------

 Summary: Cannot use Scala functions to filter in 1.4 - 
java.lang.ClassCastException
 Key: FLINK-8256
 URL: https://issues.apache.org/jira/browse/FLINK-8256
 Project: Flink
  Issue Type: Bug
  Components: DataStream API
Affects Versions: 1.4.0
 Environment: macOS, Local Flink v1.4.0, Scala 2.11
Reporter: Ryan Brideau


I built the newest release locally today, but when I try to filter a stream 
using an anonymous or named function, I get an error. Here's a simple example:


{code:java}
import org.apache.flink.api.java.utils.ParameterTool
import org.apache.flink.streaming.api.scala._

object TestFunction {

  def main(args: Array[String]): Unit = {

val env = StreamExecutionEnvironment.getExecutionEnvironment
val params = ParameterTool.fromArgs(args)
env.getConfig.setGlobalJobParameters(params)

val someArray = Array(1,2,3)
val stream = env.fromCollection(someArray).filter(_ => true)
stream.print().setParallelism(1)
env.execute("Testing Function")
  }
}
{code}
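
As the description notes, a named function fails the same way; a minimal variant of the snippet above (the name keepAll is hypothetical, everything else is the same setup) would be:

{code:java}
// Named-function variant; per the report this fails with the same exception:
def keepAll(i: Int): Boolean = true
val stream2 = env.fromCollection(someArray).filter(keepAll _)
{code}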

This results in:


{code:java}
Job execution switched to status FAILING.
org.apache.flink.streaming.runtime.tasks.StreamTaskException: Cannot 
instantiate user function.
at 
org.apache.flink.streaming.api.graph.StreamConfig.getStreamOperator(StreamConfig.java:235)
at 
org.apache.flink.streaming.runtime.tasks.OperatorChain.createChainedOperator(OperatorChain.java:355)
at 
org.apache.flink.streaming.runtime.tasks.OperatorChain.createOutputCollector(OperatorChain.java:282)
at 
org.apache.flink.streaming.runtime.tasks.OperatorChain.<init>(OperatorChain.java:126)
at 
org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:231)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:718)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassCastException: cannot assign instance of 
org.peopleinmotion.TestFunction$$anonfun$1 to field 
org.apache.flink.streaming.api.scala.DataStream$$anon$7.cleanFun$6 of type 
scala.Function1 in instance of 
org.apache.flink.streaming.api.scala.DataStream$$anon$7
at 
java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2233)
at 
java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1405)
at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2288)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2206)
at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2064)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1568)
at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2282)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2206)
at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2064)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1568)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:428)
at 
org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:290)
at 
org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:248)
at 
org.apache.flink.streaming.api.graph.StreamConfig.getStreamOperator(StreamConfig.java:220)
... 6 more
12/13/2017 15:10:01 Job execution switched to status FAILED.


 The program finished with the following exception:

org.apache.flink.client.program.ProgramInvocationException: The program 
execution failed: Job execution failed.
at 
org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:492)
at 
org.apache.flink.client.program.StandaloneClusterClient.submitJob(StandaloneClusterClient.java:105)
at 
org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:456)
at 
org.apache.flink.streaming.api.environment.StreamContextEnvironment.execute(StreamContextEnvironment.java:66)
at 
org.apache.flink.streaming.api.scala.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.scala:638)
at org.peopleinmotion.TestFunction$.main(TestFunction.scala:20)
at org.peopleinmotion.TestFunction.main(TestFunction.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:525)
at 
org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:41

[jira] [Closed] (FLINK-8103) Flink 1.4 not writing to standard out log file

2017-12-13 Thread Ryan Brideau (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-8103?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ryan Brideau closed FLINK-8103.
-------------------------------------
Resolution: Fixed



[jira] [Commented] (FLINK-8103) Flink 1.4 not writing to standard out log file

2017-12-13 Thread Ryan Brideau (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-8103?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16289607#comment-16289607
 ] 

Ryan Brideau commented on FLINK-8103:
-------------------------------------

Apologies for the delay. Just pulled the latest updates, re-built, and tried 
again and it works as expected now. Thank you!



[jira] [Updated] (FLINK-8103) Flink 1.4 not writing to standard out log file

2017-11-18 Thread Ryan Brideau (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-8103?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ryan Brideau updated FLINK-8103:

Description: 
I built the latest snapshot of 1.4 yesterday and tried testing it with a simple 
word count example, where StreamUtil is just a helper that checks input 
parameters:

{code:java}
import org.apache.flink.api.java.utils.ParameterTool
import org.apache.flink.streaming.api.scala._

object Words {
  def main(args: Array[String]) {
// set up the execution environment
val env = StreamExecutionEnvironment.getExecutionEnvironment

val params = ParameterTool.fromArgs(args)
env.getConfig.setGlobalJobParameters(params)
val dataStream = StreamUtil.getDataStream(env, params)

val wordDataStream = dataStream
  .flatMap{ _.split(" ") }

wordDataStream.print()

// execute program
env.execute("Words Scala")
  }
}
{code}

This runs without an issue on the latest stable version of 1.3 and writes its 
results to the _out_ file, which I can tail to see the results. This doesn't 
happen in 1.4, however. I can modify it to write out to a file instead:


{code:java}
import org.apache.flink.api.java.utils.ParameterTool
import org.apache.flink.core.fs.FileSystem.WriteMode
import org.apache.flink.streaming.api.scala._

object Words {
  def main(args: Array[String]) {
// set up the execution environment
val env = StreamExecutionEnvironment.getExecutionEnvironment

val params = ParameterTool.fromArgs(args)
env.getConfig.setGlobalJobParameters(params)
val dataStream = StreamUtil.getDataStream(env, params)

val wordDataStream = dataStream
  .flatMap{ _.split(" ") }

wordDataStream
  .writeAsText("file:///somepath/output", WriteMode.OVERWRITE)
  .setParallelism(1)

// execute program
env.execute("Words Scala")
  }
}
{code}

Any clues as to what might be causing this?


[jira] [Updated] (FLINK-8103) Flink 1.4 not writing to standard out log file

2017-11-18 Thread Ryan Brideau (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-8103?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ryan Brideau updated FLINK-8103:

Description: 
I built the latest snapshot of 1.4 yesterday and tried testing it with a simple 
word count example, where StreamUtil is just a helper that checks input 
parameters:

{code:java}
import org.apache.flink.api.java.utils.ParameterTool
import org.apache.flink.streaming.api.scala._

object Words {
  def main(args: Array[String]) {
// set up the execution environment
val env = StreamExecutionEnvironment.getExecutionEnvironment

val params = ParameterTool.fromArgs(args)
env.getConfig.setGlobalJobParameters(params)
val dataStream = StreamUtil.getDataStream(env, params)

val wordDataStream = dataStream
  .flatMap{ _.split(" ") }

wordDataStream.print()

// execute program
env.execute("Words Scala")
  }
}
{code}

This runs without an issue on the latest stable version of 1.3 and writes its 
results to the _out_ file, which I can tail to see the results. This doesn't 
happen in 1.4, however. I can modify it to write out to a file instead:


{code:java}
import org.apache.flink.api.java.utils.ParameterTool
import org.apache.flink.core.fs.FileSystem.WriteMode
import org.apache.flink.streaming.api.scala._

object Words {
  def main(args: Array[String]) {
// set up the execution environment
val env = StreamExecutionEnvironment.getExecutionEnvironment

val params = ParameterTool.fromArgs(args)
env.getConfig.setGlobalJobParameters(params)
val dataStream = StreamUtil.getDataStream(env, params)

val wordDataStream = dataStream
  .flatMap{ _.split(" ") }

wordDataStream
  .writeAsText("file:///somepath/output", WriteMode.OVERWRITE)
  .setParallelism(1)

// execute program
env.execute("Words Scala")
  }
}
{code}

Any clues as to what might be causing this?


[jira] [Created] (FLINK-8103) Flink 1.4 not writing to standard out log file

2017-11-18 Thread Ryan Brideau (JIRA)
Ryan Brideau created FLINK-8103:
-------------------------------------

 Summary: Flink 1.4 not writing to standard out log file
 Key: FLINK-8103
 URL: https://issues.apache.org/jira/browse/FLINK-8103
 Project: Flink
  Issue Type: Bug
  Components: Core
Affects Versions: 1.4.0
 Environment: macOS 10.13 (High Sierra)
Reporter: Ryan Brideau


I built the latest snapshot of 1.4 yesterday and tried testing it with a simple 
word count example, where StreamUtil is just a helper that checks input 
parameters:

{code:scala}
import org.apache.flink.api.java.utils.ParameterTool
import org.apache.flink.streaming.api.scala._

object Words {
  def main(args: Array[String]) {
// set up the execution environment
val env = StreamExecutionEnvironment.getExecutionEnvironment

val params = ParameterTool.fromArgs(args)
env.getConfig.setGlobalJobParameters(params)
val dataStream = StreamUtil.getDataStream(env, params)

val wordDataStream = dataStream
  .flatMap{ _.split(" ") }

wordDataStream.print()

// execute program
env.execute("Words Scala")
  }
}
{code}
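
StreamUtil itself isn't included in the report; a minimal sketch of what such a parameter-checking helper might look like (the names and fallback behaviour are assumptions, not the reporter's actual code):

{code:scala}
import org.apache.flink.api.java.utils.ParameterTool
import org.apache.flink.streaming.api.scala._

// Hypothetical reconstruction: read from a file when --input is given,
// from a socket when --host/--port are given, otherwise exit with a hint.
object StreamUtil {
  def getDataStream(env: StreamExecutionEnvironment, params: ParameterTool): DataStream[String] = {
    if (params.has("input")) {
      env.readTextFile(params.get("input"))
    } else if (params.has("host") && params.has("port")) {
      env.socketTextStream(params.get("host"), params.get("port").toInt)
    } else {
      System.err.println("Specify --input <path>, or --host <host> --port <port>")
      sys.exit(1)
    }
  }
}
{code}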

This runs without an issue on the latest stable version of 1.3 and writes its 
results to the _out_ file, which I can tail to see the results. This doesn't 
happen in 1.4, however. I can modify it to write out to a file instead:


{code:scala}
import org.apache.flink.api.java.utils.ParameterTool
import org.apache.flink.core.fs.FileSystem.WriteMode
import org.apache.flink.streaming.api.scala._

object Words {
  def main(args: Array[String]) {
// set up the execution environment
val env = StreamExecutionEnvironment.getExecutionEnvironment

val params = ParameterTool.fromArgs(args)
env.getConfig.setGlobalJobParameters(params)
val dataStream = StreamUtil.getDataStream(env, params)

val wordDataStream = dataStream
  .flatMap{ _.split(" ") }

wordDataStream
  .writeAsText("file:///somepath/output", WriteMode.OVERWRITE)
  .setParallelism(1)

// execute program
env.execute("Words Scala")
  }
}
{code}

Any clues as to what might be causing this?
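
For context, the _out_ file referred to above is the TaskManager's captured stdout under the Flink distribution's log/ directory; on a local setup it can be followed with something like this (the exact file name, which includes user, host, and instance, is an assumption):

{code:java}
tail -f log/*.out
{code}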





[jira] [Updated] (FLINK-8103) Flink 1.4 not writing to standard out log file

2017-11-18 Thread Ryan Brideau (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-8103?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ryan Brideau updated FLINK-8103:

Description: 
I built the latest snapshot of 1.4 yesterday and tried testing it with a simple 
word count example, where StreamUtil is just a helper that checks input 
parameters:

{code:java}
import org.apache.flink.api.java.utils.ParameterTool
import org.apache.flink.streaming.api.scala._

object Words {
  def main(args: Array[String]) {
// set up the execution environment
val env = StreamExecutionEnvironment.getExecutionEnvironment

val params = ParameterTool.fromArgs(args)
env.getConfig.setGlobalJobParameters(params)
val dataStream = StreamUtil.getDataStream(env, params)

val wordDataStream = dataStream
  .flatMap{ _.split(" ") }

wordDataStream.print()

// execute program
env.execute("Words Scala")
  }
}
{code}

This runs without an issue on the latest stable version of 1.3 and writes its 
results to the _out_ file, which I can tail to see the results. This doesn't 
happen in 1.4, however. I can modify it to write out to a file instead:


{code:java}
import org.apache.flink.api.java.utils.ParameterTool
import org.apache.flink.core.fs.FileSystem.WriteMode
import org.apache.flink.streaming.api.scala._

object Words {
  def main(args: Array[String]) {
// set up the execution environment
val env = StreamExecutionEnvironment.getExecutionEnvironment

val params = ParameterTool.fromArgs(args)
env.getConfig.setGlobalJobParameters(params)
val dataStream = StreamUtil.getDataStream(env, params)

val wordDataStream = dataStream
  .flatMap{ _.split(" ") }

wordDataStream
  .writeAsText("file:///somepath/output", WriteMode.OVERWRITE)
  .setParallelism(1)

// execute program
env.execute("Words Scala")
  }
}
{code}

Any clues as to what might be causing this?
