Running a Scala Job doesn't produce print output

2018-06-21 Thread Mano Swerts
Hi guys,

I have a question. I have been playing around with Flink this week and created 
some basic Java jobs that work fine. Now I am trying to run one in Scala.

Running this code in the Scala REPL prints the expected output:

env.fromElements(1, 2, 3).map(i => " Integer: " + i).print()

However, having it packaged in a JAR which I then deploy through the user 
interface doesn’t give me any output at all. I can start the job and it 
finishes without exceptions, but I don’t see the result of the print() 
statement in the log. The class looks like this:


package com.ixxus.playground.fmk.flink

import org.apache.flink.api.java.utils.ParameterTool
import org.apache.flink.api.scala._

object LearnDocumentEntityRelationship {

  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment
    val params: ParameterTool = ParameterTool.fromArgs(args)

    env.fromElements(1, 2, 3).map(i => " Integer: " + i).print()

    env.execute("Scala example")
  }
}


I did notice that the job name isn’t what I pass to env.execute. It is named 
“Flink Java Job”:

[screenshot: the job overview in the web UI showing the job as “Flink Java Job”]


However, I can’t find anything online about this phenomenon. Does anyone have 
any idea?

Thanks.

— Mano


Re: Running a Scala Job doesn't produce print output

2018-06-21 Thread Till Rohrmann
Please verify that you have set the correct main class in the manifest in your
pom.xml when you build the user code jar. Alternatively, you can specify the
class to execute via `bin/flink run -c CLASS_TO_EXECUTE` if your user code
jar contains multiple classes.
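For illustration, a submission with an explicit entry class could look like the sketch below. The jar name `flink-playground.jar` is hypothetical; the class name is taken from the original message:

```shell
# Submit the jar, naming the entry class explicitly so Flink does not
# rely on the Main-Class attribute in the jar's manifest.
bin/flink run \
    -c com.ixxus.playground.fmk.flink.LearnDocumentEntityRelationship \
    flink-playground.jar
```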

Cheers,
Till


Re: Running a Scala Job doesn't produce print output

2018-06-21 Thread Mano Swerts
Hi guys,

I am going to answer my own question ;) I looked at a Scala example in the 
Flink GitHub repo, which uses ExecutionEnvironment.getExecutionEnvironment to 
obtain the environment. That apparently doesn’t work in this case.

When I change it to StreamExecutionEnvironment.getExecutionEnvironment, as 
used in the Flink Maven archetype, it works fine.

I don’t know whether this is a bug or whether the example needs updating. At 
least the issue is now recorded for others who run into it in the future.
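For reference, a minimal sketch of the streaming variant described above (assuming the flink-streaming-scala dependency is on the classpath; the package and object name are kept from the original post):

```scala
package com.ixxus.playground.fmk.flink

import org.apache.flink.streaming.api.scala._

object LearnDocumentEntityRelationship {

  def main(args: Array[String]): Unit = {
    // Streaming environment, as used by the Flink Maven archetype.
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    // print() adds a sink that writes to the TaskManager's stdout.
    env.fromElements(1, 2, 3).map(i => " Integer: " + i).print()

    // With the streaming API, execute() triggers the job, and the
    // name passed here shows up in the web UI.
    env.execute("Scala example")
  }
}
```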

— Mano
