Hi guys,

I am going to answer my own question ;) I looked at a Scala example in the 
Flink GitHub repo, which uses ExecutionEnvironment.getExecutionEnvironment to 
obtain the environment. That apparently doesn’t work.

When I change this to StreamExecutionEnvironment.getExecutionEnvironment, as 
used in the Flink Maven archetype, it works fine.
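For anyone who runs into this later, the working version looks roughly like 
this. This is just an untested sketch of the change described above, using the 
streaming Scala API with the same class and job names as in my example below:

import org.apache.flink.streaming.api.scala._

object LearnDocumentEntityRelationship {

    def main(args: Array[String]): Unit = {
        // Streaming environment instead of the batch ExecutionEnvironment
        val env = StreamExecutionEnvironment.getExecutionEnvironment

        env.fromElements(1, 2, 3).map(i => "==== Integer: " + i).print()

        // The name passed here is used as the job name
        env.execute("Scala example")
    }
}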

I don’t know whether this is a bug or the example needs updating. At least now 
this has been recorded for others struggling with the same issue in the future.

— Mano

On 21 Jun 2018, at 11:27, Mano Swerts <mano.swe...@ixxus.com> wrote:

Hi guys,

I have a question. I have been playing around with Flink this week and created 
some basic Java jobs that work fine. Now I am trying to run one in Scala.

Running this code in the Scala REPL prints the expected output:

env.fromElements(1, 2, 3).map(i => "==== Integer: " + i).print()

However, when I package it in a JAR and deploy it through the user interface, 
I don’t get any output at all. I can start the job and it finishes without 
exceptions, but I don’t see the result of the print() statement in the log. 
The class looks like this:


package com.ixxus.playground.fmk.flink

import org.apache.flink.api.java.utils.ParameterTool
import org.apache.flink.api.scala._

object LearnDocumentEntityRelationship {

    def main(args: Array[String]) {
        val env = ExecutionEnvironment.getExecutionEnvironment
        val params: ParameterTool = ParameterTool.fromArgs(args)

        env.fromElements(1, 2, 3).map(i => "==== Integer: " + i).print()

        env.execute("Scala example")
    }
}


I did notice that the job name isn’t what I pass to env.execute. It is named 
“Flink Java Job”:

<Scala job.png>


However, I can’t find anything online about this phenomenon. Does anyone have 
any idea?

Thanks.

— Mano
