No, this is the expected behaviour. As I said, you need to pass the jar containing your program's user code to createRemoteEnvironment. Otherwise Flink cannot find your filter function. Hence, it works once you comment the filter out, because that class is then no longer needed.
Cheers,
Till
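The class-loading failure described above can be sketched with plain JDK APIs: the remote cluster resolves user classes only from the submitted jar, much like a class loader restricted to that jar's contents. This is only an illustration of the mechanism, not Flink code, and the class name below is a hypothetical stand-in for the filter function:

```java
import java.net.URL;
import java.net.URLClassLoader;

public class JarLookupSketch {
    public static void main(String[] args) throws Exception {
        // A loader that can see no user-code jar at all, standing in for a
        // submission where the uploaded jar lacks the program's filter class.
        try (URLClassLoader onlyUdfJar = new URLClassLoader(new URL[0], null)) {
            try {
                // Hypothetical class name; in the thread this would be the filter function.
                onlyUdfJar.loadClass("com.example.MyFilterFunction");
                System.out.println("class found");
            } catch (ClassNotFoundException e) {
                System.out.println("ClassNotFoundException - class not in submitted jar");
            }
        }
    }
}
```

With no user-code jar visible, the lookup fails with ClassNotFoundException, which is the same failure mode the cluster hits when the filter class is missing from the uploaded jar.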
On Thu, Oct 31, 2019 at 11:41 AM Simon Su wrote:
Hi Till
Thanks for your reply. Actually, I modified the code like this:
I commented out the filter part and re-ran the code, and then it worked well! The jar passed to createRemoteEnvironment is a UDF jar, which does not contain my code.
My Flink version is 1.9.0, so I'm confused about the actual behaviour.
In order to run the program on a remote cluster from the IDE, you first need to build the jar containing your user code. This jar needs to be passed to createRemoteEnvironment() so that the Flink client knows which jar to upload. Hence, please make sure that /tmp/myudf.jar contains your user code.
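One way to act on this advice is to inspect the jar before submitting. A minimal sketch using the JDK's JarFile API, where the path and class name are placeholders for your own setup:

```java
import java.util.jar.JarFile;

public class CheckUserJar {
    public static void main(String[] args) throws Exception {
        // Placeholder path; use the jar you hand to createRemoteEnvironment.
        String jarPath = args.length > 0 ? args[0] : "/tmp/myudf.jar";
        try (JarFile jar = new JarFile(jarPath)) {
            // "MyFilterFunction" is a hypothetical name for the user's filter class.
            boolean found = jar.stream()
                    .anyMatch(e -> e.getName().endsWith("MyFilterFunction.class"));
            System.out.println(found
                    ? "user code present in jar"
                    : "user code missing - the cluster cannot load the filter");
        }
    }
}
```

If the class file does not show up in the jar's entries, the cluster has no way to load it, regardless of what is on the IDE's local classpath.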
Hi all
I want to test submitting a job from my local IDE, and I have deployed a Flink cluster in my VM.
Here is my code, taken from the Flink 1.9 documentation with some of my own parameters added.
public static void main(String[] args) throws Exception {
    ExecutionEnvironment env = ExecutionEnvironment
        .createRemoteEnvironment("<jobmanager-host>", 8081, "/tmp/myudf.jar");
    // ...
}