The following code snippet should just work:

```python
from pyflink.datastream import StreamExecutionEnvironment
env = StreamExecutionEnvironment.get_execution_environment()
```

It works both in local deployments and on Flink clusters.
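To make that concrete, a minimal end-to-end DataStream job might look like the sketch below. The job name, elements, and transformation are made-up examples, not anything from the Flink docs:

```python
from pyflink.datastream import StreamExecutionEnvironment

# Creates a local environment when run directly with `python`,
# or picks up the cluster environment when submitted via `flink run`.
env = StreamExecutionEnvironment.get_execution_environment()
env.set_parallelism(1)

# A tiny in-memory source, just for illustration.
ds = env.from_collection(["hello", "world"])
ds.map(lambda s: s.upper()).print()

# Triggers execution; without this call the job never actually runs.
env.execute("upper_case_example")
```

The same script works unchanged in both modes; only how you launch it differs.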

You could refer to [1] for instructions on submitting PyFlink jobs to a remote cluster.

[1]
https://nightlies.apache.org/flink/flink-docs-release-1.14/docs/deployment/cli/#submitting-pyflink-jobs
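In short, submission from the CLI looks roughly like this (a sketch; the script name and JobManager address are placeholders):

```shell
# Submit to the cluster configured in flink-conf.yaml.
./bin/flink run --python my_job.py

# Or point explicitly at a remote JobManager (host/port are placeholders).
./bin/flink run --jobmanager jobmanager-host:8081 --python my_job.py
```

When launched this way, `get_execution_environment()` returns an environment bound to that cluster rather than a local one.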

Regards,
Dian


On Mon, Feb 7, 2022 at 12:09 AM nagi data monkey <nagidatamon...@gmail.com>
wrote:

> Hi all,
>
> Anyone got a pyflink datastream job working? I think I'm having difficulty
> seeing a small Flink cluster I've set up in Docker. I can't see any way
> that pyflink can pick up a remote execution environment. This is the only
> 'compiling' code snippet I can find:
>
> from pyflink.datastream import StreamExecutionEnvironment
> env = StreamExecutionEnvironment.get_execution_environment()
>
> which at least allows me to run pyflink code, but not see any flink
> clusters etc. Any ideas how I'm meant to actually get a pyflink job running?
>
> TIA
>
