Hi Tison,

I think I may have found what I want in example 22.
https://www.programcreek.com/java-api-examples/?api=org.apache.flink.configuration.Configuration

I need to create a Configuration object first, as shown there.

Also, I think the flink-conf.yaml file may contain configuration for the
client rather than the server, so editing it before starting the cluster
may be irrelevant.
I am going to play around and see, but if the Configuration class allows me
to set configuration programmatically and override the yaml file, then that
would be great. A rough sketch of what I mean is below.
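
Something like the following is what I have in mind (an untested sketch;
the host, port and jar path are placeholders for my setup, and I still
need to confirm that the programmatic values win over the yaml file):

import org.apache.flink.configuration.Configuration;
import org.apache.flink.configuration.JobManagerOptions;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class RemoteJob {
    public static void main(String[] args) throws Exception {
        // Build the client configuration in code instead of relying on
        // flink-conf.yaml; address and port are placeholders.
        Configuration conf = new Configuration();
        conf.setString(JobManagerOptions.ADDRESS, "192.168.1.100");
        conf.setInteger(JobManagerOptions.PORT, 6123);

        // Point the environment at the remote JobManager; the jar path
        // is a placeholder for the artifact containing the job classes.
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.createRemoteEnvironment(
                        "192.168.1.100", 6123, conf, "target/my-job.jar");

        env.fromElements(1, 2, 3).print();
        env.execute("remote connectivity test");
    }
}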



On Sun, 19 Apr 2020, 11:35 Som Lima, <somplastic...@gmail.com> wrote:

> Thanks.
> flink-conf.yaml does allow me to do what I need to do without making any
> changes to client source code.
>
> But the RemoteStreamEnvironment constructor also expects jar files as the
> third parameter:
>
> RemoteStreamEnvironment(String host, int port, String... jarFiles)
> <https://ci.apache.org/projects/flink/flink-docs-release-1.7/api/java/org/apache/flink/streaming/api/environment/RemoteStreamEnvironment.html#RemoteStreamEnvironment-java.lang.String-int-java.lang.String...->
> Creates a new RemoteStreamEnvironment that points to the master
> (JobManager) described by the given host name and port.
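>
> For reference, this is roughly how I understand it would be called (an
> untested sketch; host, port and jar path are placeholders for my setup):
>
> import org.apache.flink.streaming.api.environment.RemoteStreamEnvironment;
> import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
>
> // Connect to a JobManager on another machine, shipping the job jar.
> StreamExecutionEnvironment env = new RemoteStreamEnvironment(
>         "192.168.1.100", 6123, "target/my-job.jar");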
>
> On Sun, 19 Apr 2020, 11:02 tison, <wander4...@gmail.com> wrote:
>
>> You can change the flink-conf.yaml "jobmanager.rpc.address" or
>> "jobmanager.rpc.port" options before running the program, or take a look
>> at RemoteStreamEnvironment, which lets you configure the host and port.
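>>
>> For example, in conf/flink-conf.yaml (the address below is a placeholder
>> for the machine running the JobManager):
>>
>> jobmanager.rpc.address: 192.168.1.100
>> jobmanager.rpc.port: 6123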
>>
>> Best,
>> tison.
>>
>>
>> Som Lima <somplastic...@gmail.com> wrote on Sun, 19 Apr 2020 at 17:58:
>>
>>> Hi,
>>>
>>> After running
>>>
>>> $ ./bin/start-cluster.sh
>>>
>>> The following line of code defaults the jobmanager to localhost:6123:
>>>
>>> final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
>>>
>>> which is the same as in Spark:
>>>
>>> val spark =
>>> SparkSession.builder.master("local[*]").appName("anapp").getOrCreate
>>>
>>> However, if I wish to run the servers on a different physical computer,
>>> then in Spark I can do it this way, using the Spark master URI in my IDE:
>>>
>>> val conf = new
>>> SparkConf().setMaster("spark://<hostip>:<port>").setAppName("anapp")
>>>
>>> Can you please tell me the equivalent change to make so that I can run
>>> my servers and my IDE on different physical computers?
>>>
