Hi,

You can keep a custom properties file with map-like entries, i.e. key/value
pairs such as "URL" -> "IPaddress:port/user/....", and put the file on HDFS or
any location Spark can access. Read the file as an RDD, build a map from it,
and look up the values in your program.
You can also broadcast the map if you need to access it on all worker
nodes.
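A minimal sketch of the idea above, assuming PySpark and a hypothetical HDFS path ("hdfs:///config/app.properties") and hypothetical property names (jdbc.url, es.nodes, kafka.group.id) — the parsing helper itself is plain Python:

```python
def parse_properties(lines):
    """Parse simple 'key=value' lines into a dict, skipping blanks and comments."""
    props = {}
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props

# In a Spark job you would read the file from HDFS and broadcast the
# resulting map so every executor can use it (path/keys are examples):
#
#   lines = sc.textFile("hdfs:///config/app.properties").collect()
#   config = sc.broadcast(parse_properties(lines))
#   jdbc_url = config.value["jdbc.url"]

# Standalone demonstration without a cluster:
sample = [
    "# application settings",
    "jdbc.url=jdbc:mysql://dbhost:3306/app",
    "es.nodes=es1:9200,es2:9200",
    "kafka.group.id=my-consumer-group",
]
print(parse_properties(sample)["kafka.group.id"])  # my-consumer-group
```

Broadcasting is only needed if the values are used inside transformations running on the executors; arguments used solely on the driver can just live in the dict.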

Regards,
Haroon Syed.

On 5 April 2016 at 07:15, yaoxiaohua <yaoxiao...@outlook.com> wrote:

> Hi bro,
>
> I am new to Spark application development, and I need to develop two
> apps running on a Spark cluster.
>
> I have some arguments for the applications, such as the JDBC URL,
> Elasticsearch nodes, and Kafka group id.
>
> I can pass them as program arguments with spark-submit, but I want to
> find a better way.
>
> So I want to know whether there is a best practice for this.
>
> Can I read them from a custom properties file?
>
>
>
> A similar question: I read some rules that I use to analyse the data.
>
> Right now I store the rules in MySQL and read them from there before use.
>
> Is there a better way to do this?
>
> Thanks in advance.
>
>
>
> Best Regards,
>
> Evan Yao
>