Hi bro,

I am new to Spark application development, and I need to develop two applications that run on a Spark cluster.

Now I have some arguments for these applications.

I know I can pass them as program arguments to spark-submit, but I would like to find a better way.

The arguments include things such as the JDBC URL, the Elasticsearch nodes, and the Kafka group id.

So I would like to know whether there is a best practice for handling this.

Can I read them from a single custom properties file?
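
For example, something like the sketch below is what I have in mind. The file name app.properties and the property keys are only placeholders, and I am assuming the file is bundled into the application jar under src/main/resources:

import java.util.Properties

import org.apache.spark.sql.SparkSession

object ConfigFromProperties {
  def main(args: Array[String]): Unit = {
    // Load the settings from a properties file bundled with the application jar.
    val props = new Properties()
    val in = getClass.getResourceAsStream("/app.properties")
    props.load(in)
    in.close()

    val jdbcUrl      = props.getProperty("jdbc.url")
    val esNodes      = props.getProperty("es.nodes")
    val kafkaGroupId = props.getProperty("kafka.group.id")

    val spark = SparkSession.builder().appName("config-example").getOrCreate()

    // ... use jdbcUrl / esNodes / kafkaGroupId when building the readers and writers ...

    spark.stop()
  }
}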

                

Another, similar question: I need to read some rules that are used to analyze the data.

Currently I store the rules in MySQL and read them from MySQL before using them.

Is there a better way to do this?
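
For reference, the sketch below is roughly what I do today, assuming the rule set is small enough to collect to the driver; the host, database, table, and credentials are placeholders:

import org.apache.spark.sql.SparkSession

object RulesFromMysql {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("rules-example").getOrCreate()

    // Read the rules table once at startup through Spark's JDBC data source.
    val rulesDf = spark.read
      .format("jdbc")
      .option("url", "jdbc:mysql://mysql-host:3306/mydb")
      .option("dbtable", "analysis_rules")
      .option("user", "app_user")
      .option("password", "secret")
      .load()

    // Collect the rules to the driver before the analysis starts.
    val rules = rulesDf.collect()

    // ... apply the rules inside the analysis job ...

    spark.stop()
  }
}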

                Thanks in advance.

 

Best Regards,

Evan Yao
