OK, it turned out to be pretty simple once I established where to read the conf
file in YARN mode and how to get the values for the different keys:
val dbHost = conf.getString("dbHost")
val dbPort = conf.getString("dbPort")
val dbConnection = conf.getString("dbConnection")
val namespace =
Many thanks Chris.
In my Spark Streaming job I would like to read the parameters in from a config
file. Taking your example, I have
val globalConfig = ConfigFactory.load()
val conf = globalConfig.getConfig(sparkAppName) // extract the top-level key from the top-level namespace
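For context, ConfigFactory.load() reads application.conf from the classpath, and getConfig(sparkAppName) expects the keys nested under the app name. With the key names from the snippets in this thread, the file might look something like this (the app name and all values here are placeholder assumptions, not from the thread):

```hocon
# application.conf (HOCON) -- hypothetical layout;
# key names taken from the thread, app name and values are placeholders
myStreamingApp {
  dbHost = "db.example.com"
  dbPort = "1527"
  dbConnection = "jdbc:derby://db.example.com:1527/mydb"
}
```

globalConfig.getConfig("myStreamingApp") then returns the inner object, so conf.getString("dbHost") resolves without the app-name prefix.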
I tried "yarn logs -applicationId application_1564655863362_19006" but got
"/tmp/logs/chenzl/logs/application_1564655863362_19006 does not have any
log files".
Does Spark only produce the logs once the application is killed when running in cluster mode?
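One common cause of that error: "yarn logs" only serves aggregated logs, and aggregation happens after the application finishes, and only when it is enabled on the cluster. A minimal yarn-site.xml sketch to enable it (these are the standard Hadoop property names; the retention value is just an example):

```xml
<!-- yarn-site.xml: enable log aggregation so "yarn logs -applicationId ..." works -->
<property>
  <name>yarn.log-aggregation-enable</name>
  <value>true</value>
</property>
<property>
  <!-- how long aggregated logs are kept, in seconds (example: 7 days) -->
  <name>yarn.log-aggregation.retain-seconds</name>
  <value>604800</value>
</property>
```

While the application is still running, the executor logs live on the individual NodeManager hosts (reachable through the ResourceManager/Spark UI), not yet under the aggregated /tmp/logs directory.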