Hi Ron,
I just checked and this bug is fixed in recent releases of Spark.
-Sandy
On Sun, Jul 13, 2014 at 8:15 PM, Chester Chen ches...@alpinenow.com wrote:
Ron,
Which distribution and version of Hadoop are you using?
I just looked at CDH5 (hadoop-mapreduce-client-core-2.3.0-cdh5.0.0),
Hi,
I was doing programmatic submission of Spark YARN jobs and I saw this code in
ClientBase.getDefaultYarnApplicationClasspath():
val field = classOf[MRJobConfig].getField("DEFAULT_YARN_APPLICATION_CLASSPATH")
MRJobConfig doesn't have this field, so the created launch env is incomplete.
Workaround:
On Sun, Jul 13, 2014 at 9:49 PM, Ron Gonzalez zlgonza...@yahoo.com wrote:
I can easily fix this by changing this to YarnConfiguration instead of
MRJobConfig but was wondering what the steps are for submitting a fix.
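The failure Ron describes, and the fallback behavior a fix would want, can be sketched with plain Java reflection. This is a hypothetical illustration, not the actual Spark code: the helper name staticField is made up, and JDK classes stand in for MRJobConfig/YarnConfiguration so the example runs without Hadoop on the classpath.

```java
import java.util.Optional;

public class ClasspathLookup {
    // Look up a public static field by name, returning empty instead of
    // throwing when the field does not exist in this version of the class
    // (the situation Ron hit with MRJobConfig).
    static Optional<Object> staticField(Class<?> clazz, String name) {
        try {
            return Optional.ofNullable(clazz.getField(name).get(null));
        } catch (NoSuchFieldException | IllegalAccessException e) {
            return Optional.empty();  // field absent: caller can fall back
        }
    }

    public static void main(String[] args) {
        // A field that exists resolves; a missing one yields empty rather
        // than killing the launch-env setup with a reflection exception.
        System.out.println(staticField(Integer.class, "MAX_VALUE").orElse(null));
        System.out.println(staticField(Integer.class, "NO_SUCH_FIELD").isPresent());
    }
}
```

In the real fix, the fallback lookup would target YarnConfiguration, which does define a public static DEFAULT_YARN_APPLICATION_CLASSPATH in Hadoop 2.x, which is exactly the change Ron proposes above.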
Ron,
Which distribution and version of Hadoop are you using?
I just looked at CDH5 (hadoop-mapreduce-client-core-2.3.0-cdh5.0.0);
MRJobConfig does have the field:
java.lang.String DEFAULT_MAPREDUCE_APPLICATION_CLASSPATH;
Chester
On Sun, Jul 13, 2014 at 6:49 PM, Ron Gonzalez