Hi,

I'm trying to compile Spark (both 0.8 and master) against CDH 4.4.0 with YARN. Unfortunately it fails because of an API change introduced between CDH 4.3.0 and CDH 4.4.0: the API changed in Hadoop 2.1.0-beta, and AllocateResponse now directly exposes a getAllocatedContainers() method (see http://hadoop.apache.org/docs/r2.2.0/api/org/apache/hadoop/yarn/api/protocolrecords/AllocateResponse.html). The same applies to a few other methods used later in the code.

So one should just change the assignment of amResp (for instance, line 86 in https://github.com/apache/incubator-spark/blob/master/yarn/src/main/scala/org/apache/spark/deploy/yarn/YarnAllocationHandler.scala#L86):

    // Keep polling the Resource Manager for containers
    val amResp = allocateWorkerResources(workersToRequest).getAMResponse

to:

    val amResp = allocateWorkerResources(workersToRequest)

Tried with master, and it works (successfully launched a SparkPi job on 4 nodes).

Guillaume
--
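As an aside, a build that must compile against both Hadoop lines (pre- and post-2.1.0-beta) could avoid the hard-coded getAMResponse call by probing for it reflectively. A minimal sketch in Java (the YARN API's own language), using hypothetical stub classes in place of the real AllocateResponse generations — not the real Hadoop types:

```java
import java.lang.reflect.Method;

public class YarnApiShim {
    // Hypothetical stand-in for the pre-2.1.0-beta AllocateResponse, whose
    // payload sat behind getAMResponse (stub for illustration only).
    public static class OldStyleResponse {
        public Object getAMResponse() { return "payload"; }
    }

    // Stand-in for the newer API, where accessors such as
    // getAllocatedContainers() live on the response object itself.
    public static class NewStyleResponse {
        public java.util.List<String> getAllocatedContainers() {
            return java.util.Collections.emptyList();
        }
    }

    // Probe for the legacy getAMResponse method via reflection: unwrap on
    // the old API, pass the response through unchanged on the new one.
    public static Object unwrap(Object resp) {
        try {
            Method m = resp.getClass().getMethod("getAMResponse");
            return m.invoke(resp);
        } catch (NoSuchMethodException e) {
            return resp; // new API: the response is already "unwrapped"
        } catch (ReflectiveOperationException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(unwrap(new OldStyleResponse()));
        System.out.println(unwrap(new NewStyleResponse()) instanceof NewStyleResponse);
    }
}
```

This is only one option; the simpler route, as above, is to edit the call site and require CDH 4.4.0 (or any Hadoop 2.1.0-beta+) at build time.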
- Current master doesn't compile against CDH 4.4.0 Guillaume Pitel
- Re: Current master doesn't compile against CDH 4.4.0 Matei Zaharia

