issue while running the code in standalone mode: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory

2014-11-24 Thread vdiwakar.malladi
Hi, when I try to execute the program from my laptop by connecting to the HDP environment (on which Spark is also configured), I get the warning (Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory) and the job is being

Re: issue while running the code in standalone mode: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory

2014-11-24 Thread Akhil Das
This can happen mainly because of the following:
- Wrong master URL (make sure you use the master URL listed at the top left corner of the web UI, which runs on port 8080).
- Allocating more memory/cores while creating the SparkContext.
Thanks. Best Regards

Re: issue while running the code in standalone mode: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory

2014-11-24 Thread Sean Owen
Wouldn't it likely be the opposite? Too much memory / too many cores being requested relative to the resources that YARN makes available?
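
To make the two suggestions above concrete, here is a minimal sketch of creating a SparkContext against a standalone master. Every host name and resource value below is a placeholder for illustration, not something taken from this thread; the point is only that the master URL must match the one shown on the master web UI and the requested memory/cores must fit within what the workers actually have free.

    import org.apache.spark.{SparkConf, SparkContext}

    // All values below are placeholders.
    val conf = new SparkConf()
      .setAppName("ResourceRequestExample")
      // Master URL exactly as shown at the top of the master web UI (port 8080).
      .setMaster("spark://master-host:7077")
      // Keep the request below what the workers (or YARN) actually have free,
      // otherwise no executor can ever be scheduled.
      .set("spark.executor.memory", "512m")
      .set("spark.cores.max", "2")

    val sc = new SparkContext(conf)
    // If resources were granted, this completes instead of hanging with the
    // "Initial job has not accepted any resources" warning.
    println(sc.parallelize(1 to 100).count())
    sc.stop()

If the warning persists, the cluster UI should show whether any worker registered at all and how much memory/cores each one is offering compared to what the application asked for.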

Re: issue while running the code in standalone mode: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory

2014-11-24 Thread vdiwakar.malladi
Thanks for your response. I gave the correct master URL. Moreover, as I mentioned in my post, I was able to run the sample program using spark-submit. But it is not working when I run it from my machine. Any clue on this? Thanks in advance.
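
One difference between spark-submit on the cluster and launching the driver from a laptop, offered here only as a guess since the thread does not confirm the cause, is that the application jar and the driver address are no longer handled for you: the executors must be able to load the jar and to connect back to the driver machine. A sketch of setting both explicitly (all paths and hosts are placeholders):

    import org.apache.spark.{SparkConf, SparkContext}

    // Hypothetical sketch: when the driver runs on a laptop outside the cluster,
    // ship the application jar to the executors and make the driver reachable
    // from the cluster. All hosts/paths below are placeholders.
    val conf = new SparkConf()
      .setAppName("RemoteDriverExample")
      .setMaster("spark://master-host:7077")                     // placeholder master URL
      .setJars(Seq("/path/to/your-app-assembly.jar"))            // jar the executors must load
      .set("spark.driver.host", "laptop-address-visible-to-cluster") // placeholder address

    val sc = new SparkContext(conf)
    println(sc.parallelize(1 to 10).sum())
    sc.stop()

Whether this applies here depends on the network setup between the laptop and the HDP cluster; if the executors cannot reach the driver, the job can sit with the same "has not accepted any resources" warning even though workers are registered.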