Hi all,

I am submitting a few jobs remotely using Spark on YARN / Spark standalone. The jobs used to submit and run successfully, but suddenly they started throwing the exception below, and it has kept happening for days on the same cluster:

Stack trace:


Set(); users  with modify permissions: Set(hadoop); groups with modify 
permissions: Set()

16/12/21 12:38:30 WARN Utils: Service 'sparkDriver' could not bind on port 0. 
Attempting port 1.

(the WARN line above repeats 16 times, once per bind retry)

16/12/21 12:38:30 ERROR SparkContext: Error initializing SparkContext.

java.net.BindException: Cannot assign requested address: Service 'sparkDriver' 
failed after 16 retries! Consider explicitly setting the appropriate port for 
the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an 
available port or increasing spark.port.maxRetries.

        at sun.nio.ch.Net.bind0(Native Method)
        at sun.nio.ch.Net.bind(Net.java:437)
        at sun.nio.ch.Net.bind(Net.java:429)
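For reference, the workaround the error message itself points at would look roughly like this (just a sketch, not a confirmed fix; spark.driver.port and spark.port.maxRetries are standard Spark properties, while the port number, class, and jar name are placeholders):

```shell
# Sketch of what the exception message suggests: pin the driver service to a
# known-free port and/or raise the retry limit.
spark-submit \
  --conf spark.driver.port=40000 \
  --conf spark.port.maxRetries=32 \
  --class com.example.MyJob \
  myjob.jar
```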


I can't find anything relevant on Stack Overflow. I tried adding a hostname mapping in /etc/hosts, but it didn't help. Once the exception starts appearing, the job never runs again. I found many open questions about this but no concrete solution.
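In case it helps narrow things down, this is roughly how I checked whether the hostname mapping was the culprit (a minimal sketch; it only checks IPv4 resolution):

```python
import socket

# Diagnostic sketch (assumption: run on the host that submits the job).
# The Spark driver binds to whatever address the local hostname resolves to,
# so a stale /etc/hosts entry mapping the hostname to a non-local IP can
# reproduce exactly this kind of java.net.BindException.
hostname = socket.gethostname()
ip = socket.gethostbyname(hostname)
print(f"{hostname} resolves to {ip}")

# Try binding an ephemeral port on that address, as the driver would.
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    s.bind((ip, 0))
    print("bind OK on", s.getsockname())
finally:
    s.close()
```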

Any help would be much appreciated!

Manisha
