When running

val data = sc.wholeTextFiles("someDir/*")
data.count()

I get numerous warnings from YARN until I hit an Akka association exception.
Can someone explain what happens when Spark loads this RDD and can't fit it
all in memory?
Based on the exception, it looks like the server is disconnecting from YARN
and failing. Any idea why? The code is simple but still fails.
Eran
