Re: Fwd: Executor Lost Failure

2014-11-11 Thread Ritesh Kumar Singh
Yes... found the output in the slave's web UI.

Thanks :)

On Tue, Nov 11, 2014 at 2:48 AM, Ankur Dave ankurd...@gmail.com wrote:

 At 2014-11-10 22:53:49 +0530, Ritesh Kumar Singh 
 riteshoneinamill...@gmail.com wrote:
  Tasks are now getting submitted, but many tasks don't seem to do anything.
  Like, after opening the spark-shell, I load a text file from disk and try
  printing its contents as:
 
 sc.textFile("/path/to/file").foreach(println)
 
  It does not give me any output.

 That's because foreach launches tasks on the slaves. When each task tries
 to print its lines, they go to the stdout file on the slave rather than to
 your console at the driver. You should see the file's contents in each of
 the slaves' stdout files in the web UI.

 This only happens when running on a cluster. In local mode, all the tasks
 are running locally and can output to the driver, so foreach(println) is
 more useful.

 Ankur



Re: Fwd: Executor Lost Failure

2014-11-10 Thread Ankur Dave
At 2014-11-10 22:53:49 +0530, Ritesh Kumar Singh 
riteshoneinamill...@gmail.com wrote:
 Tasks are now getting submitted, but many tasks don't seem to do anything.
 Like, after opening the spark-shell, I load a text file from disk and try
 printing its contents as:

sc.textFile("/path/to/file").foreach(println)

 It does not give me any output.

That's because foreach launches tasks on the slaves. When each task tries to 
print its lines, they go to the stdout file on the slave rather than to your 
console at the driver. You should see the file's contents in each of the 
slaves' stdout files in the web UI.

This only happens when running on a cluster. In local mode, all the tasks are 
running locally and can output to the driver, so foreach(println) is more 
useful.

Ankur
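
For reference, a minimal sketch of how the file's contents could be printed at
the driver instead of on the slaves (assuming the same spark-shell session,
with "/path/to/file" still a placeholder): bring a bounded sample back with
take, or collect the whole RDD only if the file is known to be small.

    val lines = sc.textFile("/path/to/file")

    // take(10) returns an Array[String] on the driver, so println writes to your console
    lines.take(10).foreach(println)

    // collect() pulls the entire RDD back to the driver; only safe for small files
    lines.collect().foreach(println)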
