Re: Spark Error Log

2013-12-06 Thread Wenlei Xie
Hi Prashant, Thank you! The reason I would like to do this is that my program currently writes its output to stdout, and it gets mixed with Spark's log. That's not a big issue anyway, since I can either disable the log or put some prefix before my info :) Best, Wenlei On Sun, Dec 1, 2013 at 2:49 AM

Re: Spark Error Log

2013-12-01 Thread Prashant Sharma
Hi, I am not sure I know how to. The above should have worked. Apart from the trick everyone knows, that you can redirect stdout to stderr, knowing why you need it would be great! On Sat, Nov 30, 2013 at 2:53 PM, Wenlei Xie wrote: > Hi Prashant, > > I copied the log4j.properties.template to be
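
For reference, the redirection trick looks roughly like this at the shell level; the script invocation, example class, and master URL below are only illustrative:

    # Keep program output (stdout) apart from Spark's log (stderr, once log4j
    # writes there) by sending stderr to a file:
    ./run-example org.apache.spark.examples.SparkPi spark://master:7077 2> spark.log

    # Or fold stdout into stderr so everything ends up in a single stream:
    ./run-example org.apache.spark.examples.SparkPi spark://master:7077 1>&2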

Re: Spark Error Log

2013-11-30 Thread Wenlei Xie
Hi Prashant, I copied the log4j.properties.template to log4j.properties, but now all the information goes to stdout rather than stderr. How can I make it output to stderr? I have tried changing log4j.properties to log4j.rootCategory=INFO, stderr log4j.appender.stderr=org.apache.lo
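
For reference, a complete appender definition along the lines started above might look like the following sketch (log4j 1.2 properties syntax; the appender name "stderr" and the conversion pattern are arbitrary choices, and the line that actually moves output off stdout is target=System.err):

    # Route everything the root logger receives to a console appender on stderr.
    log4j.rootCategory=INFO, stderr
    log4j.appender.stderr=org.apache.log4j.ConsoleAppender
    log4j.appender.stderr.target=System.err
    log4j.appender.stderr.layout=org.apache.log4j.PatternLayout
    log4j.appender.stderr.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n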

Re: Spark Error Log

2013-11-28 Thread Wenlei Xie
Hi Patrick, I am running Spark using the ./run-example script. More specifically, I use the standalone Spark server (e.g. the master URL looks like spark://xxx:7077). I am using Spark from the GraphX branch, so it might not be the same as the master branch :). I will take a look into the log

Re: Spark Error Log

2013-11-28 Thread Patrick Wendell
The issue is, I think, that we changed something so that it went from having good default behavior when you don't include a log4j.properties file to printing an error message. I think it depends on how the user is running Spark, though, so I wanted to get the specifics. On Thu, Nov 28, 2013 at 10:39 AM, Prashant Sharm

Re: Spark Error Log

2013-11-28 Thread Prashant Sharma
I think all that is needed is a log4j.properties on the classpath: http://logging.apache.org/log4j/1.2/faq.html#noconfig On Thu, Nov 28, 2013 at 11:52 PM, Patrick Wendell wrote: > Hey Wenlei, > > There is some issue in master that is suppressing the log output - I'm > trying to debug it before we
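
For reference, with a source checkout the usual way to get such a file onto the classpath is to place it in Spark's conf/ directory, which the launch scripts add to the classpath; a sketch, assuming the stock directory layout:

    # Copy the shipped template and edit it; conf/ ends up on the driver classpath.
    cp conf/log4j.properties.template conf/log4j.properties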

Re: Spark Error Log

2013-11-28 Thread Patrick Wendell
Hey Wenlei, There is some issue in master that is suppressing the log output; I'm trying to debug it before we release 0.8.1. Can you explain exactly how you are running Spark? Are you running the shell, or are you running a standalone application? - Patrick On Thu, Nov 28, 2013 at 12:54 AM, Wenl

Spark Error Log

2013-11-28 Thread Wenlei Xie
Hi, I remember Spark used to print a detailed log to stderr (e.g. constructing an RDD, evaluating it, how much memory each partition consumes). But I cannot find it anymore; I only see the following information: SLF4J: Class path contains multiple SLF4J bindings. SLF4J: Found binding in [j