Re: Spark Error Log

2013-12-06 Thread Wenlei Xie
Hi Prashant,

Thank you! The reason I would like to do this is that my program's output
currently goes to stdout, where it gets mixed with Spark's log.
That's not a big issue anyway, since I can either disable logging or put a
prefix before my own output :)
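
(Purely as an illustration of the "prefix" idea, a tiny Scala sketch; the
object name AppOut and the APP> marker are arbitrary choices, not anything
Spark provides:

// tag the program's own output so it is easy to pick out of the mixed stream
object AppOut {
  def info(msg: String): Unit = println("APP> " + msg)
}

// usage: AppOut.info("finished loading input")
// grepping the captured stdout for lines starting with APP> then recovers
// just the program's output, leaving Spark's log lines behind.)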

Best,
Wenlei


On Sun, Dec 1, 2013 at 2:49 AM, Prashant Sharma wrote:

> Hi,
>
> I am not sure how to. The above should have worked. Apart from the
> well-known trick of redirecting stdout to stderr, it would be great to
> know why you need it!
>
>
> On Sat, Nov 30, 2013 at 2:53 PM, Wenlei Xie  wrote:
>
>> Hi Prashant,
>>
>> I copied log4j.properties.template to log4j.properties, but now all the
>> output goes to stdout rather than stderr.
>>
>> How can I make it output to stderr? I have tried changing
>> log4j.properties to
>>
>> log4j.rootCategory=INFO, stderr
>> log4j.appender.stderr=org.apache.log4j.ConsoleAppender
>> log4j.appender.stderr.layout=org.apache.log4j.PatternLayout
>> log4j.appender.stderr.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
>>
>> But it still prints to stdout...
>>
>>
>> On Thu, Nov 28, 2013 at 10:39 AM, Prashant Sharma 
>> wrote:
>>
>>> I think all that is needed is a log4j.properties on the classpath
>>> http://logging.apache.org/log4j/1.2/faq.html#noconfig
>>>
>>>
>>> On Thu, Nov 28, 2013 at 11:52 PM, Patrick Wendell wrote:
>>>
 Hey Wenlei,

 There is some issue in master that is suppressing the log output - I'm
 trying to debug it before we release 0.8.1. Can you explain exactly
 how you are running Spark? Are you running the shell or are you
 running a standalone application?

 - Patrick

 On Thu, Nov 28, 2013 at 12:54 AM, Wenlei Xie 
 wrote:
 > Hi,
 >
 > I remember Spark used to print a detailed log to stderr (e.g.,
 > constructing an RDD, evaluating it, how much memory each partition
 > consumes). But I cannot find it anymore; I only see the following
 > information:
 >
 > SLF4J: Class path contains multiple SLF4J bindings.
 > SLF4J: Found binding in
 >
 [jar:file:/u/ytian/wenlei/dynamicGraph/graphx/examples/target/scala-2.9.3/spark-examples-assembly-0.9.0-incubating-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
 > SLF4J: Found binding in
 >
 [jar:file:/u/ytian/wenlei/dynamicGraph/graphx/assembly/target/scala-2.9.3/spark-assembly-0.9.0-incubating-SNAPSHOT-hadoop1.0.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
 > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
 > explanation.
 > SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
 > log4j:WARN No appenders could be found for logger
 > (akka.event.slf4j.Slf4jEventHandler).
 > log4j:WARN Please initialize the log4j system properly.
 > log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for
 > more info.
 >
 >
 > What should I do about it?
 >
 > Thanks,
 >
 > Wenlei

>>>
>>>
>>>
>>> --
>>> s
>>>
>>
>>
>>
>> --
>> Wenlei Xie (谢文磊)
>>
>> Department of Computer Science
>> 5132 Upson Hall, Cornell University
>> Ithaca, NY 14853, USA
>> Phone: (607) 255-5577
>> Email: wenlei@gmail.com
>>
>
>
>
> --
> s
>



-- 
Wenlei Xie (谢文磊)

Department of Computer Science
5132 Upson Hall, Cornell University
Ithaca, NY 14853, USA
Phone: (607) 255-5577
Email: wenlei@gmail.com


Re: Spark Error Log

2013-12-01 Thread Prashant Sharma
Hi,

I am not sure how to. The above should have worked. Apart from the
well-known trick of redirecting stdout to stderr, it would be great to know
why you need it!


On Sat, Nov 30, 2013 at 2:53 PM, Wenlei Xie  wrote:

> Hi Prashant,
>
> I copied log4j.properties.template to log4j.properties, but now all the
> output goes to stdout rather than stderr.
>
> How can I make it output to stderr? I have tried changing
> log4j.properties to
>
> log4j.rootCategory=INFO, stderr
> log4j.appender.stderr=org.apache.log4j.ConsoleAppender
> log4j.appender.stderr.layout=org.apache.log4j.PatternLayout
> log4j.appender.stderr.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
>
> But it still prints to stdout...
>
>
> On Thu, Nov 28, 2013 at 10:39 AM, Prashant Sharma wrote:
>
>> I think all that is needed is a log4j.properties on the classpath
>> http://logging.apache.org/log4j/1.2/faq.html#noconfig
>>
>>
>> On Thu, Nov 28, 2013 at 11:52 PM, Patrick Wendell wrote:
>>
>>> Hey Wenlei,
>>>
>>> There is some issue in master that is suppressing the log output - I'm
>>> trying to debug it before we release 0.8.1. Can you explain exactly
>>> how you are running Spark? Are you running the shell or are you
>>> running a standalone application?
>>>
>>> - Patrick
>>>
>>> On Thu, Nov 28, 2013 at 12:54 AM, Wenlei Xie 
>>> wrote:
>>> > Hi,
>>> >
>>> > I remember Spark used to print a detailed log to stderr (e.g.,
>>> > constructing an RDD, evaluating it, how much memory each partition
>>> > consumes). But I cannot find it anymore; I only see the following
>>> > information:
>>> >
>>> > SLF4J: Class path contains multiple SLF4J bindings.
>>> > SLF4J: Found binding in
>>> >
>>> [jar:file:/u/ytian/wenlei/dynamicGraph/graphx/examples/target/scala-2.9.3/spark-examples-assembly-0.9.0-incubating-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> > SLF4J: Found binding in
>>> >
>>> [jar:file:/u/ytian/wenlei/dynamicGraph/graphx/assembly/target/scala-2.9.3/spark-assembly-0.9.0-incubating-SNAPSHOT-hadoop1.0.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>> > explanation.
>>> > SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>>> > log4j:WARN No appenders could be found for logger
>>> > (akka.event.slf4j.Slf4jEventHandler).
>>> > log4j:WARN Please initialize the log4j system properly.
>>> > log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for
>>> > more info.
>>> >
>>> >
>>> > What should I do about it?
>>> >
>>> > Thanks,
>>> >
>>> > Wenlei
>>>
>>
>>
>>
>> --
>> s
>>
>
>
>
> --
> Wenlei Xie (谢文磊)
>
> Department of Computer Science
> 5132 Upson Hall, Cornell University
> Ithaca, NY 14853, USA
> Phone: (607) 255-5577
> Email: wenlei@gmail.com
>



-- 
s


Re: Spark Error Log

2013-11-30 Thread Wenlei Xie
Hi Prashant,

I copied log4j.properties.template to log4j.properties, but now all the
output goes to stdout rather than stderr.

How can I make it output to stderr? I have tried changing log4j.properties
to

log4j.rootCategory=INFO, stderr
log4j.appender.stderr=org.apache.log4j.ConsoleAppender
log4j.appender.stderr.layout=org.apache.log4j.PatternLayout
log4j.appender.stderr.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

But it still prints to stdout...
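
(A likely explanation, for reference: log4j 1.2's ConsoleAppender writes to
System.out by default no matter what the appender is named, so the
configuration above still ends up on stdout. A minimal sketch that points
the appender at stderr through its Target property, keeping the appender
name stderr from above:

log4j.rootCategory=INFO, stderr
log4j.appender.stderr=org.apache.log4j.ConsoleAppender
# ConsoleAppender defaults to System.out; Target switches it to stderr
log4j.appender.stderr.Target=System.err
log4j.appender.stderr.layout=org.apache.log4j.PatternLayout
log4j.appender.stderr.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

With Target=System.err in place, Spark's log lines should land on stderr and
stay out of the program's stdout.)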


On Thu, Nov 28, 2013 at 10:39 AM, Prashant Sharma wrote:

> I think all that is needed is a log4j.properties on the classpath
> http://logging.apache.org/log4j/1.2/faq.html#noconfig
>
>
> On Thu, Nov 28, 2013 at 11:52 PM, Patrick Wendell wrote:
>
>> Hey Wenlei,
>>
>> There is some issue in master that is suppressing the log output - I'm
>> trying to debug it before we release 0.8.1. Can you explain exactly
>> how you are running Spark? Are you running the shell or are you
>> running a standalone application?
>>
>> - Patrick
>>
>> On Thu, Nov 28, 2013 at 12:54 AM, Wenlei Xie 
>> wrote:
>> > Hi,
>> >
>> > I remember Spark used to print a detailed log to stderr (e.g.,
>> > constructing an RDD, evaluating it, how much memory each partition
>> > consumes). But I cannot find it anymore; I only see the following
>> > information:
>> >
>> > SLF4J: Class path contains multiple SLF4J bindings.
>> > SLF4J: Found binding in
>> >
>> [jar:file:/u/ytian/wenlei/dynamicGraph/graphx/examples/target/scala-2.9.3/spark-examples-assembly-0.9.0-incubating-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> > SLF4J: Found binding in
>> >
>> [jar:file:/u/ytian/wenlei/dynamicGraph/graphx/assembly/target/scala-2.9.3/spark-assembly-0.9.0-incubating-SNAPSHOT-hadoop1.0.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>> > explanation.
>> > SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>> > log4j:WARN No appenders could be found for logger
>> > (akka.event.slf4j.Slf4jEventHandler).
>> > log4j:WARN Please initialize the log4j system properly.
>> > log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for
>> > more info.
>> >
>> >
>> > What should I do about it?
>> >
>> > Thanks,
>> >
>> > Wenlei
>>
>
>
>
> --
> s
>



-- 
Wenlei Xie (谢文磊)

Department of Computer Science
5132 Upson Hall, Cornell University
Ithaca, NY 14853, USA
Phone: (607) 255-5577
Email: wenlei@gmail.com


Re: Spark Error Log

2013-11-28 Thread Wenlei Xie
Hi Patrick,

I am running Spark using the ./run-example script. More specifically, I use
a standalone Spark server (e.g. the Spark master URL looks like
spark://xxx:7077).

I am using the Spark from the GraphX branch, so it might not be the same as
the master branch :).

I will take a look at the log4j.properties as Prashant suggests, thank
you!

Best,
Wenlei


On Thu, Nov 28, 2013 at 10:22 AM, Patrick Wendell wrote:

> Hey Wenlei,
>
> There is some issue in master that is suppressing the log output - I'm
> trying to debug it before we release 0.8.1. Can you explain exactly
> how you are running Spark? Are you running the shell or are you
> running a standalone application?
>
> - Patrick
>
> On Thu, Nov 28, 2013 at 12:54 AM, Wenlei Xie  wrote:
> > Hi,
> >
> > I remember Spark used to print a detailed log to stderr (e.g.,
> > constructing an RDD, evaluating it, how much memory each partition
> > consumes). But I cannot find it anymore; I only see the following
> > information:
> >
> > SLF4J: Class path contains multiple SLF4J bindings.
> > SLF4J: Found binding in
> >
> [jar:file:/u/ytian/wenlei/dynamicGraph/graphx/examples/target/scala-2.9.3/spark-examples-assembly-0.9.0-incubating-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > SLF4J: Found binding in
> >
> [jar:file:/u/ytian/wenlei/dynamicGraph/graphx/assembly/target/scala-2.9.3/spark-assembly-0.9.0-incubating-SNAPSHOT-hadoop1.0.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> > explanation.
> > SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> > log4j:WARN No appenders could be found for logger
> > (akka.event.slf4j.Slf4jEventHandler).
> > log4j:WARN Please initialize the log4j system properly.
> > log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for
> > more info.
> >
> >
> > What should I do about it?
> >
> > Thanks,
> >
> > Wenlei
>



-- 
Wenlei Xie (谢文磊)

Department of Computer Science
5132 Upson Hall, Cornell University
Ithaca, NY 14853, USA
Phone: (607) 255-5577
Email: wenlei@gmail.com


Re: Spark Error Log

2013-11-28 Thread Patrick Wendell
The issue is, I think, that we changed something so that it went from
having good default behavior when you don't include a log4j.properties
file to printing an error message. I think it depends on how the user is
running Spark though, so I wanted to get the specifics.

On Thu, Nov 28, 2013 at 10:39 AM, Prashant Sharma  wrote:
> I think all that is needed is a log4j.properties on the classpath
> http://logging.apache.org/log4j/1.2/faq.html#noconfig
>
>
> On Thu, Nov 28, 2013 at 11:52 PM, Patrick Wendell 
> wrote:
>>
>> Hey Wenlei,
>>
>> There is some issue in master that is suppressing the log output - I'm
>> trying to debug it before we release 0.8.1. Can you explain exactly
>> how you are running Spark? Are you running the shell or are you
>> running a standalone application?
>>
>> - Patrick
>>
>> On Thu, Nov 28, 2013 at 12:54 AM, Wenlei Xie  wrote:
>> > Hi,
>> >
>> > I remember Spark used to print a detailed log to stderr (e.g.,
>> > constructing an RDD, evaluating it, how much memory each partition
>> > consumes). But I cannot find it anymore; I only see the following
>> > information:
>> >
>> > SLF4J: Class path contains multiple SLF4J bindings.
>> > SLF4J: Found binding in
>> >
>> > [jar:file:/u/ytian/wenlei/dynamicGraph/graphx/examples/target/scala-2.9.3/spark-examples-assembly-0.9.0-incubating-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> > SLF4J: Found binding in
>> >
>> > [jar:file:/u/ytian/wenlei/dynamicGraph/graphx/assembly/target/scala-2.9.3/spark-assembly-0.9.0-incubating-SNAPSHOT-hadoop1.0.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>> > explanation.
>> > SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>> > log4j:WARN No appenders could be found for logger
>> > (akka.event.slf4j.Slf4jEventHandler).
>> > log4j:WARN Please initialize the log4j system properly.
>> > log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for
>> > more info.
>> >
>> >
>> > What should I do about it?
>> >
>> > Thanks,
>> >
>> > Wenlei
>
>
>
>
> --
> s


Re: Spark Error Log

2013-11-28 Thread Prashant Sharma
I think all that is needed is a log4j.properties on the classpath
http://logging.apache.org/log4j/1.2/faq.html#noconfig
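
(For reference, a minimal sketch of such a file; saved as
conf/log4j.properties it should end up on the classpath when Spark is
started through its scripts, and the appender name console is just an
illustrative choice:

# send INFO and above to the console so log4j stops warning about missing appenders
log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

Once log4j finds a configuration like this, the "No appenders could be
found" warnings quoted below should go away.)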


On Thu, Nov 28, 2013 at 11:52 PM, Patrick Wendell wrote:

> Hey Wenlei,
>
> There is some issue in master that is suppressing the log output - I'm
> trying to debug it before we release 0.8.1. Can you explain exactly
> how you are running Spark? Are you running the shell or are you
> running a standalone application?
>
> - Patrick
>
> On Thu, Nov 28, 2013 at 12:54 AM, Wenlei Xie  wrote:
> > Hi,
> >
> > I remember Spark used to print a detailed log to stderr (e.g.,
> > constructing an RDD, evaluating it, how much memory each partition
> > consumes). But I cannot find it anymore; I only see the following
> > information:
> >
> > SLF4J: Class path contains multiple SLF4J bindings.
> > SLF4J: Found binding in
> >
> [jar:file:/u/ytian/wenlei/dynamicGraph/graphx/examples/target/scala-2.9.3/spark-examples-assembly-0.9.0-incubating-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > SLF4J: Found binding in
> >
> [jar:file:/u/ytian/wenlei/dynamicGraph/graphx/assembly/target/scala-2.9.3/spark-assembly-0.9.0-incubating-SNAPSHOT-hadoop1.0.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> > explanation.
> > SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> > log4j:WARN No appenders could be found for logger
> > (akka.event.slf4j.Slf4jEventHandler).
> > log4j:WARN Please initialize the log4j system properly.
> > log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for
> > more info.
> >
> >
> > What should I do about it?
> >
> > Thanks,
> >
> > Wenlei
>



-- 
s


Re: Spark Error Log

2013-11-28 Thread Patrick Wendell
Hey Wenlei,

There is some issue in master that is suppressing the log output - I'm
trying to debug it before we release 0.8.1. Can you explain exactly
how you are running Spark? Are you running the shell or are you
running a standalone application?

- Patrick

On Thu, Nov 28, 2013 at 12:54 AM, Wenlei Xie  wrote:
> Hi,
>
> I remember Spark used to print a detailed log to stderr (e.g.,
> constructing an RDD, evaluating it, how much memory each partition
> consumes). But I cannot find it anymore; I only see the following
> information:
>
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in
> [jar:file:/u/ytian/wenlei/dynamicGraph/graphx/examples/target/scala-2.9.3/spark-examples-assembly-0.9.0-incubating-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in
> [jar:file:/u/ytian/wenlei/dynamicGraph/graphx/assembly/target/scala-2.9.3/spark-assembly-0.9.0-incubating-SNAPSHOT-hadoop1.0.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> log4j:WARN No appenders could be found for logger
> (akka.event.slf4j.Slf4jEventHandler).
> log4j:WARN Please initialize the log4j system properly.
> log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for
> more info.
>
>
> What should I do about it?
>
> Thanks,
>
> Wenlei


Spark Error Log

2013-11-28 Thread Wenlei Xie
Hi,

I remember Spark used to print a detailed log to stderr (e.g., constructing
an RDD, evaluating it, how much memory each partition consumes). But I
cannot find it anymore; I only see the following information:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in
[jar:file:/u/ytian/wenlei/dynamicGraph/graphx/examples/target/scala-2.9.3/spark-examples-assembly-0.9.0-incubating-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in
[jar:file:/u/ytian/wenlei/dynamicGraph/graphx/assembly/target/scala-2.9.3/spark-assembly-0.9.0-incubating-SNAPSHOT-hadoop1.0.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
log4j:WARN No appenders could be found for logger
(akka.event.slf4j.Slf4jEventHandler).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for
more info.


What should I do about it?

Thanks,

Wenlei