One thing we ran into was that there was another log4j.properties earlier
in the classpath. For us, it was in our MapR/Hadoop conf.

If that is the case, something like the following could help you track it
down. The only thing to watch out for is that you might have to walk up the
classloader hierarchy.

ClassLoader cl = Thread.currentThread().getContextClassLoader();
URL loc = cl.getResource("log4j.properties");
System.out.println(loc);
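If you want to see every copy of log4j.properties on the classpath rather
than just the first one, something along these lines may help (the class
name here is just illustrative). Note that getResources() already
delegates up the classloader hierarchy, so one call on the context loader
usually covers the parents too:

```java
import java.io.IOException;
import java.net.URL;
import java.util.Enumeration;

// Illustrative sketch: list every log4j.properties visible on the
// classpath, in delegation order.
public class Log4jLocator {
    public static void main(String[] args) throws IOException {
        ClassLoader cl = Thread.currentThread().getContextClassLoader();
        Enumeration<URL> locs = cl.getResources("log4j.properties");
        while (locs.hasMoreElements()) {
            // Typically the first URL printed is the copy that
            // getResource() (and hence log4j) would pick up.
            System.out.println(locs.nextElement());
        }
    }
}
```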

-Suren

On Tue, Jul 1, 2014 at 9:20 AM, Philip Limbeck <philiplimb...@gmail.com>
wrote:

> We changed the loglevel to DEBUG by replacing every INFO with DEBUG in
> /root/ephemeral-hdfs/conf/log4j.properties and propagating it to the
> cluster. There is some DEBUG output visible in both master and worker but
> nothing really interesting regarding stages or scheduling. Since we
> expected a little more than that, there could be 2 possibilities:
>   a) There is still some other unknown way to set the loglevel to debug
>   b) There is simply not much debug output to expect. However, searching
> for "logDebug" (the log wrapper in Spark) on GitHub yields 84 results,
> so I doubt that is the case.
>
> We actually just want to have a little more insight into the system
> behavior especially when using Shark since we ran into some serious
> concurrency issues with blocking queries. So much for the background why
> this is important to us.
>
>
> On Thu, Jun 26, 2014 at 3:30 AM, Aaron Davidson <ilike...@gmail.com>
> wrote:
>
>> If you're using the spark-ec2 scripts, you may have to change
>> /root/ephemeral-hdfs/conf/log4j.properties or something like that, as that
>> is added to the classpath before Spark's own conf.
>>
>>
>> On Wed, Jun 25, 2014 at 6:10 PM, Tobias Pfeiffer <t...@preferred.jp>
>> wrote:
>>
>>> I have a log4j.xml in src/main/resources with
>>>
>>> <?xml version="1.0" encoding="UTF-8" ?>
>>> <!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
>>> <log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/">
>>>     [...]
>>>     <root>
>>>         <priority value ="warn" />
>>>         <appender-ref ref="Console" />
>>>     </root>
>>> </log4j:configuration>
>>>
>>> and that is included in the jar I package with `sbt assembly`. That
>>> works fine for me, at least on the driver.
>>>
>>> Tobias
>>>
>>> On Wed, Jun 25, 2014 at 2:25 PM, Philip Limbeck <philiplimb...@gmail.com>
>>> wrote:
>>> > Hi!
>>> >
>>> > According to
>>> >
>>> https://spark.apache.org/docs/0.9.0/configuration.html#configuring-logging
>>> ,
>>> > changing log-level is just a matter of creating a log4j.properties
>>> (which is
>>> > in the classpath of spark) and changing log level there for the root
>>> logger.
>>> > I did these steps on every node in the cluster (master and worker
>>> nodes).
>>> > However, after restart there is still no debug output as desired, but
>>> only
>>> > the default info log level.
>>>
>>
>>
>


-- 

SUREN HIRAMAN, VP TECHNOLOGY
Velos
Accelerating Machine Learning

440 NINTH AVENUE, 11TH FLOOR
NEW YORK, NY 10001
O: (917) 525-2466 ext. 105
F: 646.349.4063
E: suren.hiraman@velos.io
W: www.velos.io
