If you want to customize the logging behavior, the simplest way is to copy
conf/log4j.properties.template to conf/log4j.properties and then modify the
log levels in there. The Spark shells should pick this up.
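
For example, a rough sketch (run from the Spark home directory; WARN here is
just an illustration, and the exact property name should match whatever the
template already uses):

  cp conf/log4j.properties.template conf/log4j.properties

and then in conf/log4j.properties lower the root level, e.g.:

  log4j.rootCategory=WARN, console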




On Sun, Aug 3, 2014 at 6:16 AM, Sean Owen <so...@cloudera.com> wrote:

> That's just a template. Nothing consults that file by default.
>
> It's looking inside the Spark .jar. If you edit
> core/src/main/resources/org/apache/spark/log4j-defaults.properties and
> rebuild Spark, it will pick up those changes.
>
> I think you could also use the JVM argument
> "-Dlog4j.configuration=conf/log4j-defaults.properties" to force it to
> look at your local, edited props file.
>
> Someone may have to correct me, but I think that in master right now,
> that means using --driver-java-options=".." to set an argument to the
> JVM that runs the shell?
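>
> For example, something like this (an untested sketch; note that log4j
> generally wants a file: URL when pointing at a file on disk rather than
> a classpath resource):
>
>   ./bin/spark-shell --driver-java-options \
>     "-Dlog4j.configuration=file:conf/log4j.properties"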
>
> On Sun, Aug 3, 2014 at 2:07 PM, Gil Vernik <g...@il.ibm.com> wrote:
> > Hi,
> >
> > I would like to run spark-shell without any INFO messages printed.
> > To achieve this I edited /conf/log4j.properties and added the line
> > log4j.rootLogger=OFF
> > which is supposed to disable all logging.
> >
> > However, when I run ./spark-shell I see the message
> > 14/08/03 16:02:15 INFO SecurityManager: Using Spark's default log4j
> > profile: org/apache/spark/log4j-defaults.properties
> >
> > And then spark-shell prints all INFO messages as is.
> >
> > What did I miss? Why does spark-shell use the default log4j properties
> > and not the one defined in the /conf directory?
> > Is there another solution to prevent spark-shell from printing INFO
> > messages?
> >
> > Thanks,
> > Gil.
> >
> >
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
