It seems that I need to have the log4j.properties file in the current
directory. So if I launch spark-shell in spark/conf, I see that INFO is not
displayed.

On Thu, Nov 7, 2013 at 2:16 PM, Shay Seng <s...@1618labs.com> wrote:

> When is the log4j.properties file read, and how can I verify that it is
> being read? Do I need to have the log4j.properties file set before the
> cluster is launched?
>
> I have the following:
>
> root@ ~/spark] more ./conf/log4j.properties
> # Set everything to be logged to the console
> log4j.rootCategory=WARN, console
> log4j.appender.console=org.apache.log4j.ConsoleAppender
> log4j.appender.console.layout=org.apache.log4j.PatternLayout
> log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
>
> # Ignore messages below warning level from Jetty, because it's a bit verbose
> log4j.logger.org.eclipse.jetty=WARN
>
> But I'm still seeing INFO logging.
>
> root@ ~/spark] ./spark-shell
> Welcome to
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/ '_/
>    /___/ .__/\_,_/_/ /_/\_\   version 0.8.0
>       /_/
>
> Using Scala version 2.9.3 (OpenJDK 64-Bit Server VM, Java 1.7.0_45)
> Initializing interpreter...
> 13/11/07 22:13:38 INFO server.Server: jetty-7.x.y-SNAPSHOT
> 13/11/07 22:13:38 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:41012
> Creating SparkContext...
> 13/11/07 22:14:24 INFO slf4j.Slf4jEventHandler: Slf4jEventHandler started
> 13/11/07 22:14:25 INFO spark.SparkEnv: Registering BlockManagerMaster
> 13/11/07 22:14:25 INFO storage.MemoryStore: MemoryStore started with capacity 3.8 GB.
> 13/11/07 22:14:25 INFO storage.DiskStore: Created local directory at /mnt/spark/spark-local-20131107221425-152f
> 13/11/07 22:14:25 INFO storage.DiskStore: Created local directory at /mnt2/spark/spark-local-20131107221425-a692
> 13/11/07 22:14:25 INFO network.ConnectionManager: Bound socket to port 45595 with id = ConnectionManagerId(ip-10-138-103-193.ap-southeast-1.compute.internal,45595)
> 13/11/07 22:14:25 INFO storage.BlockManagerMaster: Trying to register BlockManager
> 13/11/07 22:14:25 INFO storage.BlockManagerMasterActor$BlockManagerInfo: Registering block manager ip-10-138-103-193.ap-southeast-1.compute.internal:45595 with 3.8 GB RAM
>
>
> On Wed, Nov 6, 2013 at 7:04 AM, Shay Seng <s...@1618labs.com> wrote:
>
>> By the right place you mean in the conf directory, right?
>>
>> I'll give it another try when I relaunch my cluster this morning.
>> Weird.
>>
>> When I first modified the file, it looked like it worked, but I can't
>> remember exactly... Then I had a REPL hang and I had to Ctrl-C out of
>> that... After that the logging started back up.
>>
>> On Nov 6, 2013 12:17 AM, "Reynold Xin" <r...@apache.org> wrote:
>>
>>> Are you sure you put the log4j file in the right place? I just tried
>>> this with your configuration file, and this is what I see:
>>>
>>> rxin @ rxin-air : /scratch/rxin/incubator-spark
>>> > ./spark-shell
>>> Welcome to
>>>       ____              __
>>>      / __/__  ___ _____/ /__
>>>     _\ \/ _ \/ _ `/ __/ '_/
>>>    /___/ .__/\_,_/_/ /_/\_\   version 0.9.0-SNAPSHOT
>>>       /_/
>>>
>>> Using Scala version 2.9.3 (Java HotSpot(TM) 64-Bit Server VM, Java 1.6.0_65)
>>> Initializing interpreter...
>>> Creating SparkContext...
>>> Spark context available as sc.
>>> Type in expressions to have them evaluated.
>>> Type :help for more information.
>>>
>>> scala> sc.parallelize(1 to 10, 2).count
>>> res0: Long = 10
>>>
>>>
>>> On Tue, Nov 5, 2013 at 2:36 PM, Shay Seng <s...@1618labs.com> wrote:
>>>
>>>> Hi,
>>>>
>>>> I added a log4j.properties file in <spark>/conf:
>>>>
>>>> more ./spark/conf/log4j.properties
>>>> # Set everything to be logged to the console
>>>> log4j.rootCategory=WARN, console
>>>> log4j.appender.console=org.apache.log4j.ConsoleAppender
>>>> log4j.appender.console.layout=org.apache.log4j.PatternLayout
>>>> log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
>>>>
>>>> # Ignore messages below warning level from Jetty, because it's a bit verbose
>>>> log4j.logger.org.eclipse.jetty=WARN
>>>>
>>>> But yet, when I launch the REPL, I still see INFO logs... what am I
>>>> missing here?
>>>>
>>>> Welcome to
>>>>       ____              __
>>>>      / __/__  ___ _____/ /__
>>>>     _\ \/ _ \/ _ `/ __/ '_/
>>>>    /___/ .__/\_,_/_/ /_/\_\   version 0.8.0
>>>>       /_/
>>>>
>>>> Using Scala version 2.9.3 (OpenJDK 64-Bit Server VM, Java 1.7.0_45)
>>>> Initializing interpreter...
>>>> 13/11/05 22:33:11 INFO server.Server: jetty-7.x.y-SNAPSHOT
>>>> 13/11/05 22:33:11 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:49376
>>>> Creating SparkContext...
>>>> 13/11/05 22:33:21 INFO slf4j.Slf4jEventHandler: Slf4jEventHandler started
>>>> 13/11/05 22:33:21 INFO spark.SparkEnv: Registering BlockManagerMaster
>>>> 13/11/05 22:33:21 INFO storage.MemoryStore: MemoryStore started with capacity 3.8 GB.
>>>> 13/11/05 22:33:21 INFO storage.DiskStore: Created local directory at /mnt/spark/spark-local-20131105223321-9086
>>>> 13/11/05 22:33:21 INFO storage.DiskStore: Created local directory at /mnt2/spark/spark-local-20131105223321-b94d
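[Editor's note] The symptom in this thread (the file "works" only when spark-shell is launched from spark/conf) is consistent with log4j 1.x finding log4j.properties on the classpath, which in that case includes the current directory. A minimal, self-contained sketch of setting up and sanity-checking the same properties file (the conf/ path here is an assumption matching the standard Spark layout):

```shell
# Sketch: recreate the log4j.properties from the thread under conf/ and
# confirm the root logger level. The conf/ path is an assumption; in a
# standard Spark checkout it is <spark>/conf, which spark-shell puts on
# the classpath.
mkdir -p conf
cat > conf/log4j.properties <<'EOF'
# Set everything to be logged to the console
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Ignore messages below warning level from Jetty, because it's a bit verbose
log4j.logger.org.eclipse.jetty=WARN
EOF

# Check that the root category really came out as WARN
grep '^log4j.rootCategory' conf/log4j.properties
```

To see which configuration file log4j 1.x actually loads at startup, you can pass `-Dlog4j.debug` to the JVM (in this era of Spark, e.g. via the `SPARK_JAVA_OPTS` environment variable); log4j then prints a `log4j: Using URL [...]` line naming the file it found, which quickly distinguishes a classpath problem from a syntax problem.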