[jira] [Commented] (SPARK-10057) Faill to load class org.slf4j.impl.StaticLoggerBinder
[ https://issues.apache.org/jira/browse/SPARK-10057?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15964087#comment-15964087 ]

Jifeng Yin commented on SPARK-10057:
------------------------------------

It seems to be due to https://issues-test.apache.org/jira/browse/PARQUET-369 (fixed in December 2015). Right?

> Faill to load class org.slf4j.impl.StaticLoggerBinder
> -----------------------------------------------------
>
>                 Key: SPARK-10057
>                 URL: https://issues.apache.org/jira/browse/SPARK-10057
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.5.0, 1.6.0
>            Reporter: Davies Liu
>
> Some log messages are dropped because the class
> "org.slf4j.impl.StaticLoggerBinder" cannot be loaded:
> {code}
> SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
> SLF4J: Defaulting to no-operation (NOP) logger implementation
> SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
[ https://issues.apache.org/jira/browse/SPARK-10057?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15865601#comment-15865601 ]

Samir Ouldsaadi commented on SPARK-10057:
-----------------------------------------

The same issue occurs in Spark 2.1.0. To reproduce:

{code}
scala> val sqlContext = new org.apache.spark.sql.SQLContext(sc)
warning: there was one deprecation warning; re-run with -deprecation for details
sqlContext: org.apache.spark.sql.SQLContext = org.apache.spark.sql.SQLContext@790d629a

scala> import sqlContext._
import sqlContext._

scala> case class Person(name: String, age: Int)
defined class Person

scala> val people = sc.textFile("/opt/spark-2.1.0-bin-hadoop2.7/examples/src/main/resources/people.txt").map(_.split(",")).map(p => Person(p(0), p(1).trim.toInt))
people: org.apache.spark.rdd.RDD[Person] = MapPartitionsRDD[3] at map at <console>:31

scala> people.toDF().registerTempTable("people")
warning: there was one deprecation warning; re-run with -deprecation for details

scala> people.toDF().write.parquet("people.parquet")
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.

scala>
{code}
[ https://issues.apache.org/jira/browse/SPARK-10057?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15824363#comment-15824363 ]

Ravi Raghav commented on SPARK-10057:
-------------------------------------

The issue is that slf4j-simple is pulled in by the Spark (version 2.5.4) dependency with {{test}} scope, so it is not available on the runtime classpath. The resolution is to include slf4j-simple in the project dependencies explicitly:

{code:xml}
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-simple</artifactId>
  <version>1.7.7</version>
</dependency>
{code}
[ https://issues.apache.org/jira/browse/SPARK-10057?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15506972#comment-15506972 ]

Semet commented on SPARK-10057:
-------------------------------

Hello, I confirm we have this issue on Spark 1.6.1 when writing a Parquet file. There is some information in [this Stack Overflow question|http://stackoverflow.com/questions/33832804/spark-1-5-2-and-slf4j-staticloggerbinder].
[ https://issues.apache.org/jira/browse/SPARK-10057?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15030870#comment-15030870 ]

Alexandru Rosianu commented on SPARK-10057:
-------------------------------------------

+1. I have the same Spark and Hadoop versions. I added {{"org.slf4j" % "slf4j-log4j12" % "1.7.10",}} to build.sbt, but it didn't help.
[ https://issues.apache.org/jira/browse/SPARK-10057?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15022602#comment-15022602 ]

Stephen Carman commented on SPARK-10057:
----------------------------------------

Hi, I think I've found a way to reproduce this. I believe it has something to do with the fact that Spark overrides the JUL logger in the Parquet relation and installs its own SLF4JBridgeHandler. I only get this warning when I try to load or write something via Parquet. I've tested this with local Parquet files and Parquet files in S3, and also tested negative cases against JSON files and text files, both local and in S3. With text and JSON this warning never occurs, but with Parquet it occurs every time, no matter where I load the data from.

{code:java}
scala> sc.setLogLevel("WARN")

scala> val d = sc.parallelize(Array[Int](1,2,3,4,5))
d: org.apache.spark.rdd.RDD[Int] = ParallelCollectionRDD[0] at parallelize at <console>:21

scala> val ddf = d.toDF()
ddf: org.apache.spark.sql.DataFrame = [_1: int]

scala> ddf.write.parquet("/home/scarman/data/test.parquet")
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
{code}

This was done on Spark 1.5.2 with Hadoop 2.7.1, with whatever the default Parquet dependency is. Hopefully this helps; it's not code-breaking, but it's very annoying.
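The NOP fallback in the transcripts above means no SLF4J binding class is visible to the classloader at the point where Parquet initializes its logging. A minimal diagnostic sketch (not part of Spark; the class name {{BindingCheck}} is made up for illustration) asks the classloader which classpath entries, if any, provide {{org/slf4j/impl/StaticLoggerBinder.class}}:

```java
import java.io.IOException;
import java.net.URL;
import java.util.ArrayList;
import java.util.Enumeration;
import java.util.List;

public class BindingCheck {
    // Returns every classpath entry that contains the SLF4J binding class.
    // If the returned list is empty, SLF4J falls back to its NOP logger and
    // prints the "Failed to load class" warning seen in this issue.
    static List<URL> findBindings() throws IOException {
        List<URL> urls = new ArrayList<>();
        Enumeration<URL> e = BindingCheck.class.getClassLoader()
                .getResources("org/slf4j/impl/StaticLoggerBinder.class");
        while (e.hasMoreElements()) {
            urls.add(e.nextElement());
        }
        return urls;
    }

    public static void main(String[] args) throws IOException {
        List<URL> bindings = findBindings();
        if (bindings.isEmpty()) {
            System.out.println("No SLF4J binding on the classpath");
        } else {
            for (URL u : bindings) {
                System.out.println("Binding found at: " + u);
            }
        }
    }
}
```

Running this on the driver (or inside a task) can show whether the binding jar is missing entirely or is simply not visible to the particular classloader in use, which are different problems with different fixes.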
[ https://issues.apache.org/jira/browse/SPARK-10057?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14699842#comment-14699842 ]

Yin Huai commented on SPARK-10057:
----------------------------------

Related to http://www.slf4j.org/codes.html#StaticLoggerBinder?