[ https://issues.apache.org/jira/browse/SPARK-16437?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xin Ren updated SPARK-16437:
----------------------------
    Component/s:     (was: SparkR)

> SparkR read.df() from parquet got error: SLF4J: Failed to load class 
> "org.slf4j.impl.StaticLoggerBinder"
> --------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-16437
>                 URL: https://issues.apache.org/jira/browse/SPARK-16437
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>            Reporter: Xin Ren
>            Priority: Minor
>
> Build SparkR with the command
> {code}
> build/mvn -DskipTests -Psparkr package
> {code}
> Start the SparkR console
> {code}
> ./bin/sparkR
> {code}
> then the following error appears:
> {code}
>  Welcome to
>     ____              __
>    / __/__  ___ _____/ /__
>   _\ \/ _ \/ _ `/ __/  '_/
>  /___/ .__/\_,_/_/ /_/\_\   version  2.0.0-SNAPSHOT
>     /_/
>  SparkSession available as 'spark'.
> >
> >
> > library(SparkR)
> >
> > df <- read.df("examples/src/main/resources/users.parquet")
> SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
> SLF4J: Defaulting to no-operation (NOP) logger implementation
> SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further 
> details.
> >
> >
> > head(df)
> 16/07/07 23:20:54 WARN ParquetRecordReader: Can not initialize counter due to 
> context is not a instance of TaskInputOutputContext, but is 
> org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
>     name favorite_color favorite_numbers
> 1 Alyssa           <NA>     3, 9, 15, 20
> 2    Ben            red             NULL
> {code}
> References
> * It seems an SLF4J binding library needs to be added to the classpath (possibly pinned to an older version):
> http://stackoverflow.com/questions/7421612/slf4j-failed-to-load-class-org-slf4j-impl-staticloggerbinder
> * Explanation of the error code on the SLF4J official site: http://www.slf4j.org/codes.html#StaticLoggerBinder
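> For reference, the usual remedy for this warning is to ensure exactly one SLF4J binding jar is on the runtime classpath alongside slf4j-api; with no binding present, SLF4J falls back to the no-op logger, which is the behavior reported above. A hedged sketch of what that dependency might look like in a Maven POM (the slf4j-log4j12 artifact and the version shown are illustrative assumptions, not confirmed as the fix for this build; the version should match the slf4j-api already on the classpath):

```xml
<!-- Illustrative sketch only: an SLF4J binding so
     org.slf4j.impl.StaticLoggerBinder can be resolved.
     The version here is an assumption and should match
     the slf4j-api version already in use. -->
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-log4j12</artifactId>
  <version>1.7.16</version>
</dependency>
```

> Note that having more than one binding on the classpath triggers a different SLF4J warning (multiple bindings), so only a single binding jar should be present.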



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
