Babulal created SPARK-10754:
-------------------------------

             Summary: Table and column names are case-sensitive when a JSON DataFrame is registered as a temp table using JavaSparkContext
                 Key: SPARK-10754
                 URL: https://issues.apache.org/jira/browse/SPARK-10754
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 1.4.1, 1.3.1, 1.3.0
         Environment: Linux, Hadoop Version 1.3
            Reporter: Babulal


Create a DataFrame using the JSON data source:

        // Standalone master URL and app name as used in the original report
        SparkConf conf = new SparkConf().setMaster("spark://xyz:7077").setAppName("Spark Table");
        JavaSparkContext javacontext = new JavaSparkContext(conf);
        SQLContext sqlContext = new SQLContext(javacontext);

        // Load the JSON file and register the result as a temp table (lower-case name)
        DataFrame df = sqlContext.jsonFile("/user/root/examples/src/main/resources/people.json");
        df.registerTempTable("sparktable");
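
        For reference, the people.json shipped under examples/src/main/resources in the Spark distribution contains newline-delimited records along these lines (columns "name" and "age"):

        {"name":"Michael"}
        {"name":"Andy", "age":30}
        {"name":"Justin", "age":19}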

        
        Run the queries:

        sqlContext.sql("select * from sparktable").show();    // this will PASS

        sqlContext.sql("select * from sparkTable").show();    // this will FAIL

        java.lang.RuntimeException: Table Not Found: sparkTable
        at scala.sys.package$.error(package.scala:27)
        at org.apache.spark.sql.catalyst.analysis.SimpleCatalog$$anonfun$1.apply(Catalog.scala:115)
        at org.apache.spark.sql.catalyst.analysis.SimpleCatalog$$anonfun$1.apply(Catalog.scala:115)
        at scala.collection.MapLike$class.getOrElse(MapLike.scala:128)
        at scala.collection.AbstractMap.getOrElse(Map.scala:58)
        at org.apache.spark.sql.catalyst.analysis.SimpleCatalog.lookupRelation(Catalog.scala:115)
        at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.getTable(Analyzer.scala:233)

                
                
        Note: the job is triggered via spark-submit.

The same code works correctly when run through the Scala SparkContext/SQLContext.
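
For comparison, a rough sketch of the Scala version that behaves as expected (the object name SparkTableTest is arbitrary; the master URL, file path and table name mirror the Java code above):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object SparkTableTest {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("spark://xyz:7077").setAppName("Spark Table")
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)

    // Load the same JSON file and register it under a lower-case temp table name
    val df = sqlContext.jsonFile("/user/root/examples/src/main/resources/people.json")
    df.registerTempTable("sparktable")

    // Both queries succeed when run this way, regardless of the case used for the table name
    sqlContext.sql("select * from sparktable").show()
    sqlContext.sql("select * from sparkTable").show()
  }
}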


