GitHub user dongjoon-hyun commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13751#discussion_r67604590
  
    --- Diff: docs/sparkr.md ---
    @@ -158,20 +152,19 @@ write.df(people, path="people.parquet", source="parquet", mode="overwrite")
     
     ### From Hive tables
     
    -You can also create SparkR DataFrames from Hive tables. To do this we will need to create a HiveContext which can access tables in the Hive MetaStore. Note that Spark should have been built with [Hive support](building-spark.html#building-with-hive-and-jdbc-support) and more details on the difference between SQLContext and HiveContext can be found in the [SQL programming guide](sql-programming-guide.html#starting-point-sqlcontext).
    +You can also create SparkDataFrames from Hive tables. To do this we will need to create a SparkSession with Hive support which can access tables in the Hive MetaStore. Note that Spark should have been built with [Hive support](building-spark.html#building-with-hive-and-jdbc-support) and more details can be found in the [SQL programming guide](sql-programming-guide.html#starting-point-sqlcontext). In SparkR, by default it will attempt to create a SparkSession with Hive support enabled (`enableHiveSupport = TRUE`).
    --- End diff --
    
    This anchor will eventually be broken by the following PR for the SQL programming guide:
    
    https://github.com/apache/spark/pull/13592/files#diff-d8aa7a37d17a1227cba38c99f9f22511R55
    
    - `#starting-point-sqlcontext` -> `#starting-point-sparksession`
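
    For reference, a minimal SparkR sketch of the workflow the new text describes, assuming the Spark 2.0 `sparkR.session()` and `sql()` APIs and a hypothetical Hive table named `src`:

        # Assumes SparkR is installed and Spark was built with Hive support.
        library(SparkR)

        # sparkR.session() attempts to enable Hive support by default;
        # the flag is shown explicitly here for clarity.
        sparkR.session(enableHiveSupport = TRUE)

        # Query a hypothetical Hive table "src" registered in the Hive MetaStore;
        # sql() returns a SparkDataFrame backed by the session.
        results <- sql("SELECT key, value FROM src")

        # Collect the first rows locally to inspect them.
        head(results)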

