[ https://issues.apache.org/jira/browse/SPARK-15159?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15276100#comment-15276100 ]

Vijay Parmar commented on SPARK-15159:
--------------------------------------

1. I looked at the source code at 
https://github.com/apache/spark/blob/master/R/pkg/R/SQLContext.R 
and found that "sparkRHivesc" appears only on lines 193 and 194; it is not 
mentioned anywhere else in the code.

I am a bit confused about whether this is the only change that needs to be made, 
or whether something else is needed as well.

2. I didn't feel that any changes to the SparkR unit tests are needed.

Please let me know your opinion or suggestions so that I can proceed further on 
this.
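
For reference, here is a minimal sketch of the kind of change I understand the 
issue to be asking for. This assumes the 1.x sparkRHive.init() entry point (which 
is where the "sparkRHivesc" object in SQLContext.R comes from) and the 2.0-era 
sparkR.session() API; the exact target API is my assumption, not something 
confirmed in the issue:

    library(SparkR)

    # Old style (1.x): a separate HiveContext created via sparkRHive.init(),
    # backed by the "sparkRHivesc" object referenced in SQLContext.R.
    # sc <- sparkR.init()
    # hiveCtx <- sparkRHive.init(sc)

    # Proposed style (2.0): a single SparkSession with Hive support enabled,
    # so the HiveContext-specific code path would no longer be needed.
    sparkR.session(enableHiveSupport = TRUE)

    # Queries then go through the session directly, with no explicit context.
    df <- sql("SHOW TABLES")
    head(df)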

> Remove usage of HiveContext in SparkR.
> --------------------------------------
>
>                 Key: SPARK-15159
>                 URL: https://issues.apache.org/jira/browse/SPARK-15159
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SparkR
>    Affects Versions: 1.6.1
>            Reporter: Sun Rui
>
> HiveContext is to be deprecated in 2.0.  Replace them with 
> SparkSession.withHiveSupport in SparkR



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
