[jira] [Commented] (SPARK-6603) SQLContext.registerFunction -> SQLContext.udf.register

2015-03-30 Thread Reynold Xin (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-6603?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14387105#comment-14387105
 ] 

Reynold Xin commented on SPARK-6603:


How about not deprecating registerFunction and having both? That way Scala users 
migrating to Python aren't as confused.
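Keeping both entry points amounts to making registerFunction a thin alias that delegates to udf.register. A minimal plain-Python sketch of that delegation pattern (the class and attribute names mirror the Spark API for illustration, but these are stand-ins, not the actual Spark implementation):

```python
class UDFRegistration:
    """Stand-in for the udf registration namespace (hypothetical)."""

    def __init__(self):
        self._udfs = {}

    def register(self, name, f):
        # Canonical registration path: store the function under its SQL name.
        self._udfs[name] = f
        return f


class SQLContext:
    """Stand-in SQLContext exposing both entry points (hypothetical)."""

    def __init__(self):
        self.udf = UDFRegistration()

    def registerFunction(self, name, f):
        # Kept as a non-deprecated alias so existing callers keep working;
        # it simply forwards to the canonical udf.register.
        return self.udf.register(name, f)


ctx = SQLContext()

# Both spellings register through the same code path.
ctx.registerFunction("twice", lambda x: x * 2)
ctx.udf.register("plus_one", lambda x: x + 1)
```

The design point is that the alias carries no logic of its own, so the two names can never drift apart in behavior.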


 SQLContext.registerFunction -> SQLContext.udf.register
 --

 Key: SPARK-6603
 URL: https://issues.apache.org/jira/browse/SPARK-6603
 Project: Spark
  Issue Type: Improvement
  Components: SQL
Reporter: Reynold Xin
Assignee: Davies Liu

 We didn't change the Python implementation to use that. Maybe the best 
 strategy is to deprecate SQLContext.registerFunction and just add 
 SQLContext.udf.register.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-6603) SQLContext.registerFunction -> SQLContext.udf.register

2015-03-30 Thread Apache Spark (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-6603?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14387380#comment-14387380
 ] 

Apache Spark commented on SPARK-6603:
-

User 'davies' has created a pull request for this issue:
https://github.com/apache/spark/pull/5273
