[
https://issues.apache.org/jira/browse/SPARK-14698?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
poseidon updated SPARK-14698:
-
Description:
Built Spark 1.6.1 and ran it against Hive 1.2.1, with MySQL configured as
the metastore server.
Started a Thrift server, then in Beeline tried to CREATE FUNCTION for a
Hive SQL UDF.
Found that the function could not be added to the MySQL metastore, although
invoking the function works fine.
If you try to add it again, the Thrift server throws an
AlreadyExistsException.
[SPARK-10151][SQL] Support invocation of hive macro added an if condition
in runSqlHive which executes CREATE FUNCTION through the Hive exec client,
and this caused the problem.
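For reference, a Beeline session of the kind described above might look like the following sketch. The JAR path, function name, and class name are hypothetical placeholders, not taken from the report:

```sql
-- Register a permanent Hive UDF from Beeline; the jar path and class
-- name below are illustrative examples only.
CREATE FUNCTION my_upper AS 'com.example.udf.MyUpper'
  USING JAR 'hdfs:///tmp/my-udfs.jar';

-- Invoking the function in the same session works...
SELECT my_upper('hello');

-- ...but per the report the function is never persisted to the MySQL
-- metastore, and re-running the same CREATE FUNCTION statement makes
-- the Thrift server throw an AlreadyExistsException.
CREATE FUNCTION my_upper AS 'com.example.udf.MyUpper'
  USING JAR 'hdfs:///tmp/my-udfs.jar';
```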
was:
Built Spark 1.6.1 and ran it against Hive 1.2.1, with MySQL configured as
the metastore server.
Started a Thrift server, then in Beeline tried to create a function.
Found that the function could not be added to the metastore, although the
function itself works fine.
If you try to add it again, the Thrift server throws an
AlreadyExistsException.
[SPARK-10151][SQL] Support invocation of hive macro added an if condition
in runSqlHive which executes CREATE FUNCTION through the Hive exec client,
and this caused the problem.
> CREATE FUNCTION could not add function to Hive metastore
>
>
> Key: SPARK-14698
> URL: https://issues.apache.org/jira/browse/SPARK-14698
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.6.1
> Environment: spark1.6.1
> Reporter: poseidon
> Labels: easyfix
>
> Built Spark 1.6.1 and ran it against Hive 1.2.1, with MySQL configured as
> the metastore server.
> Started a Thrift server, then in Beeline tried to CREATE FUNCTION for a
> Hive SQL UDF.
> Found that the function could not be added to the MySQL metastore, although
> invoking the function works fine.
> If you try to add it again, the Thrift server throws an
> AlreadyExistsException.
> [SPARK-10151][SQL] Support invocation of hive macro added an if condition
> in runSqlHive which executes CREATE FUNCTION through the Hive exec client,
> and this caused the problem.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org