Github user stanzhai commented on the issue:

    https://github.com/apache/spark/pull/18544
  
    @cloud-fan 
    
    Users' Hive UDFs are registered in the externalCatalog, so they do not exist in the functionRegistry.
    
    Currently, when an exception is encountered while loading a Hive UDF, a NoSuchFunctionException is thrown instead.
    
    But we should throw the original exception.
    
    So, I fix the issue by changing:
    
    ```
    if (functionRegistry.functionExists(funcName)) {
      throw error
    } else {
      ...
    }
    ```
    
    to:
    
    ```
    if (super.functionExists(name)) {
      throw error
    } else {
      ...
    }
    ```
    
    The following is the implementation of `super.functionExists`:
    
    ```
    def functionExists(name: FunctionIdentifier): Boolean = {
      val db = formatDatabaseName(name.database.getOrElse(getCurrentDatabase))
      requireDbExists(db)
      functionRegistry.functionExists(name) ||
        externalCatalog.functionExists(db, name.funcName)
    }
    ```
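
    To illustrate why the old check misses Hive UDFs, here is a minimal, self-contained Scala sketch. This is not Spark code; the two sets below are hypothetical stand-ins for the real catalogs, just to show the difference between the two checks:

    ```scala
    object FunctionLookupSketch {
      // Hypothetical stand-ins: functionRegistry holds built-in/temporary
      // functions; externalCatalog holds persistent (e.g. Hive) functions.
      val functionRegistry = Set("upper", "lower")
      val externalCatalog  = Set("my_hive_udf")

      // Old check: consults only the functionRegistry, so a Hive UDF that
      // failed to load looks like it does not exist at all, and the original
      // loading error is replaced by NoSuchFunctionException.
      def oldCheck(name: String): Boolean =
        functionRegistry.contains(name)

      // Mirrors super.functionExists: the function counts as existing if it
      // is in either catalog, so the original error is rethrown instead.
      def newCheck(name: String): Boolean =
        functionRegistry.contains(name) || externalCatalog.contains(name)

      def main(args: Array[String]): Unit = {
        println(oldCheck("my_hive_udf")) // false
        println(newCheck("my_hive_udf")) // true
      }
    }
    ```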

