[ https://issues.apache.org/jira/browse/SPARK-17172?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15431409#comment-15431409 ]
Andrew Davidson commented on SPARK-17172:
-----------------------------------------

Hi Sean,

I forgot about that older JIRA issue; I never resolved it. I am using
Jupyter, and I believe each notebook gets its own SparkContext. I googled
around and found some old issues suggesting that both a HiveContext and a
SQLContext were being created. I have not figured out how to either point
the HiveContext at a different metastore database or prevent the original
SparkContext from being created. (Two sketches follow the quoted
description below.)

> pyspark HiveContext cannot create UDF: Py4JJavaError: An error occurred
> while calling None.org.apache.spark.sql.hive.HiveContext.
> ------------------------------------------------------------------------
>
>                 Key: SPARK-17172
>                 URL: https://issues.apache.org/jira/browse/SPARK-17172
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 1.6.2
>         Environment: spark version: 1.6.2
>                      python version: 3.4.2 (v3.4.2:ab2c023a9432, Oct 5 2014, 20:42:22)
>                      [GCC 4.2.1 (Apple Inc. build 5666) (dot 3)]
>            Reporter: Andrew Davidson
>         Attachments: hiveUDFBug.html, hiveUDFBug.ipynb
>
> from pyspark.sql import HiveContext
> sqlContext = HiveContext(sc)
>
> # Define udf
> from pyspark.sql.functions import udf
> from pyspark.sql.types import StringType  # needed for the udf() return type
>
> # Map a numeric score to a letter-grade category.
> def scoreToCategory(score):
>     if score >= 80: return 'A'
>     elif score >= 60: return 'B'
>     elif score >= 35: return 'C'
>     else: return 'D'
>
> udfScoreToCategory = udf(scoreToCategory, StringType())
>
> throws exception
>
> Py4JJavaError: An error occurred while calling
> None.org.apache.spark.sql.hive.HiveContext.
> : java.lang.RuntimeException: java.lang.RuntimeException: Unable to
> instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
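One untested idea for the metastore question above (a sketch only; the
spark.hadoop.* passthrough into the Hive configuration and the /tmp path
are my assumptions, not something verified against the 1.6 docs): give
each notebook kernel its own embedded Derby metastore, configured before
any HiveContext is constructed.

    import os
    from pyspark import SparkConf, SparkContext
    from pyspark.sql import HiveContext

    # Unique Derby metastore location per kernel process, so two notebooks
    # never contend for the same ./metastore_db lock.
    metastore_dir = "/tmp/metastore_db_%d" % os.getpid()

    conf = (SparkConf()
            .setAppName("hiveUDFBug")
            .set("spark.hadoop.javax.jdo.option.ConnectionURL",
                 "jdbc:derby:;databaseName=%s;create=true" % metastore_dir))

    # If the notebook auto-created a SparkContext, stop it first with
    # sc.stop() so this configuration actually takes effect.
    sc = SparkContext(conf=conf)
    sqlContext = HiveContext(sc)

Per-process Derby directories only stop kernels from locking each other
out; a shared external metastore (MySQL, Postgres) would be the sturdier
fix for many concurrent notebooks.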
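For reference, once the HiveContext constructs successfully, the UDF from
the quoted repro can be exercised like this (the toy DataFrame and its
column names are made up for illustration):

    # Toy rows chosen to hit each branch of scoreToCategory.
    df = sqlContext.createDataFrame(
        [("alice", 85), ("bob", 61), ("carol", 40), ("dave", 10)],
        ["name", "score"])

    # Apply the Python UDF column-wise; expect categories A, B, C, D.
    df.withColumn("category", udfScoreToCategory(df["score"])).show()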