[ https://issues.apache.org/jira/browse/SPARK-15270?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Davies Liu resolved SPARK-15270.
--------------------------------
    Resolution: Fixed
    Fix Version/s: 2.0.0

Issue resolved by pull request 13056
[https://github.com/apache/spark/pull/13056]

> Creating HiveContext does not work
> ----------------------------------
>
>                 Key: SPARK-15270
>                 URL: https://issues.apache.org/jira/browse/SPARK-15270
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 2.0.0
>            Reporter: Piotr Milanowski
>            Priority: Blocker
>             Fix For: 2.0.0
>
>
> Built Spark (commit c6d23b6604e85bcddbd1fb6a2c1c3edbfd2be2c1, branch-2.0) with the command:
> ./dev/make-distribution.sh -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver -Dhadoop.version=2.6.0 -DskipTests
> Launched a master and a slave, then launched ./bin/pyspark.
> Creating a Hive context fails:
> {code}
> from pyspark.sql import HiveContext
> hc = HiveContext(sc)
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
>   File "spark-2.0/python/pyspark/sql/context.py", line 458, in __init__
>     sparkSession = SparkSession.withHiveSupport(sparkContext)
>   File "spark-2.0/python/pyspark/sql/session.py", line 192, in withHiveSupport
>     jsparkSession = sparkContext._jvm.SparkSession.withHiveSupport(sparkContext._jsc.sc())
>   File "spark-2.0/python/lib/py4j-0.9.2-src.zip/py4j/java_gateway.py", line 1048, in __getattr__
> py4j.protocol.Py4JError: org.apache.spark.sql.SparkSession.withHiveSupport does not exist in the JVM
> {code}
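For reference, a minimal sketch of the Spark 2.0-style entry point that supersedes the removed {{SparkSession.withHiveSupport}} JVM helper. It assumes a Spark 2.x build with the Hive classes on the classpath; the app name is a hypothetical placeholder, and {{enableHiveSupport()}} is the public builder option for requesting a Hive-backed catalog.

{code}
# Minimal sketch (assumes a Spark 2.x build with Hive support on the classpath).
# Instead of the removed SparkSession.withHiveSupport JVM helper, request Hive
# support through the public SparkSession builder.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("hive-support-check")   # hypothetical app name
         .enableHiveSupport()             # ask for a Hive-backed catalog
         .getOrCreate())

# Catalog operations go through the session rather than a HiveContext.
spark.sql("SHOW TABLES").show()

# The deprecated HiveContext wrapper is still constructible from a SparkContext
# and, on a build that includes the fix, no longer hits the Py4JError above.
from pyspark.sql import HiveContext
hc = HiveContext(spark.sparkContext)
{code}

Since the issue is marked fixed for 2.0.0, the reporter's original {{HiveContext(sc)}} snippet should also work again on a build that includes pull request 13056.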