Ryan Blue created SPARK-19138:
---------------------------------

             Summary: Python: new HiveContext will use a stopped SparkContext
                 Key: SPARK-19138
                 URL: https://issues.apache.org/jira/browse/SPARK-19138
             Project: Spark
          Issue Type: Bug
          Components: PySpark
            Reporter: Ryan Blue


We have users who run a notebook cell that creates a new SparkContext to 
override some of the default initial parameters:

{code:lang=python}
from pyspark import SparkConf, SparkContext
from pyspark.sql import HiveContext

if 'sc' in globals():
    # Stop the running SparkContext if there is one.
    sc.stop()

conf = SparkConf().setAppName("app")
# conf.set('spark.sql.shuffle.partitions', '2000')
sc = SparkContext(conf=conf)
sqlContext = HiveContext(sc)
{code}

In Spark 2.0, this creates an invalid SQLContext that still uses the original 
SparkContext, because the [HiveContext 
constructor|https://github.com/apache/spark/blob/master/python/pyspark/sql/context.py#L514]
 calls SparkSession.getOrCreate, which returns the cached session holding the 
old SparkContext. A SparkSession should be invalidated and no longer returned 
by getOrCreate once its SparkContext has been stopped.
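A minimal, self-contained sketch of the proposed invalidation behavior, using plain-Python stand-ins rather than PySpark's actual classes (the names {{FakeContext}}, {{FakeSession}}, and {{get_or_create}} are hypothetical, for illustration only):

{code:lang=python}
class FakeContext:
    """Stand-in for SparkContext: only tracks whether it was stopped."""
    def __init__(self):
        self._stopped = False

    def stop(self):
        self._stopped = True


class FakeSession:
    """Stand-in for SparkSession with a cached singleton, as getOrCreate uses."""
    _instantiated = None  # cached session, mirroring the getOrCreate cache

    def __init__(self, ctx):
        self._ctx = ctx

    @classmethod
    def get_or_create(cls, ctx):
        cached = cls._instantiated
        # Proposed fix: discard the cached session if its context was stopped,
        # instead of handing back a session bound to a dead context.
        if cached is None or cached._ctx._stopped:
            cls._instantiated = cls(ctx)
        return cls._instantiated


old_ctx = FakeContext()
session1 = FakeSession.get_or_create(old_ctx)
old_ctx.stop()

new_ctx = FakeContext()
session2 = FakeSession.get_or_create(new_ctx)
# With the stopped-context check, session2 is a fresh session bound to
# new_ctx; without it, the cache would return session1 with the dead context.
assert session2._ctx is new_ctx
{code}

The same check would let the notebook pattern above work: after {{sc.stop()}} and a new {{SparkContext}}, the {{HiveContext}} would get a session bound to the new context rather than the stopped one.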



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
