Repository: spark
Updated Branches:
  refs/heads/branch-2.0 8629537cc -> f7158c482


[MINOR] [PYSPARK] [EXAMPLES] Changed examples to use SparkSession.sparkContext instead of _sc

## What changes were proposed in this pull request?

Some PySpark examples need a SparkContext and obtain it by accessing the private
`_sc` attribute directly on the session. These examples should use the public
`sparkContext` property provided by `SparkSession` instead.
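
The rationale can be sketched with a hypothetical stand-in class (illustration only, not the real `pyspark.sql.SparkSession` implementation): a leading-underscore attribute like `_sc` is an implementation detail that is free to change, while a public property such as `sparkContext` is the stable, documented accessor.

```python
class SessionLike:
    """Hypothetical stand-in for SparkSession, for illustration only."""

    def __init__(self, sc):
        self._sc = sc  # private attribute: an implementation detail

    @property
    def sparkContext(self):
        """Public, documented accessor for the underlying context."""
        return self._sc


session = SessionLike(sc="a-spark-context")
# Both expressions return the same object today, but only the
# property is part of the public API:
print(session.sparkContext)  # preferred
print(session._sc)           # fragile: relies on a private name
```

In the real examples below, the change is the one-line swap from `spark._sc` to `spark.sparkContext`.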

## How was this patch tested?
Ran the modified examples.

Author: Bryan Cutler <cutl...@gmail.com>

Closes #13303 from BryanCutler/pyspark-session-sparkContext-MINOR.

(cherry picked from commit 9c297df3d4d5fa4bbfdffdaad15f362586db384b)
Signed-off-by: Davies Liu <davies....@gmail.com>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/f7158c48
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/f7158c48
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/f7158c48

Branch: refs/heads/branch-2.0
Commit: f7158c4828320e18aedd4369832da16c759b90fb
Parents: 8629537
Author: Bryan Cutler <cutl...@gmail.com>
Authored: Wed May 25 14:29:14 2016 -0700
Committer: Davies Liu <davies....@gmail.com>
Committed: Wed May 25 14:29:23 2016 -0700

----------------------------------------------------------------------
 examples/src/main/python/als.py                 | 2 +-
 examples/src/main/python/avro_inputformat.py    | 2 +-
 examples/src/main/python/parquet_inputformat.py | 2 +-
 examples/src/main/python/pi.py                  | 2 +-
 examples/src/main/python/transitive_closure.py  | 2 +-
 5 files changed, 5 insertions(+), 5 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/f7158c48/examples/src/main/python/als.py
----------------------------------------------------------------------
diff --git a/examples/src/main/python/als.py b/examples/src/main/python/als.py
index 81562e2..80290e7 100755
--- a/examples/src/main/python/als.py
+++ b/examples/src/main/python/als.py
@@ -67,7 +67,7 @@ if __name__ == "__main__":
         .appName("PythonALS")\
         .getOrCreate()
 
-    sc = spark._sc
+    sc = spark.sparkContext
 
     M = int(sys.argv[1]) if len(sys.argv) > 1 else 100
     U = int(sys.argv[2]) if len(sys.argv) > 2 else 500

http://git-wip-us.apache.org/repos/asf/spark/blob/f7158c48/examples/src/main/python/avro_inputformat.py
----------------------------------------------------------------------
diff --git a/examples/src/main/python/avro_inputformat.py b/examples/src/main/python/avro_inputformat.py
index 3f65e8f..4422f9e 100644
--- a/examples/src/main/python/avro_inputformat.py
+++ b/examples/src/main/python/avro_inputformat.py
@@ -70,7 +70,7 @@ if __name__ == "__main__":
         .appName("AvroKeyInputFormat")\
         .getOrCreate()
 
-    sc = spark._sc
+    sc = spark.sparkContext
 
     conf = None
     if len(sys.argv) == 3:

http://git-wip-us.apache.org/repos/asf/spark/blob/f7158c48/examples/src/main/python/parquet_inputformat.py
----------------------------------------------------------------------
diff --git a/examples/src/main/python/parquet_inputformat.py b/examples/src/main/python/parquet_inputformat.py
index 2f09f4d..29a1ac2 100644
--- a/examples/src/main/python/parquet_inputformat.py
+++ b/examples/src/main/python/parquet_inputformat.py
@@ -53,7 +53,7 @@ if __name__ == "__main__":
         .appName("ParquetInputFormat")\
         .getOrCreate()
 
-    sc = spark._sc
+    sc = spark.sparkContext
 
     parquet_rdd = sc.newAPIHadoopFile(
         path,

http://git-wip-us.apache.org/repos/asf/spark/blob/f7158c48/examples/src/main/python/pi.py
----------------------------------------------------------------------
diff --git a/examples/src/main/python/pi.py b/examples/src/main/python/pi.py
index 5db03e4..b39d710 100755
--- a/examples/src/main/python/pi.py
+++ b/examples/src/main/python/pi.py
@@ -32,7 +32,7 @@ if __name__ == "__main__":
         .appName("PythonPi")\
         .getOrCreate()
 
-    sc = spark._sc
+    sc = spark.sparkContext
 
     partitions = int(sys.argv[1]) if len(sys.argv) > 1 else 2
     n = 100000 * partitions

http://git-wip-us.apache.org/repos/asf/spark/blob/f7158c48/examples/src/main/python/transitive_closure.py
----------------------------------------------------------------------
diff --git a/examples/src/main/python/transitive_closure.py b/examples/src/main/python/transitive_closure.py
index 37c41dc..d88ea94 100755
--- a/examples/src/main/python/transitive_closure.py
+++ b/examples/src/main/python/transitive_closure.py
@@ -46,7 +46,7 @@ if __name__ == "__main__":
         .appName("PythonTransitiveClosure")\
         .getOrCreate()
 
-    sc = spark._sc
+    sc = spark.sparkContext
 
     partitions = int(sys.argv[1]) if len(sys.argv) > 1 else 2
     tc = sc.parallelize(generateGraph(), partitions).cache()
