mzhang-code commented on a change in pull request #28957:
URL: https://github.com/apache/spark/pull/28957#discussion_r574274413



##########
File path: docs/rdd-programming-guide.md
##########
@@ -276,7 +276,7 @@ $ PYSPARK_DRIVER_PYTHON=jupyter PYSPARK_DRIVER_PYTHON_OPTS=notebook ./bin/pyspark
 
 You can customize the `ipython` or `jupyter` commands by setting `PYSPARK_DRIVER_PYTHON_OPTS`.
 
-After the Jupyter Notebook server is launched, you can create a new "Python 2" notebook from
+After the Jupyter Notebook server is launched, you can create a new notebook from
 the "Files" tab. Inside the notebook, you can input the command `%pylab inline` as part of
 your notebook before you start to try Spark from the Jupyter notebook.

Review comment:
       It's been a long time since I last used pyspark; I'm glad this is fixed in Spark 3.0. Thanks.
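   For reference, the workflow the doc describes can be sketched as the following shell session (a sketch only; it assumes you are in the root of a Spark distribution where `bin/pyspark` exists):

   ```shell
   # Tell bin/pyspark to use Jupyter as the driver front end instead of
   # the plain Python REPL. Both variables are read at startup.
   export PYSPARK_DRIVER_PYTHON=jupyter
   export PYSPARK_DRIVER_PYTHON_OPTS=notebook

   # Launches a Jupyter Notebook server. From its "Files" tab, create a
   # new notebook and run `%pylab inline` in a cell before using Spark.
   ./bin/pyspark
   ```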




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
