[ https://issues.apache.org/jira/browse/SPARK-1458?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Nicholas Chammas resolved SPARK-1458.
-------------------------------------
    Resolution: Fixed
    Fix Version/s: 1.0.0

This issue appears to be resolved in [this merge|https://github.com/apache/spark/pull/204/files#diff-364713d7776956cb8b0a771e9b62f82dR779].

> Add programmatic way to determine Spark version
> -----------------------------------------------
>
>                 Key: SPARK-1458
>                 URL: https://issues.apache.org/jira/browse/SPARK-1458
>             Project: Spark
>          Issue Type: New Feature
>          Components: PySpark, Spark Core
>    Affects Versions: 0.9.0
>            Reporter: Nicholas Chammas
>            Priority: Minor
>             Fix For: 1.0.0
>
> As discussed [here|http://apache-spark-user-list.1001560.n3.nabble.com/programmatic-way-to-tell-Spark-version-td1929.html], I think it would be nice if there was a way to programmatically determine what version of Spark you are running.
> The potential use cases are not that important, but they include:
> # Branching your code based on what version of Spark is running.
> # Checking your version without having to quit and restart the Spark shell.
> Right now in PySpark, I believe the only way to determine your version is by firing up the Spark shell and looking at the startup banner.

--
This message was sent by Atlassian JIRA
(v6.2#6252)
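A minimal sketch of use case 1 above (branching code on the running Spark version). This assumes the version is available as a plain string such as "1.0.0"; the `spark_version` value below is a hard-coded placeholder, not an actual call into Spark, and `parse_version` is a hypothetical helper, not part of any Spark API.

```python
def parse_version(version):
    """Turn a version string like '1.0.0' into a comparable tuple, e.g. (1, 0, 0)."""
    return tuple(int(part) for part in version.split("."))

# Placeholder for a version string obtained programmatically from Spark.
spark_version = "1.0.0"

# Branch on the version: tuple comparison handles e.g. "0.10.0" > "0.9.0"
# correctly, where naive string comparison would not.
if parse_version(spark_version) >= (1, 0, 0):
    feature_path = "new"
else:
    feature_path = "legacy"
```

Comparing parsed tuples rather than raw strings avoids the classic pitfall where "0.10.0" sorts before "0.9.0" lexicographically.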