Author: srowen
Date: Mon Mar 16 13:49:54 2015
New Revision: 1666996

URL: http://svn.apache.org/r1666996
Log:
Removed extra word 'extended'

Modified:
    spark/site/releases/spark-release-1-3-0.html

Modified: spark/site/releases/spark-release-1-3-0.html
URL: http://svn.apache.org/viewvc/spark/site/releases/spark-release-1-3-0.html?rev=1666996&r1=1666995&r2=1666996&view=diff
==============================================================================
--- spark/site/releases/spark-release-1-3-0.html (original)
+++ spark/site/releases/spark-release-1-3-0.html Mon Mar 16 13:49:54 2015
@@ -195,7 +195,7 @@
 <h2 id="upgrading-to-spark-13">Upgrading to Spark 1.3</h2>
 <p>Spark 1.3 is binary compatible with Spark 1.X releases, so no code changes are necessary. This excludes API’s marked explicitly as unstable.</p>
 
-<p>As part of stabilizing the Spark SQL API, the <code>SchemaRDD</code> class has been extended renamed to <code>DataFrame</code>. Spark SQL&#8217;s <a href="http://spark.apache.org/docs/1.3.0/sql-programming-guide.html#migration-guide">migration guide</a> describes the upgrade process in detail. Spark SQL also now requires that column identifiers which use reserved words (such as &#8220;string&#8221; or &#8220;table&#8221;) be escaped using backticks.</p>
+<p>As part of stabilizing the Spark SQL API, the <code>SchemaRDD</code> class has been renamed to <code>DataFrame</code>. Spark SQL&#8217;s <a href="http://spark.apache.org/docs/1.3.0/sql-programming-guide.html#migration-guide">migration guide</a> describes the upgrade process in detail. Spark SQL also now requires that column identifiers which use reserved words (such as &#8220;string&#8221; or &#8220;table&#8221;) be escaped using backticks.</p>
 
 <h3 id="known-issues">Known Issues</h3>
 <p>This release has few known issues which will be addressed in Spark 1.3.1:</p>
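
For readers upgrading, here is a minimal sketch of the backtick rule described in the paragraph above, against Spark 1.3's Scala API. The table name "people", the reserved-word column names, and the sample rows are all invented for illustration; this is not code from the release itself.

    // Demonstrates escaping reserved-word column identifiers with backticks,
    // as Spark SQL 1.3 now requires. All names here are hypothetical.
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    object BacktickExample {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setAppName("backticks").setMaster("local[*]"))
        val sqlContext = new SQLContext(sc)
        import sqlContext.implicits._

        // Column names deliberately collide with SQL reserved words.
        val df = sc.parallelize(Seq(("users", "utf8"))).toDF("table", "string")
        df.registerTempTable("people")

        // Reserved-word identifiers must be wrapped in backticks;
        // SELECT table, string FROM people would fail to parse.
        sqlContext.sql("SELECT `table`, `string` FROM people").show()

        sc.stop()
      }
    }

Note that df above is a DataFrame, the class formerly named SchemaRDD as the corrected paragraph in the diff describes.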



