Repository: spark
Updated Branches:
  refs/heads/branch-2.1 6b2301b89 -> 820847008


[MINOR][DOC] Fix typos in the 'configuration', 'monitoring' and 
'sql-programming-guide' documentation

## What changes were proposed in this pull request?

Fix typos in the 'configuration', 'monitoring' and 'sql-programming-guide' 
documentation.

## How was this patch tested?
Manually.

Author: Weiqing Yang <yangweiqing...@gmail.com>

Closes #15886 from weiqingy/fixTypo.

(cherry picked from commit 241e04bc03efb1379622c0c84299e617512973ac)
Signed-off-by: Sean Owen <so...@cloudera.com>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/82084700
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/82084700
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/82084700

Branch: refs/heads/branch-2.1
Commit: 8208470084153f0be6818f66309f63dcdcb16519
Parents: 6b2301b
Author: Weiqing Yang <yangweiqing...@gmail.com>
Authored: Wed Nov 16 10:34:56 2016 +0000
Committer: Sean Owen <so...@cloudera.com>
Committed: Wed Nov 16 10:35:05 2016 +0000

----------------------------------------------------------------------
 docs/configuration.md         | 2 +-
 docs/monitoring.md            | 2 +-
 docs/sql-programming-guide.md | 6 +++---
 3 files changed, 5 insertions(+), 5 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/82084700/docs/configuration.md
----------------------------------------------------------------------
diff --git a/docs/configuration.md b/docs/configuration.md
index d0acd94..e0c6613 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -1916,7 +1916,7 @@ showDF(properties, numRows = 200, truncate = FALSE)
   <td><code>spark.r.heartBeatInterval</code></td>
   <td>100</td>
   <td>
-    Interval for heartbeats sents from SparkR backend to R process to prevent connection timeout.
+    Interval for heartbeats sent from SparkR backend to R process to prevent connection timeout.
   </td>
 </tr>
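For context, the option touched above can be set like any other Spark property, e.g. in `spark-defaults.conf` (a minimal sketch; the value shown is just the default from the table above):

```
# SparkR backend-to-R heartbeat interval, as documented in configuration.md
spark.r.heartBeatInterval  100
```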
 

http://git-wip-us.apache.org/repos/asf/spark/blob/82084700/docs/monitoring.md
----------------------------------------------------------------------
diff --git a/docs/monitoring.md b/docs/monitoring.md
index 5bc5e18..2eef456 100644
--- a/docs/monitoring.md
+++ b/docs/monitoring.md
@@ -41,7 +41,7 @@ directory must be supplied in the `spark.history.fs.logDirectory` configuration
 and should contain sub-directories that each represents an application's event logs.
 
 The spark jobs themselves must be configured to log events, and to log them to the same shared,
-writeable directory. For example, if the server was configured with a log directory of
+writable directory. For example, if the server was configured with a log directory of
 `hdfs://namenode/shared/spark-logs`, then the client-side options would be:
 
 ```
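For reference, the client-side options the passage refers to are typically set in `spark-defaults.conf`; a minimal sketch, assuming the example log directory above:

```
spark.eventLog.enabled  true
spark.eventLog.dir      hdfs://namenode/shared/spark-logs
```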

http://git-wip-us.apache.org/repos/asf/spark/blob/82084700/docs/sql-programming-guide.md
----------------------------------------------------------------------
diff --git a/docs/sql-programming-guide.md b/docs/sql-programming-guide.md
index b9be7a7..ba3e55f 100644
--- a/docs/sql-programming-guide.md
+++ b/docs/sql-programming-guide.md
@@ -222,9 +222,9 @@ The `sql` function enables applications to run SQL queries programmatically and
 
 ## Global Temporary View
 
-Temporay views in Spark SQL are session-scoped and will disappear if the session that creates it
+Temporary views in Spark SQL are session-scoped and will disappear if the session that creates it
 terminates. If you want to have a temporary view that is shared among all sessions and keep alive
-until the Spark application terminiates, you can create a global temporary view. Global temporary
+until the Spark application terminates, you can create a global temporary view. Global temporary
 view is tied to a system preserved database `global_temp`, and we must use the qualified name to
 refer it, e.g. `SELECT * FROM global_temp.view1`.
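A minimal Spark SQL sketch of the behavior described above (the source table name `people` is hypothetical):

```sql
-- Create a global temporary view; it is registered under the
-- system preserved database `global_temp`.
CREATE GLOBAL TEMPORARY VIEW temp_view AS SELECT * FROM people;

-- Any session of the same Spark application can query it by its
-- qualified name.
SELECT * FROM global_temp.temp_view;
```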
 
@@ -1029,7 +1029,7 @@ following command:
 bin/spark-shell --driver-class-path postgresql-9.4.1207.jar --jars postgresql-9.4.1207.jar
 {% endhighlight %}
 
-Tables from the remote database can be loaded as a DataFrame or Spark SQL Temporary table using
+Tables from the remote database can be loaded as a DataFrame or Spark SQL temporary view using
 the Data Sources API. Users can specify the JDBC connection properties in the data source options.
 <code>user</code> and <code>password</code> are normally provided as connection properties for
 logging into the data sources. In addition to the connection properties, Spark also supports
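A sketch of loading a remote table as a Spark SQL temporary view via the Data Sources API, in the style of the guide's examples (url, dbtable, user, and password are placeholders):

```sql
CREATE TEMPORARY VIEW jdbc_table
USING org.apache.spark.sql.jdbc
OPTIONS (
  url "jdbc:postgresql://dbserver/dbname",
  dbtable "schema.tablename",
  user 'username',
  password 'password'
);
```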

