Repository: spark
Updated Branches:
  refs/heads/master 081ac69f3 -> 6fa4ac1b0


[Branch-1.3] [DOC] doc fix for date

Trivial fix.

Author: Daoyuan Wang <daoyuan.w...@intel.com>

Closes #4400 from adrian-wang/docdate and squashes the following commits:

31bbe40 [Daoyuan Wang] doc fix for date


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/6fa4ac1b
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/6fa4ac1b
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/6fa4ac1b

Branch: refs/heads/master
Commit: 6fa4ac1b007a545201d82603f09b0573f529a4e6
Parents: 081ac69
Author: Daoyuan Wang <daoyuan.w...@intel.com>
Authored: Thu Feb 5 12:42:27 2015 -0800
Committer: Reynold Xin <r...@databricks.com>
Committed: Thu Feb 5 12:42:27 2015 -0800

----------------------------------------------------------------------
 docs/sql-programming-guide.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/6fa4ac1b/docs/sql-programming-guide.md
----------------------------------------------------------------------
diff --git a/docs/sql-programming-guide.md b/docs/sql-programming-guide.md
index 350df9a..38f617d 100644
--- a/docs/sql-programming-guide.md
+++ b/docs/sql-programming-guide.md
@@ -1108,7 +1108,7 @@ in Hive deployments.
   have the same input format.
 * Non-equi outer join: For the uncommon use case of using outer joins with non-equi join conditions
   (e.g. condition "`key < 10`"), Spark SQL will output wrong result for the `NULL` tuple.
-* `UNION` type and `DATE` type
+* `UNION` type
 * Unique join
 * Single query multi insert
 * Column statistics collecting: Spark SQL does not piggyback scans to collect column statistics at


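The line removed above drops `DATE` from the list of unsupported Hive features, i.e. the doc now reflects that Spark SQL handles the Hive `DATE` type. As a hedged illustration only (not part of this commit), a `DATE` value can be exercised through a Hive-enabled context with Spark 1.3-era Scala; `sc` below is assumed to be an existing SparkContext:

    // Minimal sketch, assuming a Hive-enabled build of Spark around 1.3.
    import org.apache.spark.sql.hive.HiveContext

    val sqlContext = new HiveContext(sc)  // `sc`: an existing SparkContext

    // Casting a string literal to the Hive DATE type returns a date value
    // rather than falling under the "unsupported Hive features" list.
    val df = sqlContext.sql("SELECT CAST('2015-02-05' AS DATE) AS d")
    df.collect().foreach(println)  // e.g. prints [2015-02-05]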