spark git commit: [SPARK-17242][DOCUMENT] Update links of external dstream projects

2016-08-25 Thread rxin
Repository: spark
Updated Branches:
  refs/heads/branch-2.0 73014a2aa -> 27ed6d5dc


[SPARK-17242][DOCUMENT] Update links of external dstream projects

## What changes were proposed in this pull request?

Updated links to external DStream projects.

## How was this patch tested?

Documentation changes only.

Author: Shixiong Zhu 

Closes #14814 from zsxwing/dstream-link.

(cherry picked from commit 341e0e778dff8c404b47d34ee7661b658bb91880)
Signed-off-by: Reynold Xin 


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/27ed6d5d
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/27ed6d5d
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/27ed6d5d

Branch: refs/heads/branch-2.0
Commit: 27ed6d5dcd521b4ff1ebe777b03a03ba103d6e76
Parents: 73014a2
Author: Shixiong Zhu 
Authored: Thu Aug 25 21:08:42 2016 -0700
Committer: Reynold Xin 
Committed: Thu Aug 25 21:08:48 2016 -0700

--
 docs/streaming-programming-guide.md | 8 ++--
 1 file changed, 2 insertions(+), 6 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/27ed6d5d/docs/streaming-programming-guide.md
--
diff --git a/docs/streaming-programming-guide.md b/docs/streaming-programming-guide.md
index 14e1744..b92ca92 100644
--- a/docs/streaming-programming-guide.md
+++ b/docs/streaming-programming-guide.md
@@ -656,7 +656,7 @@ methods for creating DStreams from files as input sources.
Python API 
`fileStream` is not available in the Python API, only  `textFileStream` is  
   available.
 
 - **Streams based on Custom Receivers:** DStreams can be created with data streams received through custom receivers. See the [Custom Receiver
-  Guide](streaming-custom-receivers.html) and [DStream Akka](https://github.com/spark-packages/dstream-akka) for more details.
+  Guide](streaming-custom-receivers.html) for more details.
 
 - **Queue of RDDs as a Stream:** For testing a Spark Streaming application with test data, one can also create a DStream based on a queue of RDDs, using `streamingContext.queueStream(queueOfRDDs)`. Each RDD pushed into the queue will be treated as a batch of data in the DStream, and processed like a stream.
 
@@ -2383,11 +2383,7 @@ additional effort may be necessary to achieve exactly-once semantics. There are
 - [Kafka Integration Guide](streaming-kafka-integration.html)
 - [Kinesis Integration Guide](streaming-kinesis-integration.html)
 - [Custom Receiver Guide](streaming-custom-receivers.html)
-* External DStream data sources:
-- [DStream MQTT](https://github.com/spark-packages/dstream-mqtt)
-- [DStream Twitter](https://github.com/spark-packages/dstream-twitter)
-- [DStream Akka](https://github.com/spark-packages/dstream-akka)
-- [DStream ZeroMQ](https://github.com/spark-packages/dstream-zeromq)
+* Third-party DStream data sources can be found in [Spark Packages](https://spark-packages.org/)
 * API documentation
   - Scala docs
 * [StreamingContext](api/scala/index.html#org.apache.spark.streaming.StreamingContext) and


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
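

The guide text retained by this patch describes `streamingContext.queueStream(queueOfRDDs)`: each RDD pushed into the queue is consumed as one micro-batch. As a rough illustration of those semantics only, here is a pure-Python toy that stands in for Spark; `fake_queue_stream`, the list-based "RDDs", and the per-batch word count are all invented for illustration and are not Spark APIs:

```python
from collections import deque, Counter

def fake_queue_stream(queue, transform):
    """Toy stand-in for streamingContext.queueStream: drain the queue one
    'RDD' (here: a plain list) per step, applying `transform` to each batch."""
    results = []
    while queue:
        batch = queue.popleft()  # one queued RDD == one micro-batch
        results.append(transform(batch))
    return results

# Three queued "RDDs" of words; each is processed as its own batch,
# mirroring how queueStream treats each queued RDD as a batch of data.
rdd_queue = deque([
    ["spark", "streaming"],
    ["spark"],
    ["dstream", "spark"],
])

batch_counts = fake_queue_stream(rdd_queue, lambda batch: Counter(batch))
print(batch_counts)
```

In real Spark the transformation would be declared on the DStream and driven by the batch interval; this sketch only shows the batching order, not Spark's scheduling or fault-tolerance behavior.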



spark git commit: [SPARK-17242][DOCUMENT] Update links of external dstream projects

2016-08-25 Thread rxin
Repository: spark
Updated Branches:
  refs/heads/master b964a172a -> 341e0e778


[SPARK-17242][DOCUMENT] Update links of external dstream projects

## What changes were proposed in this pull request?

Updated links to external DStream projects.

## How was this patch tested?

Documentation changes only.

Author: Shixiong Zhu 

Closes #14814 from zsxwing/dstream-link.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/341e0e77
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/341e0e77
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/341e0e77

Branch: refs/heads/master
Commit: 341e0e778dff8c404b47d34ee7661b658bb91880
Parents: b964a17
Author: Shixiong Zhu 
Authored: Thu Aug 25 21:08:42 2016 -0700
Committer: Reynold Xin 
Committed: Thu Aug 25 21:08:42 2016 -0700

--
 docs/streaming-programming-guide.md | 8 ++--
 1 file changed, 2 insertions(+), 6 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/341e0e77/docs/streaming-programming-guide.md
--
diff --git a/docs/streaming-programming-guide.md b/docs/streaming-programming-guide.md
index df94e95..82d3647 100644
--- a/docs/streaming-programming-guide.md
+++ b/docs/streaming-programming-guide.md
@@ -656,7 +656,7 @@ methods for creating DStreams from files as input sources.
Python API 
`fileStream` is not available in the Python API, only  `textFileStream` is  
   available.
 
 - **Streams based on Custom Receivers:** DStreams can be created with data streams received through custom receivers. See the [Custom Receiver
-  Guide](streaming-custom-receivers.html) and [DStream Akka](https://github.com/spark-packages/dstream-akka) for more details.
+  Guide](streaming-custom-receivers.html) for more details.
 
 - **Queue of RDDs as a Stream:** For testing a Spark Streaming application with test data, one can also create a DStream based on a queue of RDDs, using `streamingContext.queueStream(queueOfRDDs)`. Each RDD pushed into the queue will be treated as a batch of data in the DStream, and processed like a stream.
 
@@ -2383,11 +2383,7 @@ additional effort may be necessary to achieve exactly-once semantics. There are
 - [Kafka Integration Guide](streaming-kafka-integration.html)
 - [Kinesis Integration Guide](streaming-kinesis-integration.html)
 - [Custom Receiver Guide](streaming-custom-receivers.html)
-* External DStream data sources:
-- [DStream MQTT](https://github.com/spark-packages/dstream-mqtt)
-- [DStream Twitter](https://github.com/spark-packages/dstream-twitter)
-- [DStream Akka](https://github.com/spark-packages/dstream-akka)
-- [DStream ZeroMQ](https://github.com/spark-packages/dstream-zeromq)
+* Third-party DStream data sources can be found in [Spark Packages](https://spark-packages.org/)
 * API documentation
   - Scala docs
 * [StreamingContext](api/scala/index.html#org.apache.spark.streaming.StreamingContext) and

