Ah, good catch Aaron - my bad!

On Wed, Feb 12, 2014 at 2:50 PM, Aaron Davidson <ilike...@gmail.com> wrote:

> Be sure to link against the 0.9.0 version, though! Here are the updated
> docs:
>
> https://spark.incubator.apache.org/docs/0.9.0/streaming-programming-guide.html#basics
>
>
>
> On Wed, Feb 12, 2014 at 2:48 PM, Evan R. Sparks <evan.spa...@gmail.com> wrote:
>
>> You'll need to link against the spark-streaming artifact as well - see
>> the linking section here:
>>
>> https://spark.incubator.apache.org/docs/0.8.1/streaming-programming-guide.html
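[Editor's sketch] The linking section referenced above amounts to pulling the streaming artifact alongside spark-core. Under sbt that might look like the following (a sketch, using the 0.9.0-incubating version from the thread; adjust to whatever release you target):

```scala
// build.sbt sketch -- spark-streaming is a separate artifact from spark-core
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.10" % "0.9.0-incubating",
  "org.apache.spark" % "spark-streaming_2.10" % "0.9.0-incubating"
)
```

Without the second line, the `org.apache.spark.streaming` packages simply aren't on the classpath, which matches the symptom described below.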
>>
>>
>> On Wed, Feb 12, 2014 at 2:43 PM, Adrian Mocanu <amoc...@verticalscope.com> wrote:
>>
>>> I pull the following version of Spark via sbt:
>>>
>>>     "org.apache.spark" % "spark-core_2.10" % "0.9.0-incubating"
>>>
>>> Which, according to the green button at the bottom of this page
>>> http://spark.incubator.apache.org/streaming/, should include streaming.
>>>
>>>
>>> On this page:
>>> http://spark.incubator.apache.org/releases/spark-release-0-9-0.html
>>>
>>> It talks about improvements to Spark's streaming API:
>>>
>>> "A few aspects of the API have been improved:
>>>
>>>    - DStream and PairDStream classes have been moved from
>>>    org.apache.spark.streaming to org.apache.spark.streaming.dstream to
>>>    keep it consistent with org.apache.spark.rdd.RDD.
>>>    - DStream.foreach has been renamed to foreachRDD to make it explicit
>>>    that it works for every RDD, not every element"
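[Editor's sketch] The moves described in that release note would look like this in 0.9.0 code (a minimal sketch; the app name and socket host/port are placeholders, and it needs the spark-streaming jar on the classpath, so it won't compile against spark-core alone):

```scala
// Sketch against Spark 0.9.0: DStream now lives under streaming.dstream,
// while StreamingContext stays in org.apache.spark.streaming.
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.dstream.DStream

object StreamingSketch {
  def main(args: Array[String]): Unit = {
    val ssc = new StreamingContext("local[2]", "StreamingSketch", Seconds(1))
    val lines: DStream[String] = ssc.socketTextStream("localhost", 9999)
    // foreachRDD (the renamed DStream.foreach) runs the closure once per batch RDD
    lines.foreachRDD(rdd => println("batch count: " + rdd.count()))
    ssc.start()
    ssc.awaitTermination()
  }
}
```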
>>>
>>> *These packages don't exist in Spark anymore.* See my attachment. *Where
>>> did they go?* I also can't find
>>> *org.apache.spark.streaming.StreamingContext*
>>>
>>>
>>> -Adrian
>>>
>>>
>>>
>>
>>
>
