Thanks Sean, but I'm importing org.apache.spark.streaming.StreamingContext._

Here are the Spark imports:

import org.apache.spark.streaming._
import org.apache.spark.streaming.StreamingContext._
import org.apache.spark.streaming.kafka._
import org.apache.spark.SparkConf

....

    val stream = KafkaUtils.createStream(ssc, zkQuorum, group, topicpMap).map(_._2)
    stream.saveAsNewAPIHadoopFile(destination, classOf[Void], classOf[Group],
      classOf[ExampleOutputFormat], conf)

....

Anything else I might be missing?
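
Rereading the scaladoc, one thing I notice: the DStream save method appears
to be saveAsNewAPIHadoopFiles (plural), and it seems to be available only on
DStreams of key-value pairs via the PairDStreamFunctions implicits, while my
.map(_._2) leaves me with a plain DStream[String]. Here's a rough sketch of
the workaround I'm considering, dropping to the RDD level with foreachRDD
(toGroup is a hypothetical helper that converts each message String to a
parquet Group, and I believe the pair-RDD implicits also need
import org.apache.spark.SparkContext._ on this Spark version):

    // Sketch only: saveAsNewAPIHadoopFile is defined on pair RDDs, so
    // turn each message into a (Void, Group) pair before saving. Writing
    // to the same destination every batch would overwrite earlier output,
    // so the batch time goes into the path.
    stream.foreachRDD { (rdd, time) =>
      rdd.map(msg => (null.asInstanceOf[Void], toGroup(msg)))
        .saveAsNewAPIHadoopFile(destination + "/" + time.milliseconds,
          classOf[Void], classOf[Group], classOf[ExampleOutputFormat], conf)
    }

Does that look like the right direction?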



On Thu, Oct 9, 2014 at 1:14 PM, Sean Owen <so...@cloudera.com> wrote:

> I think you have not imported
> org.apache.spark.streaming.StreamingContext._? This gets you the
> implicits that provide these methods.
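>
> A minimal sketch of what the import enables (pairs stands in for any
> DStream[(Text, Text)]; the output path and suffix are placeholders):
>
>     import org.apache.hadoop.io.Text
>     import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat
>     import org.apache.spark.streaming.StreamingContext._
>
>     // StreamingContext._ brings in the implicit conversion from
>     // DStream[(K, V)] to PairDStreamFunctions, which is where the
>     // pair-only save methods like saveAsNewAPIHadoopFiles are defined.
>     pairs.saveAsNewAPIHadoopFiles("hdfs:///tmp/out", "txt",
>       classOf[Text], classOf[Text], classOf[TextOutputFormat[Text, Text]])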
>
> On Thu, Oct 9, 2014 at 8:40 PM, bdev <buntu...@gmail.com> wrote:
> > I'm using KafkaUtils.createStream for the input stream to pull messages
> > from Kafka, which seems to return a ReceiverInputDStream. I do not see
> > saveAsNewAPIHadoopFile available on ReceiverInputDStream and obviously
> > run into this error:
> >
> >  saveAsNewAPIHadoopFile is not a member of
> > org.apache.spark.streaming.dstream.DStream[String]
> >
> > Any help on how to go about saving a DStream to Hadoop would be
> > appreciated.
> >
> > Thanks!
