How to collect data for some particular point in spark streaming

2016-03-21 Thread Nagu Kothapalli
Hi,


I want to collect data from Kafka (JSON data, ordered) up to a particular
timestamp. Is there any way to do this with Spark Streaming?

Please let me know.
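One common approach is to filter each record by the timestamp embedded in its JSON payload, e.g. via `DStream.filter` on the stream returned by `KafkaUtils.createDirectStream`. Below is a minimal plain-Java sketch of just the per-record predicate (no Spark or JSON library assumed); the field name `ts` and the ISO-8601 format are assumptions — adjust them to the actual schema.

```java
import java.time.Instant;
import java.util.function.Predicate;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class TimestampFilter {
    // Hypothetical field name "ts"; a real job would use a JSON parser instead of a regex.
    private static final Pattern TS = Pattern.compile("\"ts\"\\s*:\\s*\"([^\"]+)\"");

    // Keep records whose timestamp is at or before the cutoff.
    static Predicate<String> upTo(Instant cutoff) {
        return json -> {
            Matcher m = TS.matcher(json);
            if (!m.find()) return false;            // drop records with no timestamp
            Instant ts = Instant.parse(m.group(1)); // expects ISO-8601, e.g. 2016-03-21T10:00:00Z
            return !ts.isAfter(cutoff);
        };
    }

    public static void main(String[] args) {
        Predicate<String> keep = upTo(Instant.parse("2016-03-21T12:00:00Z"));
        System.out.println(keep.test("{\"ts\":\"2016-03-21T10:00:00Z\",\"v\":1}")); // true
        System.out.println(keep.test("{\"ts\":\"2016-03-22T10:00:00Z\",\"v\":2}")); // false
    }
}
```

In a streaming job this predicate would be passed to `filter` on the lines DStream. Note that this keeps records by their payload timestamp, not by Kafka arrival order; if you need "everything up to time T and then stop", you would also have to decide when to terminate the stream yourself.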


java.lang.IllegalArgumentException: Unable to create serializer "com.esotericsoftware.kryo.serializers.FieldSerializer"

2016-03-14 Thread Nagu Kothapalli
Hi Team,

I am getting the below exception while running the Spark Java streaming job
with a custom receiver.

org.apache.spark.SparkException: Job aborted due to stage failure: Failed to serialize task 508, not attempting to retry it. Exception during serialization: java.io.IOException: java.lang.IllegalArgumentException: Unable to create serializer "com.esotericsoftware.kryo.serializers.FieldSerializer" for class: org.test.CustomeReciver
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1294)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1282)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1281)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
        at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1281)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697)
        at scala.Option.foreach(Option.scala:236)
        at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:697)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1507)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1469)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1458)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)

Please help if you have any idea.
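This exception means the task could not be serialized because it captured the receiver class (or one of its fields) that Kryo cannot handle. The usual fix is to keep non-serializable resources (sockets, threads, clients) out of the serialized state: mark them `transient` and create them in `onStart()` on the executor. Here is a plain-Java illustration of that pattern, using a `java.io.Serializable` round trip to stand in for Spark shipping the task; `ReceiverSketch` and its fields are hypothetical names, not Spark API.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class ReceiverSketch implements Serializable {
    // Threads and sockets are not serializable: mark transient, rebuild on start.
    private transient Thread worker;
    private final String host; // plain configuration state ships fine
    private final int port;

    ReceiverSketch(String host, int port) { this.host = host; this.port = port; }

    void onStart() {
        // Recreate the non-serializable resource on the executor side.
        worker = new Thread(() -> { /* connect to host:port and push records */ });
        worker.start();
    }

    String address() { return host + ":" + port; }

    // Round trip through Java serialization, standing in for task shipping.
    static ReceiverSketch roundTrip(ReceiverSketch r) {
        try {
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
                out.writeObject(r);
            }
            try (ObjectInputStream in = new ObjectInputStream(
                    new ByteArrayInputStream(bytes.toByteArray()))) {
                return (ReceiverSketch) in.readObject();
            }
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        ReceiverSketch copy = roundTrip(new ReceiverSketch("localhost", 9999));
        System.out.println(copy.address()); // prints localhost:9999
    }
}
```

If the receiver itself must travel with the task, another option is to fall back to Java serialization for that class or register it explicitly via `SparkConf.registerKryoClasses`, but removing the non-serializable fields is usually the cleaner fix.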


Re: Spark Streaming - Custom ReceiverInputDStream ( Custom Source) In java

2016-01-22 Thread Nagu Kothapalli
Hi

Does anyone have any idea on *ClassTag in Spark context?*

On Fri, Jan 22, 2016 at 12:42 PM, Nagu Kothapalli <nagukothapal...@gmail.com
> wrote:

> Hi All
>
> Facing an issue with a CustomInputDStream object in Java:
>
>
>
> public CustomInputDStream(StreamingContext ssc_, ClassTag classTag)
> {
>     super(ssc_, classTag);
> }
> Can you help me create an instance of the above class with *ClassTag* in
> Java?
>
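From Java you can obtain a ClassTag through the Scala companion object, e.g. `scala.reflect.ClassTag$.MODULE$.apply(MyRecord.class)`, and pass that to the `CustomInputDStream` constructor (here `MyRecord` is a placeholder for your element type). Conceptually a ClassTag is just a reified class token; the self-contained sketch below shows the same idea in plain Java, without assuming the Scala runtime on the classpath.

```java
// Plain-Java analog of Scala's ClassTag: carry a Class<T> token and use it
// for the reflection that erased generics cannot do on their own.
public class ClassTagSketch<T> {
    private final Class<T> runtimeClass;

    ClassTagSketch(Class<T> runtimeClass) { this.runtimeClass = runtimeClass; }

    // What a ClassTag enables in Scala: creating a T[] at runtime.
    @SuppressWarnings("unchecked")
    T[] newArray(int length) {
        return (T[]) java.lang.reflect.Array.newInstance(runtimeClass, length);
    }

    public static void main(String[] args) {
        ClassTagSketch<String> tag = new ClassTagSketch<>(String.class);
        String[] arr = tag.newArray(3);
        System.out.println(arr.length); // prints 3
    }
}
```

So the call in the real code would look roughly like `new CustomInputDStream(ssc, ClassTag$.MODULE$.apply(MyRecord.class))`, with `MyRecord` replaced by the stream's actual element class.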