I am trying out Spark Streaming by listening on a socket. I am using the
rawSocketStream method to create a receiver and a DStream, but when I print
the DStream I get the exception below.

*Code to create a DStream:*

JavaSparkContext jsc = new JavaSparkContext("Master", "app");
JavaStreamingContext jssc = new JavaStreamingContext(jsc, Durations.seconds(3));
JavaReceiverInputDStream rawStream = jssc.rawSocketStream("localhost", 9999);
log.info(tracePrefix + "Created the stream ...");
rawStream.print();
jssc.start();
jssc.awaitTermination();

*Code to send a protobuf object over a TCP connection:*

FileInputStream input = new FileInputStream("address_book");
AddressBook book = AddressBookProtos.AddressBook.parseFrom(input);
log.info(tracePrefix + "Size of contacts: " + book.getPersonList().size());
ServerSocket serverSocket = new ServerSocket(9999);
log.info(tracePrefix + "Waiting for connections ...");
Socket s1 = serverSocket.accept();
log.info(tracePrefix + "Accepted a connection ...");
while (true) {
    Thread.sleep(3000);
    ObjectOutputStream out = new ObjectOutputStream(s1.getOutputStream());
    out.writeByte(book.getSerializedSize());
    out.write(book.toByteArray());
    out.flush();
    log.info(tracePrefix + "Written to new socket");
}

*Stacktrace is shown below:*

java.lang.IllegalArgumentException
	at
java.nio.ByteBuffer.allocate(ByteBuffer.java:334)
	at org.apache.spark.streaming.dstream.RawNetworkReceiver.onStart(RawInputDStream.scala:88)
	at org.apache.spark.streaming.receiver.ReceiverSupervisor.startReceiver(ReceiverSupervisor.scala:148)
	at org.apache.spark.streaming.receiver.ReceiverSupervisor.start(ReceiverSupervisor.scala:130)
	at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverTrackerEndpoint$$anonfun$9.apply(ReceiverTracker.scala:575)
	at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverTrackerEndpoint$$anonfun$9.apply(ReceiverTracker.scala:565)
	at org.apache.spark.SparkContext$$anonfun$37.apply(SparkContext.scala:1992)
	at org.apache.spark.SparkContext$$anonfun$37.apply(SparkContext.scala:1992)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
	at org.apache.spark.scheduler.Task.run(Task.scala:89)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

2016-04-02 07:45:47,607 ERROR [Executor task launch worker-0] org.apache.spark.streaming.receiver.ReceiverSupervisorImpl Stopped receiver with error: java.lang.IllegalArgumentException
2016-04-02 07:45:47,613 ERROR [Executor task launch worker-0] org.apache.spark.executor.Executor Exception in task 0.0 in stage 0.0 (TID 0)
java.lang.IllegalArgumentException
	at java.nio.ByteBuffer.allocate(ByteBuffer.java:334)
	[same stack trace as above]
2016-04-02 07:45:47,646 ERROR [task-result-getter-0] org.apache.spark.scheduler.TaskSetManager Task 0 in stage 0.0 failed 1 times; aborting job
2016-04-02 07:45:47,656 ERROR [submit-job-thread-pool-0] org.apache.spark.streaming.scheduler.ReceiverTracker Receiver has been stopped. Try to restart it.
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.lang.IllegalArgumentException
	at java.nio.ByteBuffer.allocate(ByteBuffer.java:334)
	[same stack trace as above]
Driver stacktrace:
	at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1431)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1419)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1418)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1418)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
	at scala.Option.foreach(Option.scala:236)
	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:799)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1640)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1599)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1588)
	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
Caused by: java.lang.IllegalArgumentException
	at java.nio.ByteBuffer.allocate(ByteBuffer.java:334)
	[same stack trace as above]
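For what it's worth, my guess is that the failure in ByteBuffer.allocate means the receiver reads a length prefix from the socket and gets a garbage (negative) value: an ObjectOutputStream writes a serialization stream header on construction, and writeByte emits only a single byte, neither of which looks like the 4-byte big-endian length that a length-prefixed receiver would expect before each record. Below is a minimal sketch of length-prefixed framing using a plain DataOutputStream; the frame helper and the sample payload are illustrative, not from my actual code, and in the real sender the DataOutputStream would wrap s1.getOutputStream():

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class LengthPrefixedFraming {
    // Frames a payload as a 4-byte big-endian length followed by the raw
    // bytes. writeInt produces the big-endian prefix; the payload would be
    // book.toByteArray() in the real sender.
    public static byte[] frame(byte[] payload) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(buf);
        out.writeInt(payload.length); // 4-byte big-endian length prefix
        out.write(payload);           // raw protobuf bytes, unmodified
        out.flush();
        return buf.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] framed = frame(new byte[] {1, 2, 3});
        // 4 length bytes (0, 0, 0, 3) followed by the 3 payload bytes
        System.out.println(framed.length); // prints 7
    }
}
```

Unlike ObjectOutputStream, DataOutputStream writes nothing on construction, so the first bytes on the wire are exactly the length prefix.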



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-streaming-rawSocketStream-with-protobuf-tp26662.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
