I’m not familiar with your CoAP library, but onStart() is called only once.
You’re only reading the value once, when the custom receiver is initialized.

You need to set up a callback or poll a buffer (again, this depends on your
CoAP client). In short, configure your client to “start listening for
changes”, then call store() for every new value you’re notified of.
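
Here is a rough sketch of what I mean, assuming the Eclipse Californium client
(which your snippet appears to use) and its observe() API. The class name,
resource URI, and error message are just placeholders, and the resource has to
be marked observable on the server side for this to work:

import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.receiver.Receiver
import org.eclipse.californium.core.{CoapClient, CoapHandler, CoapObserveRelation, CoapResponse}

class CoapObserveReceiver(uri: String)
  extends Receiver[String](StorageLevel.MEMORY_AND_DISK_2) {

  @volatile private var relation: CoapObserveRelation = _

  override def onStart(): Unit = {
    val client = new CoapClient(uri)
    // Register an observe relation: onLoad fires for the initial response
    // and again for every notification the server sends afterwards.
    relation = client.observe(new CoapHandler {
      override def onLoad(response: CoapResponse): Unit =
        store(response.getResponseText)
      override def onError(): Unit =
        reportError("CoAP observe failed", new RuntimeException("observe error"))
    })
  }

  override def onStop(): Unit = {
    // Cancel the observation when the receiver shuts down.
    if (relation != null) relation.proactiveCancel()
  }
}

You would wire it up the same way as your current receiver, e.g.
ssc.receiverStream(new CoapObserveReceiver("coap://ip/resource")). If your
client or server doesn’t support observe, the alternative is to start a
background thread in onStart() that polls with client.get() and calls store()
for each new value.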

-adrian



On 10/16/15, 9:38 AM, "Sadaf" <sa...@platalytics.com> wrote:

>I am currently working with the IoT CoAP protocol. I accessed a server on
>localhost through the Copper Firefox plugin. Then I added a resource with
>"GET" functionality to the server. After that I made its client a streaming
>source. Here is the code of the client streaming:
>
> class customReceiver(test:String) extends 
>Receiver[String](StorageLevel.MEMORY_AND_DISK_2) with Logging with
>Serializable { 
>   @volatile private var stopped = false
>   override def onStart() {
>
>      val client = new CoapClient("ip/resource")
>      var text = client.get().getResponseText();  
>      store(text)
>   }
>   override def onStop(): Unit = synchronized { 
>      try
>      {
>         stopped = true
>      }
>      catch
>      {
>         case e: Exception => println("exception caught: " + e);
>      }
>   }
> }
>But I am facing a problem. During streaming it reads the resource only once;
>after that it fetches all empty RDDs and completes its batches. Meanwhile, if
>the resource changes its value it doesn't read that. Am I doing something
>wrong? Or is there some other functionality, to read whenever the resource
>changes, that I can handle in my custom receiver? Or any idea how to GET the
>value continuously during streaming?
>
>Any help is much appreciated. Thanks
>
>
>
