Thanks for your suggestion.
Yes, via DStream.saveAsTextFiles().
I was making a mistake by writing StorageLevel.NULL when overriding the
storageLevel method in my custom receiver.
When I changed it to StorageLevel.MEMORY_AND_DISK_2(), the data started being
saved to disk.
Now it's running without any issue.
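For anyone hitting the same issue: in the Java API the storage level is simply passed up to Spark through the custom Receiver's constructor. A minimal sketch along those lines (the class name is a placeholder; onStart/onStop are trimmed):

import org.apache.spark.storage.StorageLevel;
import org.apache.spark.streaming.receiver.Receiver;

public class WebsphereMQReceiver extends Receiver<String> {
  public WebsphereMQReceiver() {
    // A NONE-style storage level tells Spark not to persist received blocks,
    // so nothing ever reaches the output files; MEMORY_AND_DISK_2 keeps blocks
    // in memory, spills to disk when needed and replicates to a second executor.
    super(StorageLevel.MEMORY_AND_DISK_2());
  }

  @Override public void onStart() { /* connect to MQ and call store(...) */ }
  @Override public void onStop()  { /* disconnect from MQ */ }
}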


From: Tathagata Das [mailto:t...@databricks.com]
Sent: Friday, May 29, 2015 3:30 AM
To: Chaudhary, Umesh
Cc: Arush Kharbanda; user@spark.apache.org
Subject: Re: FW: Websphere MQ as a data source for Apache Spark Streaming

Are you sure that the data can be saved as strings?
Another, more controlled approach is to use DStream.foreachRDD, which takes a
Function2 over an RDD and a Time. There you can explicitly work with the RDD,
save it to separate files (separated by time), or whatever else you need. That
might help you debug what is going on.
It might also help if you share the streaming program in a pastebin.
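Something along these lines (an untested sketch against the 1.x Java API; the DStream variable and output path are placeholders):

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.function.Function2;
import org.apache.spark.streaming.Time;

// "lines" stands in for whatever JavaDStream<String> the receiver produces.
lines.foreachRDD(new Function2<JavaRDD<String>, Time, Void>() {
  @Override
  public Void call(JavaRDD<String> rdd, Time time) {
    if (rdd.count() > 0) {
      // One output directory per batch, keyed by the batch time.
      rdd.saveAsTextFile("/home/user/out-" + time.milliseconds());
    }
    return null;
  }
});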

TD


On Fri, May 29, 2015 at 12:55 AM, Chaudhary, Umesh
<umesh.chaudh...@searshc.com> wrote:
Hi,
I have written a custom receiver for WebSphere MQ and it is working fine.
When I call JavaDStream.saveAsTextFiles("/home/user/out.txt"), it generates a
directory named out.txt with the batch timestamp appended.
In this directory, only the _SUCCESS file is present. I can see the data on the
console when running in local mode, but I am not able to save it as a text file.
Is there any other way to save streaming data?
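For reference, this is roughly the call (the variable name, prefix and suffix are illustrative; I go through the underlying DStream):

// saveAsTextFiles(prefix, suffix) writes one directory per batch interval,
// named "<prefix>-<TIME_IN_MS>[.<suffix>]", which is where the appended
// timestamp comes from; the records should land in part-NNNNN files inside it.
lines.dstream().saveAsTextFiles("/home/user/out", "txt");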

From: Chaudhary, Umesh
Sent: Tuesday, May 26, 2015 2:39 AM
To: 'Arush Kharbanda'; user@spark.apache.org
Subject: RE: Websphere MQ as a data source for Apache Spark Streaming

Thanks for the suggestion; I will try it and post the outcome.

From: Arush Kharbanda [mailto:ar...@sigmoidanalytics.com]
Sent: Monday, May 25, 2015 12:24 PM
To: Chaudhary, Umesh; user@spark.apache.org
Subject: Re: Websphere MQ as a data source for Apache Spark Streaming

Hi Umesh,

You can write a custom receiver for WebSphere MQ, using the WebSphere MQ API.

https://spark.apache.org/docs/latest/streaming-custom-receivers.html
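A bare-bones skeleton in the spirit of that guide might look like the following (the class name is a placeholder and the actual MQ read is left as a stub):

import org.apache.spark.storage.StorageLevel;
import org.apache.spark.streaming.receiver.Receiver;

public class WebsphereMQReceiver extends Receiver<String> {

  public WebsphereMQReceiver() {
    super(StorageLevel.MEMORY_AND_DISK_2());
  }

  @Override
  public void onStart() {
    // Start a background thread so onStart() returns quickly.
    new Thread(new Runnable() {
      public void run() {
        receive();
      }
    }).start();
  }

  @Override
  public void onStop() {
    // The receive loop below checks isStopped(), so nothing extra is needed here.
  }

  private void receive() {
    while (!isStopped()) {
      // Placeholder: read the next message body from the queue using the
      // WebSphere MQ classes for Java, then hand it to Spark Streaming.
      String message = readNextMessageFromQueue();
      if (message != null) {
        store(message);
      }
    }
  }

  private String readNextMessageFromQueue() {
    return null; // to be implemented against the WebSphere MQ API
  }
}

The receiver is then plugged in with jssc.receiverStream(new WebsphereMQReceiver()), which gives you a JavaReceiverInputDStream<String> to work with.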

Thanks
Arush

On Mon, May 25, 2015 at 8:04 PM, Chaudhary, Umesh
<umesh.chaudh...@searshc.com> wrote:
I have seen it, but it uses a different configuration for connecting to the MQ.
For WebSphere MQ we need the host, queue manager, channel, and queue name, but
here, per the MQTT protocol:

client = new MqttClient(brokerUrl, MqttClient.generateClientId(), persistence)

It only expects a broker URL, which is not appropriate for establishing a
connection with WebSphere MQ.
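For reference, connecting through the WebSphere MQ classes for Java needs roughly the following (host, port, channel, queue manager and queue names are placeholders; exception handling omitted):

import com.ibm.mq.MQEnvironment;
import com.ibm.mq.MQGetMessageOptions;
import com.ibm.mq.MQMessage;
import com.ibm.mq.MQQueue;
import com.ibm.mq.MQQueueManager;
import com.ibm.mq.constants.CMQC;

// Connection details that a single MQTT broker URL cannot express.
MQEnvironment.hostname = "mq.example.com";
MQEnvironment.port     = 1414;
MQEnvironment.channel  = "SYSTEM.DEF.SVRCONN";

MQQueueManager queueManager = new MQQueueManager("QM1");
MQQueue queue = queueManager.accessQueue("MY.QUEUE",
    CMQC.MQOO_INPUT_AS_Q_DEF | CMQC.MQOO_FAIL_IF_QUIESCING);

// Read one message and turn its body into a String.
MQMessage message = new MQMessage();
queue.get(message, new MQGetMessageOptions());
byte[] body = new byte[message.getMessageLength()];
message.readFully(body);
String text = new String(body, "UTF-8");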

Please suggest!


From: Arush Kharbanda [mailto:ar...@sigmoidanalytics.com]
Sent: Monday, May 25, 2015 6:29 AM
To: Chaudhary, Umesh
Cc: user@spark.apache.org
Subject: Re: Websphere MQ as a data source for Apache Spark Streaming

Hi Umesh,

You can connect Spark Streaming to MQTT; refer to the example:

https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/streaming/MQTTWordCount.scala
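For the Java API, the relevant call from that example boils down to this (the broker URL and topic are illustrative; jssc is the application's JavaStreamingContext):

import org.apache.spark.streaming.api.java.JavaReceiverInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.mqtt.MQTTUtils;

// One MQTT broker URL plus a topic is all this helper needs.
String brokerUrl = "tcp://broker.example.com:1883";
String topic = "mq-bridge/topic";
JavaReceiverInputDStream<String> lines =
    MQTTUtils.createStream(jssc, brokerUrl, topic);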



Thanks
Arush



On Mon, May 25, 2015 at 3:43 PM, umesh9794
<umesh.chaudh...@searshc.com> wrote:
I was digging into the possibility of using WebSphere MQ as a data source for
Spark Streaming because it is needed in one of our use cases. I learned that
MQTT <http://mqtt.org/> is a protocol that supports communication with MQ data
structures, but since I am a newbie to Spark Streaming, I need some working
examples. Has anyone tried to connect MQ with Spark Streaming? Please suggest
the best way of doing so.







--

Sigmoid Analytics <http://www.sigmoidanalytics.com>

Arush Kharbanda || Technical Teamlead

ar...@sigmoidanalytics.com || www.sigmoidanalytics.com





