Re: Standalone mode connection failure from worker node to master

2015-07-14 Thread sivarani
I am also facing the same issue; has anyone figured it out? Please help. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Standalone-mode-connection-failure-from-worker-node-to-master-tp23101p23816.html Sent from the Apache Spark User List mailing list archive at Nabble.com.

Re: java.lang.IllegalStateException: unread block data

2014-12-17 Thread sivarani
Same issue; can anyone help, please? -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/java-lang-IllegalStateException-unread-block-data-tp20668p20745.html

spark 1.1.1 Maven dependency

2014-12-09 Thread sivarani
Dear All, I am using Spark Streaming. It was working fine when I was using Spark 1.0.2; now I repeatedly get a few issues, like ClassNotFoundException. I am using the same pom.xml with the updated version for all the Spark modules I use: spark-core, spark-streaming, and spark-streaming-kafka.. Its

Re: Submitting Spark application through code

2014-11-26 Thread sivarani
I am trying to submit a Spark Streaming program. When I submit a batch job it works, but when I do the same with Spark Streaming it throws the following. Anyone please help. 14/11/26 17:42:25 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:50016 14/11/26 17:42:25 INFO server.Server:

Re: Streaming window operations not producing output

2014-11-05 Thread sivarani
Hi TD, I would like to run streaming 24/7 and am trying to use getOrCreate, but it's not working. Please can you help with this? http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-getOrCreate-tc18060.html -- View this message in context:

Re: java.io.NotSerializableException: org.apache.spark.SparkEnv

2014-11-05 Thread sivarani
Hi, thanks for replying. I have posted my code at http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-getOrCreate-tc18060.html -- View this message in context:

Re: Submitting Spark application through code

2014-11-05 Thread sivarani
Thanks, boss, it's working :) -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Submiting-Spark-application-through-code-tp17452p18250.html

Re: Spark Streaming: foreachRDD network output

2014-11-05 Thread sivarani
Anyone, any luck? -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-foreachRDD-network-output-tp15205p18251.html

Re: java.io.NotSerializableException: org.apache.spark.SparkEnv

2014-11-04 Thread sivarani
Same issue.. how did you solve it? -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/java-io-NotSerializableException-org-apache-spark-SparkEnv-tp10641p18047.html

Spark Streaming getOrCreate

2014-11-04 Thread sivarani
Hi All, I am using Spark Streaming.. public class SparkStreaming{ SparkConf sparkConf = new SparkConf().setAppName("Sales"); JavaStreamingContext jssc = new JavaStreamingContext(sparkConf, new Duration(5000)); String chkPntDir = ; //get checkpoint dir jssc.checkpoint(chkPntDir); JavaSpark jSpark =
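[Editor's note: a minimal sketch of the getOrCreate pattern this post is reaching for, against the Spark 1.x Java API of the era (JavaStreamingContextFactory); the checkpoint path and the pipeline body are placeholders, not taken from the original post.]

```java
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.api.java.JavaStreamingContextFactory;

public class SalesStreaming {
    // Hypothetical checkpoint directory; a real job would use HDFS or similar.
    private static final String CHECKPOINT_DIR = "/tmp/sales-checkpoint";

    public static void main(String[] args) throws InterruptedException {
        JavaStreamingContextFactory factory = new JavaStreamingContextFactory() {
            @Override
            public JavaStreamingContext create() {
                SparkConf conf = new SparkConf().setAppName("Sales");
                JavaStreamingContext jssc =
                    new JavaStreamingContext(conf, new Duration(5000));
                jssc.checkpoint(CHECKPOINT_DIR);
                // The whole DStream pipeline must be set up here, inside the
                // factory, so it can be rebuilt from the checkpoint on restart.
                return jssc;
            }
        };
        // Reuse the checkpointed context if one exists, else create a fresh one.
        JavaStreamingContext jssc =
            JavaStreamingContext.getOrCreate(CHECKPOINT_DIR, factory);
        jssc.start();
        jssc.awaitTermination();
    }
}
```

The key point, and the usual cause of "getOrCreate not working", is that transformations defined outside the factory are not recovered from the checkpoint.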

Re: Spark Streaming getOrCreate

2014-11-04 Thread sivarani
Anybody, any luck? I am also trying to set None to delete a key from state; will null help? How do I use Scala's None in Java? My code goes this way: public static class ScalaLang { public static <T> Option<T> none() { return (Option<T>) None$.MODULE$; }

Re: Submitting Spark application through code

2014-10-31 Thread sivarani
I tried running it but it didn't work: public static final SparkConf batchConf = new SparkConf(); String master = "spark://sivarani:7077"; String spark_home = "/home/sivarani/spark-1.0.2-bin-hadoop2/"; String jar = "/home/sivarani/build/Test.jar"; public static final JavaSparkContext batchSparkContext = new

Spark Streaming Issue not running 24/7

2014-10-30 Thread sivarani
The problem is simple: I want to stream data 24/7, do some calculations, and save the result in a CSV/JSON file so that I can use it for visualization with dc.js/d3.js. I opted for Spark Streaming on a YARN cluster with Kafka and tried running it 24/7, using groupByKey and updateStateByKey to

Streaming Question regarding lazy calculations

2014-10-29 Thread sivarani
Hi All, I am using Spark Streaming with Kafka, running 24/7. My code is something like: JavaDStream<String> data = messages.map(new MapData()); JavaPairDStream<String, Iterable<String>> records = data.mapToPair(new dataPair()).groupByKey(100); records.print(); JavaPairDStream<String, Double>

Submitting Spark application through code

2014-10-28 Thread sivarani
Hi, I am submitting my Spark application in the following fashion: bin/spark-submit --class NetworkCount --master spark://abc.test.com:7077 try/simple-project/target/simple-project-1.0-jar-with-dependencies.jar But is there any other way to submit a Spark application through code? Like for
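[Editor's note: Spark later added a supported answer to this question, the spark-launcher module (SparkLauncher, from Spark 1.4, so it postdates the 1.0.2 build used in this thread). A sketch using the class/master/jar paths from the post itself:]

```java
import org.apache.spark.launcher.SparkLauncher;

public class SubmitFromCode {
    public static void main(String[] args) throws Exception {
        // Programmatic equivalent of the bin/spark-submit command in the post.
        // SparkLauncher forks a child spark-submit process under the hood.
        Process spark = new SparkLauncher()
            .setSparkHome("/home/sivarani/spark-1.0.2-bin-hadoop2/")
            .setAppResource("try/simple-project/target/simple-project-1.0-jar-with-dependencies.jar")
            .setMainClass("NetworkCount")
            .setMaster("spark://abc.test.com:7077")
            .launch();
        // Block until the launched application exits.
        int exitCode = spark.waitFor();
        System.out.println("spark-submit exited with " + exitCode);
    }
}
```

Before spark-launcher existed, the common workaround was exactly what later replies in this thread converge on: constructing the context with master, jar, and sparkHome arguments in code.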

Re: Spark Streaming Applications

2014-10-28 Thread sivarani
Hi tdas, is it possible to run Spark 24/7? I am using updateStateByKey and streaming 3 lakh records in half an hour. I am not getting the correct result, and I am also not able to run Spark Streaming 24/7; after a few hours I get an ArrayIndexOutOfBoundsException even when I am not streaming anything. btw

Re: Spark Streaming - How to remove state for key

2014-10-28 Thread sivarani
I am having the same issue. I am using updateStateByKey, and over a period a set of data will not change; I would like to save it and delete it from state. Have you found the answer? Please share your views. Thanks for your time. -- View this message in context:
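[Editor's note: in Spark 1.x's Java API, updateStateByKey takes an update function over Guava's Optional, and the documented way to drop a key is to return an absent value from that function. Below is a self-contained sketch of that rule, using java.util.Optional as a stand-in for Guava's so the logic runs on its own; the "no new values means done" condition is an assumption that real code would replace with its own staleness test.]

```java
import java.util.Arrays;
import java.util.List;
import java.util.Optional;

public class StateUpdate {
    // Sketch of an updateStateByKey update function over running sums.
    // Returning an empty Optional is what removes the key from state.
    public static Optional<Double> update(List<Double> newValues,
                                          Optional<Double> state) {
        if (newValues.isEmpty() && state.isPresent()) {
            // No fresh data for this key: a real job would persist the final
            // value here (file, DB) before dropping the key from state.
            return Optional.empty();
        }
        double sum = state.orElse(0.0);
        for (double v : newValues) {
            sum += v;
        }
        return Optional.of(sum);
    }

    public static void main(String[] args) {
        // New data arrives: state grows.
        System.out.println(update(Arrays.asList(1.0, 2.0), Optional.of(3.0)));
        // No new data: key is dropped.
        System.out.println(update(Arrays.<Double>asList(), Optional.of(6.0)));
    }
}
```

Returning null instead (as the other thread asks) does not work; the contract is an absent Optional (None on the Scala side).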

Re: Submitting Spark application through code

2014-10-28 Thread sivarani
Hi, I know we can create a streaming context with new JavaStreamingContext(master, appName, batchDuration, sparkHome, jarFile), but to run the application we still have to use spark-home/spark-submit --class NetworkCount. I want to skip submitting manually; I want to invoke this Spark app when a

Re: checkpoint and not running out of disk space

2014-10-20 Thread sivarani
I am new to Spark; I am using Spark Streaming with Kafka. My streaming duration is 1s. Assume I get 100 records in 1s, 120 records in 2s, and 80 records in 3s -- {sec 1: 1,2,...,100} -- {sec 2: 1,2,...,120} -- {sec 3: 1,2,...,80} I apply my logic in sec 1 and have a result = result1. I want to use