Re: Standalone mode connection failure from worker node to master

2015-07-14 Thread sivarani
I am also facing the same issue. Has anyone figured it out? Please help.

Re: java.lang.IllegalStateException: unread block data

2014-12-17 Thread sivarani
Same issue here. Can anyone help, please?

Re: Standalone spark cluster. Can't submit job programmatically -> java.io.InvalidClassException

2014-12-11 Thread sivarani
Not able to get it working. How exactly did you fix it? I am using a Maven build: I downloaded Spark 1.1.1 and packaged it with mvn -Dhadoop.version=1.2.1 -DskipTests clean package, but I keep getting InvalidClassException.

Spark 1.1.1 Maven dependency

2014-12-09 Thread sivarani
Dear all, I am using Spark Streaming. It was working fine when I was on Spark 1.0.2; now I repeatedly get issues such as ClassNotFoundException. I am using the same pom.xml with the version updated for all the Spark modules I use: spark-core, spark-streaming, and spark-streaming-kafka. It is constant…

Re: Submitting Spark application through code

2014-11-26 Thread sivarani
I am trying to submit a Spark Streaming program. When I submit a batch job it works, but when I do the same with Spark Streaming it throws an error. Can anyone please help?

14/11/26 17:42:25 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:50016
14/11/26 17:42:25 INFO server.Server: jetty-8.1…

Re: Spark Streaming: foreachRDD network output

2014-11-05 Thread sivarani
Anyone had any luck with this?
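
For reference, a minimal sketch of the usual pattern for network output from foreachRDD in the Spark 1.x Java API; the host, port and the words stream are placeholder assumptions. The connection is opened inside foreachPartition, so it is created on the executors rather than serialized from the driver, and defining the functions in a static context avoids dragging a non-serializable outer class into the closure.

    import java.io.PrintWriter;
    import java.net.Socket;
    import java.util.Iterator;

    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.function.Function;
    import org.apache.spark.api.java.function.VoidFunction;

    // words is assumed to be a JavaDStream<String>.
    words.foreachRDD(new Function<JavaRDD<String>, Void>() {
        @Override
        public Void call(JavaRDD<String> rdd) {
            rdd.foreachPartition(new VoidFunction<Iterator<String>>() {
                @Override
                public void call(Iterator<String> records) throws Exception {
                    // Opened on the executor, once per partition; never ship a
                    // socket or client object from the driver.
                    Socket socket = new Socket("sink-host", 9999);
                    PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
                    while (records.hasNext()) {
                        out.println(records.next());
                    }
                    out.close();
                    socket.close();
                }
            });
            return null;
        }
    });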

Re: Submitting Spark application through code

2014-11-05 Thread sivarani
Thanks, boss, it's working :)

Re: java.io.NotSerializableException: org.apache.spark.SparkEnv

2014-11-05 Thread sivarani
Hi, thanks for replying. I have posted my code at http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-getOrCreate-tc18060.html

Re: Streaming window operations not producing output

2014-11-05 Thread sivarani
Hi TD, I would like to run streaming 24/7 and am trying to use getOrCreate, but it is not working. Can you please help with this? http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-getOrCreate-tc18060.html

Re: NullPointerException on reading checkpoint files

2014-11-05 Thread sivarani
My goal is to run streaming 24x7.

Re: NullPointerException on reading checkpoint files

2014-11-05 Thread sivarani
Hi TD, I am trying to use getOrCreate but I am getting a java.io.NotSerializableException. Please help; I have posted it in a different thread: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-getOrCreate-tc18060.html

Re: Spark Streaming getOrCreate

2014-11-04 Thread sivarani
Anybody have any luck? I am also trying to return None to delete a key from the state. Will null help? How do I use Scala's None from Java? My code goes this way:

    public static class ScalaLang {
        @SuppressWarnings("unchecked")
        public static <T> Option<T> none() {
            return (Option<T>) None$.MODULE$;
        }
    }
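
In case it helps (a sketch only, since the rest of the code isn't shown here): with the Spark 1.x Java API, updateStateByKey takes Guava's com.google.common.base.Optional rather than Scala's Option, and returning Optional.absent() from the update function removes that key from the state, rather than returning null. The key/value types and the "drop when no new values arrive" rule below are placeholder assumptions.

    import java.util.List;

    import com.google.common.base.Optional;
    import org.apache.spark.api.java.function.Function2;

    public class RemoveKeyExample {
        // State update function: sums new values into the running state and
        // returns Optional.absent() to delete the key. Dropping a key simply
        // because no values arrived this batch is only an illustrative policy.
        public static final Function2<List<Integer>, Optional<Integer>, Optional<Integer>> UPDATE =
            new Function2<List<Integer>, Optional<Integer>, Optional<Integer>>() {
                @Override
                public Optional<Integer> call(List<Integer> newValues, Optional<Integer> state) {
                    if (newValues.isEmpty() && state.isPresent()) {
                        return Optional.absent();   // key is removed from the state DStream
                    }
                    int sum = state.or(0);
                    for (Integer v : newValues) {
                        sum += v;
                    }
                    return Optional.of(sum);
                }
            };
    }

    // usage, assuming pairs is a JavaPairDStream<String, Integer>:
    // JavaPairDStream<String, Integer> state = pairs.updateStateByKey(RemoveKeyExample.UPDATE);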

Spark Streaming getOrCreate

2014-11-04 Thread sivarani
Hi all, I am using Spark Streaming:

    public class SparkStreaming {
        SparkConf sparkConf = new SparkConf().setAppName("Sales");
        JavaStreamingContext jssc = new JavaStreamingContext(sparkConf, new Duration(5000));
        String chkPntDir = ""; // get checkpoint dir
        jssc.checkpoint(chkPntDir);
        JavaSpark jSpark…
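
For the getOrCreate part of the question, a minimal sketch of the pattern in the Spark 1.x Java API (the checkpoint directory and app details are placeholders): the whole pipeline is built inside a JavaStreamingContextFactory, and JavaStreamingContext.getOrCreate either calls the factory on a fresh start or rebuilds the context from the checkpoint files after a restart.

    import org.apache.spark.SparkConf;
    import org.apache.spark.streaming.Duration;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;
    import org.apache.spark.streaming.api.java.JavaStreamingContextFactory;

    public class SalesStreaming {
        public static void main(String[] args) {
            final String checkpointDir = "hdfs:///tmp/sales-checkpoint";  // placeholder path

            JavaStreamingContextFactory factory = new JavaStreamingContextFactory() {
                @Override
                public JavaStreamingContext create() {
                    SparkConf conf = new SparkConf().setAppName("Sales");
                    JavaStreamingContext jssc = new JavaStreamingContext(conf, new Duration(5000));
                    jssc.checkpoint(checkpointDir);
                    // Build the entire DStream pipeline here (input streams,
                    // transformations, output operations); anything set up outside
                    // the factory is not recovered from the checkpoint.
                    return jssc;
                }
            };

            JavaStreamingContext context = JavaStreamingContext.getOrCreate(checkpointDir, factory);
            context.start();
            context.awaitTermination();
        }
    }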

Re: java.io.NotSerializableException: org.apache.spark.SparkEnv

2014-11-04 Thread sivarani
Same issue here. How did you solve it?

Re: Submitting Spark application through code

2014-10-30 Thread sivarani
I tried running it, but it didn't work:

    public static final SparkConf batchConf = new SparkConf();
    String master = "spark://sivarani:7077";
    String spark_home = "/home/sivarani/spark-1.0.2-bin-hadoop2/";
    String jar = "/home/sivarani/build/Test.jar";
    public static final Java…

Spark Streaming Issue not running 24/7

2014-10-30 Thread sivarani
The problem is simple: I want to stream data 24/7, do some calculations, and save the results in a CSV/JSON file so that I can use them for visualization with dc.js/d3.js. I opted for Spark Streaming on a YARN cluster with Kafka and tried running it 24/7, using groupByKey and updateStateByKey to have…
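
On the CSV/JSON output part, one possible shape for this (a sketch only; the output directory, key/value types and the state stream are assumptions) is to format the state stream as CSV lines and write one small directory per batch with foreachRDD, which dc.js/d3.js can then poll.

    import scala.Tuple2;

    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.function.Function;
    import org.apache.spark.api.java.function.Function2;
    import org.apache.spark.streaming.Time;
    import org.apache.spark.streaming.api.java.JavaDStream;
    import org.apache.spark.streaming.api.java.JavaPairDStream;

    public class CsvOutput {
        // Writes one CSV directory per batch under outputDir, e.g. /data/sales-1414680000000.
        public static void writeAsCsv(JavaPairDStream<String, Long> state, final String outputDir) {
            JavaDStream<String> csv = state.map(new Function<Tuple2<String, Long>, String>() {
                @Override
                public String call(Tuple2<String, Long> kv) {
                    return kv._1() + "," + kv._2();
                }
            });

            csv.foreachRDD(new Function2<JavaRDD<String>, Time, Void>() {
                @Override
                public Void call(JavaRDD<String> rdd, Time time) {
                    // coalesce(1) keeps it to a single part file per batch.
                    rdd.coalesce(1).saveAsTextFile(outputDir + "/sales-" + time.milliseconds());
                    return null;
                }
            });
        }
    }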

Streaming Question regarding lazy calculations

2014-10-29 Thread sivarani
Hi all, I am using Spark Streaming with Kafka, running 24/7. My code is something like:

    JavaDStream data = messages.map(new MapData());
    JavaPairDStream records = data.mapToPair(new dataPair()).groupByKey(100);
    records.print();
    JavaPairDStream result = records.mapValues(new Sum()).updateState…
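
On the laziness question (hedged, since the message is cut off here): DStream transformations such as map, mapToPair, groupByKey, mapValues and updateStateByKey only describe the computation; nothing runs for a batch until an output operation (print, foreachRDD, saveAs*Files) is registered, and each output operation walks its own lineage. So if both records and the downstream result are written out, caching the shared stream avoids recomputing the groupByKey. Continuing the snippet above:

    // Cache the shared stream so the groupByKey result is reused by both outputs.
    records.cache();

    // Output operation 1: triggers computation of 'records' each batch.
    records.print();

    // Output operation 2: triggers the mapValues/updateStateByKey chain as well,
    // reusing the cached 'records'. With no output operation registered at all,
    // none of the transformations above would ever run.
    result.print();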

Re: Submitting Spark application through code

2014-10-28 Thread sivarani
Hi, I know we can create a streaming context with new JavaStreamingContext(master, appName, batchDuration, sparkHome, jarFile), but to run the application we still have to use spark-home/spark-submit --class NetworkCount. I want to skip submitting manually; I want to invoke this Spark app when a condition…
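
For what it's worth, a minimal sketch of driving everything from that constructor so the driver runs inside your own JVM and no spark-submit is needed; the master URL, sparkHome, jar path and the socket source are placeholder assumptions, and the jar must contain your application classes so the workers can load them.

    import org.apache.spark.streaming.Duration;
    import org.apache.spark.streaming.api.java.JavaReceiverInputDStream;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;

    public class EmbeddedNetworkCount {
        public static void main(String[] args) {
            JavaStreamingContext jssc = new JavaStreamingContext(
                "spark://master-host:7077",        // master
                "NetworkCount",                    // app name
                new Duration(1000),                // batch interval
                "/opt/spark",                      // sparkHome on the workers
                "/path/to/network-count.jar");     // jar containing this class

            JavaReceiverInputDStream<String> lines = jssc.socketTextStream("localhost", 9999);
            lines.print();

            jssc.start();
            jssc.awaitTermination();
        }
    }

The same code can just as well live in an ordinary method that you call when your condition fires, rather than in main.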

Re: Spark Streaming - How to remove state for key

2014-10-28 Thread sivarani
I am having the same issue. I am using updateStateByKey, and over a period a set of data stops changing; I would like to save it and delete it from the state. Have you found the answer? Please share your views. Thanks for your time.

Re: Spark Streaming Applications

2014-10-28 Thread sivarani
Hi tdas, is it possible to run Spark 24/7? I am using updateStateByKey and streaming about 3 lakh records per half hour. I am not getting the correct result, and I am also not able to run Spark Streaming 24/7: after a few hours I get an ArrayIndexOutOfBoundsException even when nothing is being streamed. By the way, will…

Submitting Spark application through code

2014-10-28 Thread sivarani
Hi, I am submitting my Spark application in the following fashion:

    bin/spark-submit --class "NetworkCount" --master spark://abc.test.com:7077 try/simple-project/target/simple-project-1.0-jar-with-dependencies.jar

But is there any other way to submit a Spark application through code? Like, for ex…
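
One possible alternative (a sketch under assumptions, not a drop-in answer: cluster deploy modes and any spark-defaults settings that spark-submit would normally apply become your responsibility) is to build the SparkConf yourself, point setJars at the application jar, and create the context directly, so the driver runs in whatever process decides to kick the job off.

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class ProgrammaticSubmit {
        public static void runNetworkCount() {
            SparkConf conf = new SparkConf()
                .setAppName("NetworkCount")
                .setMaster("spark://abc.test.com:7077")
                .setJars(new String[] {
                    // jar that contains the application classes, shipped to the workers
                    "try/simple-project/target/simple-project-1.0-jar-with-dependencies.jar"
                });

            JavaSparkContext sc = new JavaSparkContext(conf);
            try {
                // ... build RDDs and run the job here ...
            } finally {
                sc.stop();
            }
        }
    }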

Re: checkpoint and not running out of disk space

2014-10-19 Thread sivarani
I am new to Spark; I am using Spark Streaming with Kafka. My batch duration is 1s. Assume I get 100 records in second 1, 120 records in second 2, and 80 records in second 3:

    sec 1 -> {1, 2, ... 100}
    sec 2 -> {1, 2, ... 120}
    sec 3 -> {1, 2, ... 80}

I apply my logic in second 1 and have a result => result1. I want to…
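
On the disk-space side of the thread title, a small sketch (assuming jssc is the JavaStreamingContext, stateDstream is whatever comes out of updateStateByKey, and the directory is a placeholder): stateful streams have to be checkpointed, but the checkpoint interval can be made longer than the batch interval so far fewer files are written, and Spark cleans up checkpoint files it no longer needs from that directory.

    // Metadata and RDD checkpoints go to this directory (placeholder path).
    jssc.checkpoint("hdfs:///tmp/app-checkpoint");

    // Checkpoint the stateful stream every 10 seconds instead of every 1s batch,
    // which keeps the number of checkpoint files, and hence disk usage, down.
    stateDstream.checkpoint(new Duration(10000));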