I am also facing the same issue. Has anyone figured it out? Please help.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Standalone-mode-connection-failure-from-worker-node-to-master-tp23101p23816.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Same issue here. Can anyone help, please?
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/java-lang-IllegalStateException-unread-block-data-tp20668p20745.html
Dear All,
I am using Spark Streaming. It was working fine when I was using Spark 1.0.2;
now I repeatedly get a few issues, like class not found, even though I am
using the same pom.xml with the updated version for all Spark modules
(spark-core, streaming, and streaming with Kafka).
When I submit a batch process it works, but when I do the same with Spark
Streaming it throws the error. Anyone please help.
14/11/26 17:42:25 INFO server.AbstractConnector: Started
SocketConnector@0.0.0.0:50016
14/11/26 17:42:25 INFO server.Server:
Hi TD,
I would like to run streaming 24/7 and am trying to use getOrCreate, but it's
not working. Can you please help me with this?
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-getOrCreate-tc18060.html
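For reference, the usual getOrCreate pattern in the Spark 1.x Java API looks roughly like this. This is a sketch, not the poster's actual code: the checkpoint directory, app name, and batch duration are placeholders, and the DStream pipeline must be set up inside the factory so it can be rebuilt from the checkpoint on restart.

```java
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.api.java.JavaStreamingContextFactory;

public class GetOrCreateSketch {
    public static void main(String[] args) {
        final String checkpointDir = "/tmp/spark-checkpoint"; // placeholder path

        JavaStreamingContextFactory factory = new JavaStreamingContextFactory() {
            @Override
            public JavaStreamingContext create() {
                SparkConf conf = new SparkConf().setAppName("Streaming24x7");
                JavaStreamingContext jssc =
                        new JavaStreamingContext(conf, new Duration(5000));
                jssc.checkpoint(checkpointDir);
                // Build the whole DStream pipeline here, BEFORE returning;
                // on restart it is restored from the checkpoint instead.
                return jssc;
            }
        };

        // Restores the context from the checkpoint if one exists,
        // otherwise calls factory.create().
        JavaStreamingContext jssc =
                JavaStreamingContext.getOrCreate(checkpointDir, factory);
        jssc.start();
        jssc.awaitTermination();
    }
}
```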
Hi, thanks for replying.
I have posted my code at
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-getOrCreate-tc18060.html
Thanks, boss, it's working :)
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Submiting-Spark-application-through-code-tp17452p18250.html
Anyone had any luck?
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-foreachRDD-network-output-tp15205p18251.html
Same issue... how did you solve it?
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/java-io-NotSerializableException-org-apache-spark-SparkEnv-tp10641p18047.html
Hi All,
I am using Spark Streaming. My code is:

public class SparkStreaming {
    SparkConf sparkConf = new SparkConf().setAppName("Sales");
    JavaStreamingContext jssc =
            new JavaStreamingContext(sparkConf, new Duration(5000));
    String chkPntDir = ...; // get checkpoint dir
    jssc.checkpoint(chkPntDir);
    JavaSpark jSpark =
Anybody any luck? I am also trying to set None to delete a key from state;
will null help? How do I use Scala's None in Java?
My code goes this way:

public static class ScalaLang {
    public static <T> Option<T> none() {
        return (Option<T>) None$.MODULE$;
    }
}
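In the Java API you normally don't need Scala's None (or null) for this: the update function passed to updateStateByKey works with Optional, and returning Optional.absent() removes the key from state entirely. A sketch assuming the Spark 1.x Java API, where this is Guava's com.google.common.base.Optional; the Double state type and the "no new values means retire the key" rule are illustrative:

```java
import java.util.List;
import com.google.common.base.Optional;
import org.apache.spark.api.java.function.Function2;

public class StateUpdate {
    // Returning Optional.absent() from the update function deletes the key
    // from the state; returning Optional.of(...) keeps/updates it.
    public static final Function2<List<Double>, Optional<Double>, Optional<Double>>
        UPDATE = new Function2<List<Double>, Optional<Double>, Optional<Double>>() {
            @Override
            public Optional<Double> call(List<Double> newValues,
                                         Optional<Double> state) {
                if (newValues.isEmpty()) {
                    // No new data for this key: save it elsewhere if needed,
                    // then drop it from state.
                    return Optional.absent();
                }
                double sum = state.or(0.0);
                for (Double v : newValues) {
                    sum += v;
                }
                return Optional.of(sum);
            }
        };
}
```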
I tried running it but it didn't work.

public static final SparkConf batchConf = new SparkConf();
String master = "spark://sivarani:7077";
String spark_home = "/home/sivarani/spark-1.0.2-bin-hadoop2/";
String jar = "/home/sivarani/build/Test.jar";
public static final JavaSparkContext batchSparkContext = new
The problem is simple:
I want to stream data 24/7, do some calculations, and save the result in a
CSV/JSON file so that I can use it for visualization with dc.js/d3.js.
I opted for Spark Streaming on a YARN cluster with Kafka and tried running it
24/7, using groupByKey and updateStateByKey to
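Writing each batch out so a dc.js/d3.js page can pick it up is usually done with foreachRDD. A sketch assuming Spark 1.x (Java 7 anonymous-class style, as in the rest of this thread); the output directory is a placeholder, and the conversion to the final CSV/JSON is left to a downstream step:

```java
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.streaming.api.java.JavaDStream;

public class BatchSaver {
    // Saves every non-empty batch of already-formatted lines as text files;
    // a downstream step can merge/convert them into the CSV/JSON for d3.js.
    public static void saveEachBatch(JavaDStream<String> results,
                                     final String outDir) {
        results.foreachRDD(new Function<JavaRDD<String>, Void>() {
            @Override
            public Void call(JavaRDD<String> rdd) {
                if (rdd.count() > 0) {
                    // One output directory per batch interval.
                    rdd.saveAsTextFile(outDir + "/batch-" + System.currentTimeMillis());
                }
                return null;
            }
        });
    }
}
```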
Hi All,
I am using Spark Streaming with Kafka, running 24/7.
My code is something like:

JavaDStream<String> data = messages.map(new MapData());
JavaPairDStream<String, Iterable<String>> records =
        data.mapToPair(new dataPair()).groupByKey(100);
records.print();
JavaPairDStream<String, Double>
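The message is cut off, but one plausible way to get from the grouped records to a JavaPairDStream<String, Double> is mapValues. A sketch only: the per-key count below stands in for whatever aggregation the original code computed.

```java
import org.apache.spark.api.java.function.Function;
import org.apache.spark.streaming.api.java.JavaPairDStream;

public class PerKeyCount {
    // Turns each (key, grouped values) pair into (key, number of values).
    public static JavaPairDStream<String, Double> countPerKey(
            JavaPairDStream<String, Iterable<String>> records) {
        return records.mapValues(new Function<Iterable<String>, Double>() {
            @Override
            public Double call(Iterable<String> values) {
                double count = 0;
                for (String ignored : values) {
                    count++;
                }
                return count;
            }
        });
    }
}
```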
Hi,
I am submitting my Spark application in the following fashion:

bin/spark-submit --class NetworkCount --master spark://abc.test.com:7077
try/simple-project/target/simple-project-1.0-jar-with-dependencies.jar

But is there any other way to submit a Spark application through code? Like, for
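Later Spark releases (1.4+) added org.apache.spark.launcher.SparkLauncher for exactly this: submitting an application programmatically instead of shelling out to spark-submit. A sketch reusing the class, master, and jar from the command above; the Spark home path is a placeholder:

```java
import org.apache.spark.launcher.SparkLauncher;

public class SubmitFromCode {
    public static void main(String[] args) throws Exception {
        Process spark = new SparkLauncher()
                .setSparkHome("/opt/spark") // placeholder
                .setAppResource(
                    "try/simple-project/target/simple-project-1.0-jar-with-dependencies.jar")
                .setMainClass("NetworkCount")
                .setMaster("spark://abc.test.com:7077")
                .launch();
        // Block until the spawned submission process finishes.
        int exitCode = spark.waitFor();
        System.out.println("spark-submit exited with " + exitCode);
    }
}
```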
Hi tdas, is it possible to run Spark 24/7? I am using updateStateByKey and I
am streaming 3 lakh (300,000) records in half an hour, but I am not getting
the correct result. Also, I am not able to run Spark Streaming 24/7: after a
few hours I get an ArrayIndexOutOfBoundsException even if I am not streaming
anything. Btw
I am having the same issue. I am using updateStateByKey, and over a period a
set of data will not change; I would like to save it and delete it from state.
Have you found the answer? Please share your views. Thanks for your time.
Hi,
I know we can create a streaming context with new JavaStreamingContext(master,
appName, batchDuration, sparkHome, jarFile),
but to run the application we still have to use
spark-home/spark-submit --class NetworkCount.
I want to skip submitting manually; I wanted to invoke this Spark app when a
I am new to Spark; I am using Spark Streaming with Kafka.
My batch duration is 1s.
Assume I get 100 records in sec 1, 120 records in sec 2, and 80 records in sec 3:
-- {sec 1: 1,2,...,100} -- {sec 2: 1,2,...,120} -- {sec 3: 1,2,...,80}
I apply my logic in sec 1 and have a result, result1.
I want to use
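If the goal is to reuse result1 from sec 1 while processing sec 2's records, the standard mechanism is updateStateByKey, which carries per-key state from one batch into the next (it requires a checkpoint directory to be set). A sketch assuming the Spark 1.x Java API, where the state is wrapped in Guava's Optional; the running Integer total is illustrative:

```java
import java.util.List;
import com.google.common.base.Optional;
import org.apache.spark.api.java.function.Function2;
import org.apache.spark.streaming.api.java.JavaPairDStream;

public class RunningResult {
    // Merges each batch's values into the result computed in earlier batches,
    // so batch 2 sees batch 1's result, batch 3 sees both, and so on.
    public static JavaPairDStream<String, Integer> runningTotal(
            JavaPairDStream<String, Integer> perBatchValues) {
        return perBatchValues.updateStateByKey(
            new Function2<List<Integer>, Optional<Integer>, Optional<Integer>>() {
                @Override
                public Optional<Integer> call(List<Integer> newValues,
                                              Optional<Integer> previousResult) {
                    int total = previousResult.or(0); // result from earlier batches
                    for (Integer v : newValues) {
                        total += v;
                    }
                    return Optional.of(total);
                }
            });
    }
}
```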