I am also facing the same issue. Has anyone figured it out? Please help.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Standalone-mode-connection-failure-from-worker-node-to-master-tp23101p23816.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Same issue here. Can anyone help, please?
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/java-lang-IllegalStateException-unread-block-data-tp20668p20745.html
-
Not able to get it working. How exactly did you fix it? I am using a Maven
build: I downloaded Spark 1.1.1 and then packaged it with mvn
-Dhadoop.version=1.2.1 -DskipTests clean package, but I keep getting invalid
class exceptions.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabbl
Dear All,
I am using Spark Streaming. It was working fine with Spark 1.0.2, but now I
repeatedly get a few issues, like ClassNotFoundException. I am using the same
pom.xml with the updated version for all Spark modules (spark-core,
spark-streaming, and streaming with Kafka), and the failure is constant.
When I submit a batch job it works, but when I do the same with Spark
Streaming it throws the exception. Anyone please help:
14/11/26 17:42:25 INFO server.AbstractConnector: Started
SocketConnector@0.0.0.0:50016
14/11/26 17:42:25 INFO server.Server: jetty-8.1
Anyone, any luck?
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-foreachRDD-network-output-tp15205p18251.html
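A common cause of a ClassNotFoundException after a version bump is that the Spark modules in the pom are no longer all on the same version, or the application was built against a different Spark version than the cluster runs. A sketch of keeping the modules aligned with a single version property (the version and Scala suffix here are assumptions; match them to your cluster):

```xml
<properties>
  <spark.version>1.1.1</spark.version>
</properties>
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>${spark.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>${spark.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka_2.10</artifactId>
    <version>${spark.version}</version>
  </dependency>
</dependencies>
```

Bumping one property then keeps core, streaming, and the Kafka connector in lockstep.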
Thanks, boss, it's working :)
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Submiting-Spark-application-through-code-tp17452p18250.html
-
Hi, thanks for replying.
I have posted my code at
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-getOrCreate-tc18060.html
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/java-io-NotSerializableException-org-apache-spark-SparkEnv-tp10641
Hi TD,
I would like to run streaming 24/7 and am trying to use getOrCreate, but it's
not working. Please can you help with this?
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-getOrCreate-tc18060.html
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabb
My goal is to run streaming 24x7.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/NullPointerException-on-reading-checkpoint-files-tp7306p18168.html
---
Hi TD,
I am trying to use getOrCreate but I am getting a
java.io.NotSerializableException. Please help; I have posted it in a different
thread:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-getOrCreate-tc18060.html
--
View this message in context:
http://apache-spark-user-list.100156
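With JavaStreamingContext.getOrCreate, all DStream setup must happen inside the factory function that is passed in; defining transformations outside the factory is a common source of the NotSerializableException mentioned here, because recovery then tries to serialize outer references. The restore-or-create contract itself can be sketched without Spark, using a file as a stand-in for the checkpoint (hypothetical helper, not the Spark API):

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.function.Supplier;

public class CheckpointSketch {
    // Mirrors the getOrCreate contract: if a checkpoint exists, restore from
    // it and IGNORE the factory; otherwise run the factory (which must do ALL
    // the setup) and record its result as the checkpoint.
    public static String getOrCreate(Path checkpointDir, Supplier<String> factory)
            throws java.io.IOException {
        Path marker = checkpointDir.resolve("state");
        if (Files.exists(marker)) {
            return new String(Files.readAllBytes(marker)); // recovered run
        }
        String created = factory.get();                    // first run
        Files.createDirectories(checkpointDir);
        Files.write(marker, created.getBytes());
        return created;
    }
}
```

On the first run the factory executes; on every later run the checkpoint wins, which is why setup code outside the factory silently stops being applied.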
Anybody any luck? I am also trying to return None to delete a key from state.
Will null help? How do I use Scala's None in Java?
My code goes this way
// requires scala-library on the classpath
import scala.None$;
import scala.Option;

public static class ScalaLang {
    @SuppressWarnings("unchecked")
    public static <T> Option<T> none() {
        return (Option<T>) None$.MODULE$;
    }
}
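For the state-removal question: in the Spark 1.x Java API, the update function passed to updateStateByKey returns an Optional, and returning an absent/empty Optional removes that key from the state (null is not the mechanism). The contract can be illustrated with plain java.util and no Spark dependency; the helper name applyUpdates and the types here are assumptions for the sketch, not Spark's API:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Optional;
import java.util.function.BiFunction;

public class StateSketch {
    // Mirrors the updateStateByKey contract: for each key in the batch,
    // combine the new values with the previous state; an empty Optional
    // result deletes the key from the state.
    public static <K, S, V> Map<K, S> applyUpdates(
            Map<K, S> state,
            Map<K, List<V>> batch,
            BiFunction<List<V>, Optional<S>, Optional<S>> update) {
        Map<K, S> next = new HashMap<>(state);
        for (Map.Entry<K, List<V>> e : batch.entrySet()) {
            Optional<S> prev = Optional.ofNullable(state.get(e.getKey()));
            Optional<S> out = update.apply(e.getValue(), prev);
            if (out.isPresent()) {
                next.put(e.getKey(), out.get());
            } else {
                next.remove(e.getKey()); // returning "none" drops the key
            }
        }
        return next;
    }
}
```

Once the key is dropped, it simply reappears as fresh state if new values for it arrive later.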
Hi All,
I am using Spark Streaming:

public class SparkStreaming {
    SparkConf sparkConf = new SparkConf().setAppName("Sales");
    JavaStreamingContext jssc =
        new JavaStreamingContext(sparkConf, new Duration(5000));
    String chkPntDir = ""; // get checkpoint dir
    jssc.checkpoint(chkPntDir);
    JavaSpark jSpark
Same issue. How did you solve it?
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/java-io-NotSerializableException-org-apache-spark-SparkEnv-tp10641p18047.html
--
I tried running it but it didn't work:

public static final SparkConf batchConf = new SparkConf();
String master = "spark://sivarani:7077";
String spark_home = "/home/sivarani/spark-1.0.2-bin-hadoop2/";
String jar = "/home/sivarani/build/Test.jar";
public static final Java
The problem is simple: I want to stream data 24/7, do some calculations, and
save the result in a CSV/JSON file so that I can use it for visualization with
dc.js/d3.js. I opted for Spark Streaming on a YARN cluster with Kafka and
tried running it 24/7, using groupByKey and updateStateByKey to have
Hi All,
I am using Spark Streaming with Kafka 24/7. My code is something like:

JavaDStream<String> data = messages.map(new MapData());
JavaPairDStream<String, Iterable<String>> records =
    data.mapToPair(new dataPair()).groupByKey(100);
records.print();
JavaPairDStream<String, Integer> result =
    records.mapValues(new Sum()).updateState
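The truncated pipeline groups each batch's records by key and then sums each key's values. Stripped of the Spark types, the per-batch computation that groupByKey followed by mapValues(new Sum()) performs looks like this (plain-Java sketch; the name sumByKey is hypothetical):

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class BatchSum {
    // Group (key, value) pairs by key and sum each key's values --
    // one micro-batch's worth of groupByKey(...).mapValues(new Sum()).
    public static Map<String, Integer> sumByKey(
            List<Map.Entry<String, Integer>> pairs) {
        Map<String, Integer> out = new HashMap<>();
        for (Map.Entry<String, Integer> p : pairs) {
            out.merge(p.getKey(), p.getValue(), Integer::sum);
        }
        return out;
    }
}
```

updateStateByKey then folds each batch's per-key sums into the running state across batches.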
Hi,
I know we can create a streaming context with new JavaStreamingContext(master,
appName, batchDuration, sparkHome, jarFile), but to run the application we
have to use spark-home/spark-submit --class NetworkCount. I want to skip
submitting manually; I wanted to invoke this Spark app when a conditio
I am having the same issue. I am using updateStateByKey, and over a period a
set of data will not change; I would like to save it and delete it from state.
Have you found the answer? Please share your views. Thanks for your time.
--
View this message in context:
http://apache-spark-user-list.100156
Hi tdas, is it possible to run Spark 24/7? I am using updateStateByKey and I
am streaming 3 lakh (300,000) records in half an hour, and I am not getting
the correct result. Also I am not able to run Spark Streaming 24/7: after a
few hours I get an ArrayIndexOutOfBoundsException even if I am not streaming
anything. By the way, will
Hi,
I am submitting my Spark application in the following fashion:

bin/spark-submit --class "NetworkCount" --master spark://abc.test.com:7077
try/simple-project/target/simple-project-1.0-jar-with-dependencies.jar

But is there any other way to submit a Spark application through code? Like,
for ex
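One way to drive the submission from code, in the Spark versions discussed here, is to build the same spark-submit command line and run it as a child process. A minimal sketch (paths, class name, and master URL are placeholders, not values from this thread):

```java
import java.io.IOException;
import java.util.Arrays;
import java.util.List;

public class Submitter {
    // Assemble the spark-submit command line as a list of arguments.
    public static List<String> command(String sparkHome, String mainClass,
                                       String master, String appJar) {
        return Arrays.asList(
                sparkHome + "/bin/spark-submit",
                "--class", mainClass,
                "--master", master,
                appJar);
    }

    // Launch spark-submit as a child process, forwarding its stdout/stderr.
    public static Process launch(String sparkHome, String mainClass,
                                 String master, String appJar) throws IOException {
        return new ProcessBuilder(command(sparkHome, mainClass, master, appJar))
                .inheritIO()
                .start();
    }
}
```

The calling program can then trigger launch(...) whenever its condition fires, and wait on the returned Process for the exit code.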
I am new to Spark, and I am using Spark Streaming with Kafka. My batch
duration is 1s. Assume I get 100 records in sec 1, 120 records in sec 2, and
80 records in sec 3:

--> {sec 1: 1, 2, ... 100} --> {sec 2: 1, 2, ... 120} --> {sec 3: 1, 2, ... 80}

I apply my logic in sec 1 and have a result => result1. I want to
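Each 1-second batch is processed independently, so combining result1 with later batches means carrying state forward (e.g. via updateStateByKey) rather than re-reading earlier batches. The shape of that, with the per-batch logic reduced to a record count, can be sketched in plain Java (hypothetical names, no Spark dependency):

```java
import java.util.List;

public class MicroBatch {
    // Apply the per-batch logic (here: count the records) to each batch in
    // turn and fold each batch's result into a running total -- the way
    // stateful streaming carries result1 from sec 1 into sec 2 and sec 3.
    public static int runningTotal(List<List<Integer>> batches) {
        int total = 0;
        for (List<Integer> batch : batches) {
            int result = batch.size(); // per-batch logic => resultN
            total += result;           // state carried across batches
        }
        return total;
    }
}
```

In the streaming job the fold lives in the state-update function instead of a loop, but the data flow is the same.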