Hi Xi Shen,
You could set spark.executor.memory in the code itself: new SparkConf().set("spark.executor.memory", "2g").
Or you can pass --executor-memory 2g (or --conf spark.executor.memory=2g) to spark-submit when submitting the jar.
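A minimal sketch of the first option, assuming a plain Spark application (the app name is illustrative):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// spark.executor.memory must be set before the SparkContext is created;
// changing the conf afterwards has no effect on already-running executors.
val conf = new SparkConf()
  .setAppName("MemoryConfigExample")   // illustrative name
  .set("spark.executor.memory", "2g")
val sc = new SparkContext(conf)
```

Note that this only works for settings the executors pick up at launch; in client mode the driver JVM is already running, so driver memory cannot be set this way.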
Regards
Jishnu Prathap
From: Akhil Das [mailto:ak...@sigmoidanalytics.com]
Sent: Monday, March 16,
import com.google.gson.{GsonBuilder, JsonParser}
import org.apache.spark.mllib.clustering.KMeans
import org.apache.spark.sql.SQLContext
import org.apache.spark.{SparkConf, SparkContext}
/**
* Examines the collected tweets and trains a model based on
Hi,
If your message is a String, you will have to change the Encoder and
Decoder to StringEncoder and StringDecoder.
If your message is byte[], you can use DefaultEncoder and DefaultDecoder.
Also don't forget to add the import statements for your encoder and decoder.
import
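For illustration, a sketch of a receiver-based Kafka stream using StringDecoder for both key and value (the ZooKeeper address, group id, and topic name are placeholders):

```scala
import kafka.serializer.StringDecoder
import org.apache.spark.SparkConf
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.kafka.KafkaUtils
import org.apache.spark.streaming.{Seconds, StreamingContext}

val ssc = new StreamingContext(new SparkConf().setAppName("KafkaDemo"), Seconds(10))
val kafkaParams = Map(
  "zookeeper.connect" -> "localhost:2181",  // placeholder
  "group.id"          -> "demo-group")      // placeholder

// The type parameters select the key/value decoders; use DefaultDecoder
// here instead if the payload is byte[].
val stream = KafkaUtils.createStream[String, String, StringDecoder, StringDecoder](
  ssc, kafkaParams, Map("demo-topic" -> 1), StorageLevel.MEMORY_AND_DISK_SER)
```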
Hi Akhil
Thanks for the response
Our use case is object detection in multiple videos. It's essentially searching
for an image in a video by matching the image against all the frames of
the video. I am able to do it in plain Java code using the OpenCV library now, but I
don't think it is scalable to
Hi
I am getting a stack overflow error:
Exception in thread "main" java.lang.StackOverflowError
at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
at
Hi
Thank you Akhil, it worked like a charm ☺
I had used the file writer outside rdd.foreach; that might have been the reason for the
NotSerializableException.
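The working pattern can be sketched like this: the writer is created inside the closure (here foreachPartition), so it is never serialized and shipped from the driver (the output path and sample data are illustrative):

```scala
import java.io.PrintWriter
import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf().setAppName("WriterDemo").setMaster("local[2]"))
val lines = sc.parallelize(Seq("a", "b", "c"))

lines.foreachPartition { iter =>
  // PrintWriter is not serializable, so construct it here, on the executor,
  // rather than on the driver where the closure is defined.
  val writer = new PrintWriter(s"/tmp/out-${java.util.UUID.randomUUID}.txt")
  try iter.foreach(writer.println) finally writer.close()
}
```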
Thanks & Regards
Jishnu Menath Prathap
From: Akhil Das [mailto:ak...@sigmoidanalytics.com]
Sent: Friday, November 21, 2014 1:15 PM
To: Jishnu Menath
Hi Akhil
Thanks for the reply.
But it creates different directories. I tried using FileWriter, but it shows a
non-serializable error.
val stream = TwitterUtils.createStream(ssc, None) //, filters)
val statuses = stream.map(
status => sentimentAnalyzer.findSentiment({
Hi
I am trying to run a basic Twitter stream program but am getting blank
output. Please correct me if I am missing something.
import org.apache.spark.SparkConf
import org.apache.spark.streaming.StreamingContext
import org.apache.spark.streaming.twitter.TwitterUtils
import
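A common cause of blank output is missing twitter4j OAuth credentials, or a StreamingContext that is never started. A minimal sketch of a working setup (the credential values are placeholders):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.twitter.TwitterUtils
import org.apache.spark.streaming.{Seconds, StreamingContext}

// twitter4j reads OAuth credentials from system properties.
System.setProperty("twitter4j.oauth.consumerKey", "<consumer-key>")
System.setProperty("twitter4j.oauth.consumerSecret", "<consumer-secret>")
System.setProperty("twitter4j.oauth.accessToken", "<access-token>")
System.setProperty("twitter4j.oauth.accessTokenSecret", "<access-token-secret>")

val ssc = new StreamingContext(
  new SparkConf().setAppName("TwitterDemo").setMaster("local[2]"), Seconds(10))
TwitterUtils.createStream(ssc, None).map(_.getText).print()
ssc.start()              // without start() the stream produces nothing
ssc.awaitTermination()
```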
Hi
I am getting the following error while running the
TwitterPopularTags example. I am using spark-1.1.0-bin-hadoop2.4.
jishnu@getafix:~/spark/bin$ run-example TwitterPopularTags *** ** ** *** **
Spark assembly has been built with Hive, including Datanucleus jars on classpath
Hi
Thanks Akhil, you saved the day. It's working perfectly.
Regards
Jishnu Menath Prathap
From: Akhil Das [mailto:ak...@sigmoidanalytics.com]
Sent: Thursday, November 13, 2014 3:25 PM
To: Jishnu Menath Prathap (WT01 - BAS)
Cc: Akhil [via Apache Spark User List];
Hi,
I am getting this weird error while starting the Worker.
-bash-4.1$ spark-class org.apache.spark.deploy.worker.Worker
spark://osebi-UServer:59468
Spark assembly has been built with Hive, including Datanucleus jars on classpath
14/09/24 16:22:04 INFO worker.Worker: Registered signal
No, I am not passing any argument.
I am getting this error while starting the Master.
The same Spark binary I am able to run on another machine with Ubuntu
installed.
Hi
Sorry for the repeated mails. My post was not accepted by the mailing list due
to some problem with postmas...@wipro.com, so I had to send it manually. It was
still not visible after half an hour, so I retried; but later all the posts became
visible. I deleted it from the page, but it was already