RE: How to set Spark executor memory?

2015-03-16 Thread jishnu.prathap
Hi Xi Shen, You could set spark.executor.memory in the code itself: new SparkConf().set("spark.executor.memory", "2g"). Or you can try --executor-memory 2g while submitting the jar. Regards Jishnu Prathap From: Akhil Das [mailto:ak...@sigmoidanalytics.com] Sent: Monday, March 16,
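A minimal sketch of the programmatic approach (the app name and master are placeholders; this only shows how the property is set, not a full job):

```scala
import org.apache.spark.SparkConf

object ExecutorMemoryExample {
  def main(args: Array[String]): Unit = {
    // Set executor memory on the SparkConf *before* creating the SparkContext --
    // once the context exists, this setting can no longer be changed.
    val conf = new SparkConf()
      .setAppName("executor-memory-demo") // placeholder app name
      .set("spark.executor.memory", "2g")
    println(conf.get("spark.executor.memory")) // prints 2g
  }
}
```

The equivalent at submit time would be `spark-submit --executor-memory 2g yourapp.jar`; the command-line flag takes effect even in client mode, where the driver JVM is already running when the conf is read.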

RE: Spark SQL Stackoverflow error

2015-03-10 Thread jishnu.prathap
import com.google.gson.{GsonBuilder, JsonParser} import org.apache.spark.mllib.clustering.KMeans import org.apache.spark.sql.SQLContext import org.apache.spark.{SparkConf, SparkContext} /** * Examines the collected tweets and trains a model based on

RE: Error KafkaStream

2015-02-05 Thread jishnu.prathap
Hi, If your message is a String you will have to change the Encoder and Decoder to StringEncoder, StringDecoder. If your message is byte[] you can use DefaultEncoder, DefaultDecoder. Also don't forget to add the import statements for your encoder and decoder. import
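A minimal sketch of a String-keyed, String-valued Kafka receiver with the Spark 1.x streaming API (the ZooKeeper address, group id, and topic name are placeholders):

```scala
import kafka.serializer.StringDecoder
import org.apache.spark.SparkConf
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object KafkaStringStream {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("kafka-string-demo")
    val ssc = new StreamingContext(conf, Seconds(10))
    // Placeholder connection settings -- adjust to your cluster.
    val kafkaParams = Map(
      "zookeeper.connect" -> "localhost:2181",
      "group.id"          -> "demo-group")
    val topics = Map("my-topic" -> 1) // topic name -> number of receiver threads
    // StringDecoder turns the raw byte[] payload back into a String; using
    // DefaultDecoder here instead would hand you the untouched byte[].
    val stream = KafkaUtils.createStream[String, String, StringDecoder, StringDecoder](
      ssc, kafkaParams, topics, StorageLevel.MEMORY_AND_DISK_SER)
    stream.map(_._2).print() // _._2 is the message value
    ssc.start()
    ssc.awaitTermination()
  }
}
```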

RE: How to integrate Spark with OpenCV?

2015-01-14 Thread jishnu.prathap
Hi Akhil Thanks for the response. Our use case is object detection in multiple videos. It's essentially searching for an image in a video by matching the image against every frame. I am able to do it in normal Java code using the OpenCV lib now, but I don't think it is scalable to
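One way to parallelize the frame-matching step is to treat pre-extracted frame files as an RDD and filter them against a broadcast template. The sketch below is purely illustrative: `matchesTemplate` is a hypothetical stand-in for a real OpenCV call (e.g. template matching), and the HDFS path and template bytes are placeholders.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object FrameSearch {
  // Hypothetical matcher: in a real job this would decode the frame and
  // call into OpenCV; here it is a trivial byte-for-byte comparison.
  def matchesTemplate(frame: Array[Byte], template: Array[Byte]): Boolean =
    java.util.Arrays.equals(frame, template)

  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("frame-search"))
    val template = sc.broadcast(Array[Byte](1, 2, 3)) // placeholder image bytes
    // One record per frame file; frame extraction is assumed to have
    // happened upstream (e.g. with ffmpeg), since video decoding itself
    // does not parallelize naturally across a cluster.
    val hits = sc.binaryFiles("hdfs:///frames/*")
      .filter { case (_, stream) => matchesTemplate(stream.toArray, template.value) }
      .keys
    hits.collect().foreach(println) // paths of frames that matched
  }
}
```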

Stack overflow Error while executing spark SQL

2014-12-09 Thread jishnu.prathap
Hi I am getting a stack overflow error. Exception in main java.lang.StackOverflowError at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222) at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254) at
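The trace points at the parser-combinator based SQL parser, which recurses once per nested clause, so a common workaround (not a fix to the query itself) is to give the driver JVM a larger thread stack. A minimal sketch, assuming the stack size value is something you would tune:

```scala
import org.apache.spark.SparkConf

object StackSizeConf {
  def main(args: Array[String]): Unit = {
    // A deeply nested or very long SQL string can exhaust the default JVM
    // thread stack inside the recursive-descent parser; -Xss raises it.
    val conf = new SparkConf()
      .setAppName("sql-stack-demo")
      .set("spark.driver.extraJavaOptions", "-Xss4m") // 4 MB stack, tune as needed
    println(conf.get("spark.driver.extraJavaOptions"))
  }
}
```

Note that in client mode the driver JVM is already running when the conf is read, so the equivalent `spark-submit --driver-java-options "-Xss4m"` is the more reliable way to apply it.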

RE: Persist streams to text files

2014-11-21 Thread jishnu.prathap
Hi Thank you Akhil, it worked like a charm. I used the file writer outside rdd.foreach; that might be the reason for the non-serializable exception. Thanks Regards Jishnu Menath Prathap From: Akhil Das [mailto:ak...@sigmoidanalytics.com] Sent: Friday, November 21, 2014 1:15 PM To: Jishnu Menath

RE: Persist streams to text files

2014-11-20 Thread jishnu.prathap
Hi Akhil Thanks for the reply, but it creates different directories. I tried using a FileWriter but it shows a non-serializable error. val stream = TwitterUtils.createStream(ssc, None) //, filters) val statuses = stream.map(status => sentimentAnalyzer.findSentiment({
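Both symptoms in this thread have a common cause: `saveAsTextFiles` writes one directory per batch, and a `FileWriter` created in the driver gets captured by the closure and shipped to executors, where it fails serialization. A sketch of the usual pattern, creating the writer inside the partition closure (the socket source and file-name prefix are placeholders):

```scala
import java.io.{File, FileWriter}
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object PersistToText {
  def main(args: Array[String]): Unit = {
    val ssc = new StreamingContext(new SparkConf().setAppName("persist-demo"), Seconds(10))
    val lines = ssc.socketTextStream("localhost", 9999) // placeholder source
    lines.foreachRDD { rdd =>
      rdd.foreachPartition { iter =>
        // The writer is created *inside* the partition closure, so it is
        // never shipped over the network and never needs to be serializable.
        val writer = new FileWriter(File.createTempFile("statuses-", ".txt"), true)
        try iter.foreach(line => writer.write(line + "\n"))
        finally writer.close()
      }
    }
    ssc.start()
    ssc.awaitTermination()
  }
}
```

Writing local files per executor only makes sense on a single machine or shared filesystem; on a real cluster the per-batch directories of `saveAsTextFiles` are the intended behavior.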

basic twitter stream program not working.

2014-11-13 Thread jishnu.prathap
Hi I am trying to run a basic Twitter stream program but am getting blank output. Please correct me if I am missing something. import org.apache.spark.SparkConf import org.apache.spark.streaming.StreamingContext import org.apache.spark.streaming.twitter.TwitterUtils import
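A minimal working sketch of such a program. The two classic causes of blank output are missing twitter4j OAuth properties and running with a single local thread (the receiver occupies it, so nothing is left to process batches); the credential values below are placeholders.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.twitter.TwitterUtils

object BasicTwitterStream {
  def main(args: Array[String]): Unit = {
    // twitter4j reads credentials from system properties; with missing or
    // invalid keys the stream silently produces nothing.
    System.setProperty("twitter4j.oauth.consumerKey", "YOUR_CONSUMER_KEY")
    System.setProperty("twitter4j.oauth.consumerSecret", "YOUR_CONSUMER_SECRET")
    System.setProperty("twitter4j.oauth.accessToken", "YOUR_ACCESS_TOKEN")
    System.setProperty("twitter4j.oauth.accessTokenSecret", "YOUR_ACCESS_TOKEN_SECRET")

    // local[2]: one thread for the receiver, at least one for processing --
    // with local[1] the receiver starves the job and output stays blank.
    val conf = new SparkConf().setMaster("local[2]").setAppName("twitter-demo")
    val ssc = new StreamingContext(conf, Seconds(10))
    TwitterUtils.createStream(ssc, None).map(_.getText).print()
    ssc.start()
    ssc.awaitTermination()
  }
}
```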

runexample TwitterPopularTags showing Class Not found error

2014-11-13 Thread jishnu.prathap
Hi I am getting the following error while running the TwitterPopularTags example. I am using spark-1.1.0-bin-hadoop2.4. jishnu@getafix:~/spark/bin$ run-example TwitterPopularTags Spark assembly has been built with Hive, including Datanucleus jars on classpath

RE: basic twitter stream program not working.

2014-11-13 Thread jishnu.prathap
Hi Thanks Akhil, you saved the day. It's working perfectly. Regards Jishnu Menath Prathap From: Akhil Das [mailto:ak...@sigmoidanalytics.com] Sent: Thursday, November 13, 2014 3:25 PM To: Jishnu Menath Prathap (WT01 - BAS) Cc: Akhil [via Apache Spark User List];

Re: java.lang.NumberFormatException while starting spark-worker

2014-09-24 Thread jishnu.prathap
Hi, I am getting this weird error while starting the Worker. -bash-4.1$ spark-class org.apache.spark.deploy.worker.Worker spark://osebi-UServer:59468 Spark assembly has been built with Hive, including Datanucleus jars on classpath 14/09/24 16:22:04 INFO worker.Worker: Registered signal

Re: java.lang.NumberFormatException while starting spark-worker

2014-09-24 Thread jishnu.prathap
No, I am not passing any argument. I am getting this error while starting the Master. The same Spark binary runs fine on another machine (Ubuntu).

RE: java.lang.NumberFormatException while starting spark-worker

2014-09-24 Thread jishnu.prathap
Hi Sorry for the repeated mails. My post was not accepted by the mailing list due to some problem with postmas...@wipro.com, so I had to send it manually. Still it was not visible for half an hour, so I retried. But later all the posts became visible. I deleted them from the page but it was already