Application kill from UI does not propagate exception

2017-03-27 Thread Noorul Islam K M
Hi all, I am trying to trap the UI kill event of a Spark application from the driver. Somehow the exception thrown is not propagated to the driver main program. See for example using spark-shell below. Is there a way to get hold of this event and shut down the driver program? Regards, Noorul
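A minimal sketch of one possible workaround, assuming Spark 2.x: instead of waiting for an exception in the driver's main thread, register a SparkListener and react to the application-end event. The listener hook is a real API; whether a UI kill reliably fires it is an assumption here, not something confirmed in this thread.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.scheduler.{SparkListener, SparkListenerApplicationEnd}

    object KillAwareDriver {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("KillAwareDriver"))

        // React to the application ending (which a UI kill should trigger)
        // instead of expecting an exception in the main thread.
        sc.addSparkListener(new SparkListener {
          override def onApplicationEnd(end: SparkListenerApplicationEnd): Unit = {
            // Clean up external resources here, then stop the driver JVM.
            sys.exit(1)
          }
        })

        // ... normal job logic ...
      }
    }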

This is a test mail, please ignore!

2017-03-27 Thread Noorul Islam K M
Sending plain text mail to test whether my mail appears in the list.

Re: spark jobserver

2017-03-05 Thread Noorul Islam K M
A better forum would be https://groups.google.com/forum/#!forum/spark-jobserver or https://gitter.im/spark-jobserver/spark-jobserver Regards, Noorul Madabhattula Rajesh Kumar writes: > Hi, I am getting the below exception when I start the job-server

Re: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources

2017-03-03 Thread Noorul Islam K M
> When initial jobs have not accepted any resources, what could be wrong? Going through Stack Overflow and various blogs does not help. Maybe we need better logging for this? Adding dev. Did you take a look at the Spark UI to see your resource availability? Thanks and Regards Noorul
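For what it's worth, this error typically means the application is requesting more memory or cores than any worker has free. A minimal standalone-mode sketch; the specific values are placeholders, not from the thread:

    import org.apache.spark.{SparkConf, SparkContext}

    // Keep the requests within what the cluster UI reports as free.
    val conf = new SparkConf()
      .setAppName("ResourceCheck")
      .set("spark.executor.memory", "1g") // must fit in a single worker's free memory
      .set("spark.cores.max", "2")        // must not exceed the cluster's free cores
    val sc = new SparkContext(conf)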

Message loss in streaming even with graceful shutdown

2017-02-20 Thread Noorul Islam K M
Hi all, I have a streaming application with a batch interval of 10 seconds.

    val sparkConf = new SparkConf().setAppName("RMQWordCount")
      .set("spark.streaming.stopGracefullyOnShutdown", "true")
    val ssc = new StreamingContext(sparkConf, Seconds(10))

I also use the reduceByKeyAndWindow() API
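A sketch of how the rest of such an app might look, continuing from the snippet above (the stream source and checkpoint path are placeholders; the original post reads from RabbitMQ). Note that reduceByKeyAndWindow() with an inverse reduce function additionally requires checkpointing:

    ssc.checkpoint("/tmp/rmq-checkpoint") // required when an inverse reduce function is used

    // Placeholder source; the original uses a RabbitMQ receiver.
    val words = ssc.socketTextStream("localhost", 9999).flatMap(_.split(" "))
    val counts = words.map(w => (w, 1)).reduceByKeyAndWindow(
      (a: Int, b: Int) => a + b,  // add counts entering the window
      (a: Int, b: Int) => a - b,  // subtract counts leaving the window
      Seconds(30), Seconds(10))
    counts.print()

    ssc.start()
    ssc.awaitTermination()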

Re: installing spark-jobserver on cdh 5.7 and yarn

2016-11-09 Thread Noorul Islam K M
Reza zade writes: > Hi, I have set up a Cloudera cluster and work with Spark. I want to install spark-jobserver on it. What should I do? Maybe you should send this to the spark-jobserver mailing list. https://github.com/spark-jobserver/spark-jobserver#contact Thanks and

Re: spark 2.0 home brew package missing

2016-08-26 Thread Noorul Islam K M
kalkimann writes: > Hi, Spark 1.6.2 is the latest brew package I can find. The Spark 2.0.x brew package is missing, as best I know. Is there a schedule for when spark-2.0 will be available for "brew install"? Did you do a 'brew update' before searching? I installed

Testing --supervise flag

2016-08-01 Thread Noorul Islam K M
Hi all, I was trying to test the --supervise flag of spark-submit. The documentation [1] says that the flag helps in restarting your application automatically if it exited with a non-zero exit code. I am looking for some clarification on that documentation. In this context, does application mean
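One way to test it, assuming "application" here means the driver process in cluster mode: submit a trivial driver that exits non-zero and watch whether the master relaunches it. A hypothetical sketch:

    // Submit in cluster mode with --supervise; a non-zero exit should
    // cause the standalone master to relaunch this driver.
    object SuperviseTest {
      def main(args: Array[String]): Unit = {
        println("driver started at " + System.currentTimeMillis())
        sys.exit(1) // simulate a failure
      }
    }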

When worker is killed driver continues to run causing issues in supervise mode

2016-07-13 Thread Noorul Islam K M
Spark version: 1.6.1. Cluster manager: Standalone. I am experimenting with cluster mode deployment along with supervise for high availability of streaming applications. 1. Submit a streaming job in cluster mode with supervise. 2. Say that the driver is scheduled on worker1. The app started

Stage shows incorrect output size

2016-01-26 Thread Noorul Islam K M
Hi all, I am trying to copy data from one Cassandra cluster to another using Spark + the Cassandra connector. At the source I have around 200 GB of data, but while running, the Spark stage shows the output as 406 GB and the data is still getting copied. I wonder why it is showing such a high number.
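For reference, a minimal sketch of the cross-cluster copy pattern with the spark-cassandra-connector, assuming an existing SparkContext sc and placeholder hosts and keyspace/table names; the implicit CassandraConnector in scope determines which cluster each operation talks to:

    import com.datastax.spark.connector._
    import com.datastax.spark.connector.cql.CassandraConnector

    val srcConf = sc.getConf.set("spark.cassandra.connection.host", "source-host")
    val dstConf = sc.getConf.set("spark.cassandra.connection.host", "dest-host")

    // Read from the source cluster...
    val rows = {
      implicit val c = CassandraConnector(srcConf)
      sc.cassandraTable("ks", "table")
    }
    // ...and write to the destination cluster.
    {
      implicit val c = CassandraConnector(dstConf)
      rows.saveToCassandra("ks", "table")
    }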

Re: error writing to stdout

2015-12-21 Thread Noorul Islam K M
carlilek writes: > My users use Spark 1.5.1 in standalone mode on an HPC cluster, with a smattering still using 1.4.0. I have been getting reports of errors like this: 15/12/21 15:40:33 ERROR FileAppender: Error writing stream to file

Re: Cassandra Connection Issue with Spark-jobserver

2015-04-27 Thread Noorul Islam K M
Are you using DSE Spark? If so, are you pointing spark-jobserver to use DSE Spark? Thanks and Regards Noorul Anand anand.vi...@monotype.com writes: *I am new to the Spark world and Job Server. My code:* package spark.jobserver import java.nio.ByteBuffer import
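For context, the skeleton of a classic spark-jobserver job looks roughly like this (the old spark.jobserver.SparkJob API of that era; names are placeholders), and the Cassandra connection host must be set in the Spark config the jobserver uses:

    import com.typesafe.config.Config
    import org.apache.spark.SparkContext
    import spark.jobserver.{SparkJob, SparkJobValid, SparkJobValidation}

    object CassandraTestJob extends SparkJob {
      override def validate(sc: SparkContext, config: Config): SparkJobValidation = SparkJobValid
      override def runJob(sc: SparkContext, config: Config): Any = {
        // Use the spark-cassandra-connector here; spark.cassandra.connection.host
        // must point at the (DSE) cluster in the context the jobserver created.
        "ok"
      }
    }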

Re: failed to launch workers on spark

2015-03-27 Thread Noorul Islam K M
mas mas.ha...@gmail.com writes: Hi all! I am trying to install Spark on my standalone machine. I am able to run the master, but when I try to run the slaves it gives me the following error. Any help in this regard will be highly appreciated.

Re: What is best way to run spark job in yarn-cluster mode from java program(servlet container) and NOT using spark-submit command.

2015-03-26 Thread Noorul Islam K M
Sandy Ryza sandy.r...@cloudera.com writes: Creating a SparkContext and setting the master as yarn-cluster unfortunately will not work. SPARK-4924 added APIs for doing this in Spark, but they won't be included until 1.4. -Sandy Did you look into something like [1]? With that you can make REST API
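The SPARK-4924 work shipped in Spark 1.4 as the SparkLauncher API, which runs spark-submit under the hood from a regular JVM program. A minimal sketch; the jar path and class name are placeholders:

    import org.apache.spark.launcher.SparkLauncher

    val process = new SparkLauncher()
      .setAppResource("/path/to/app.jar")  // placeholder application jar
      .setMainClass("com.example.MyApp")   // placeholder main class
      .setMaster("yarn-cluster")
      .launch()
    process.waitFor()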

Re: Combining Many RDDs

2015-03-26 Thread Noorul Islam K M
had other performance issues with spark cassandra connector. Thanks and Regards Noorul On Thu, Mar 26, 2015 at 1:13 PM, Noorul Islam K M noo...@noorul.com wrote: sparkx y...@yang-cs.com writes: Hi, I have a Spark job and a dataset of 0.5 Million items. Each item performs some sort

Re: Combining Many RDDs

2015-03-26 Thread Noorul Islam K M
sparkx y...@yang-cs.com writes: Hi, I have a Spark job and a dataset of 0.5 million items. Each item performs some sort of computation (joining a shared external dataset, if that matters) and produces an RDD containing 20-500 result items. Now I would like to combine all these RDDs and
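One commonly suggested approach for this pattern (not necessarily the reply given in this thread): collect the per-item RDDs and combine them with a single SparkContext.union() call rather than chaining union() in a loop, which builds a very deep lineage; then reduce once over the result. Names here are illustrative:

    import org.apache.spark.rdd.RDD

    // itemResults: one RDD of (key, value) pairs per input item;
    // items and computeForItem are placeholders for the poster's logic.
    val itemResults: Seq[RDD[(String, Int)]] = items.map(computeForItem)

    // A single n-way union avoids the deep lineage of pairwise unions.
    val combined = sc.union(itemResults).reduceByKey(_ + _)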

Re: What is the ideal method to interact with Spark Cluster from a Cloud App?

2015-03-26 Thread Noorul Islam K M
-you-run-your-spark-app-tp7935p7958.html Noorul Islam K M noo...@noorul.com writes: Hi all, We have a cloud application, to which we are adding a reporting service. For this we have narrowed down to use Cassandra + Spark for data store and processing respectively. Since cloud application

What is the ideal method to interact with Spark Cluster from a Cloud App?

2015-03-24 Thread Noorul Islam K M
Hi all, We have a cloud application, to which we are adding a reporting service. For this we have narrowed down to use Cassandra + Spark for data store and processing respectively. Since the cloud application is separate from the Cassandra + Spark deployment, what is the ideal method to interact with Spark
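One common answer to this kind of question is to front the cluster with spark-jobserver and have the cloud app talk to it over REST. A hypothetical sketch of the client side; the host, app name, and job class are placeholders (jobserver listens on port 8090 by default):

    import java.net.{HttpURLConnection, URL}
    import scala.io.Source

    object JobserverClient {
      def main(args: Array[String]): Unit = {
        // POST /jobs starts a job class from a previously uploaded jar.
        val url = new URL("http://jobserver-host:8090/jobs?appName=reports&classPath=com.example.ReportJob")
        val conn = url.openConnection().asInstanceOf[HttpURLConnection]
        conn.setRequestMethod("POST")
        conn.setDoOutput(true)
        conn.getOutputStream.close() // empty body; job input could be sent as config here
        println(Source.fromInputStream(conn.getInputStream).mkString) // JSON with job id/status
      }
    }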