Re: Wish for 1.4: upper bound on # tasks in Mesos

2015-08-11 Thread Haripriya Ayyalasomayajula
of tasks for jobs run on Mesos? This would be a very simple yet effective way to prevent a job dominating the cluster. cheers, Tom -- Regards, Haripriya Ayyalasomayajula
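
A note on the knob being asked for: in Spark 1.x the usual (if blunt) lever is `spark.cores.max`, which caps the total cores a single application may hold on Mesos or standalone, and with one task per core that also bounds how many tasks the job can run at once. A minimal, illustrative `spark-defaults.conf` fragment (the value 64 is made up):

```properties
# Caps the total cores one application may hold across the cluster;
# with one task per core this also bounds concurrent tasks.
spark.cores.max  64
```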

Re: Controlling number of executors on Mesos vs YARN

2015-08-11 Thread Haripriya Ayyalasomayajula
...@spark.apache.org -- Regards, Haripriya Ayyalasomayajula

Re: Controlling number of executors on Mesos vs YARN

2015-08-11 Thread Haripriya Ayyalasomayajula
chiling...@gmail.com wrote: My experience with Mesos + Spark is not great. I saw one executor with 30 CPUs and the other executor with 6. So I don't think you can easily configure it without some tweaking of the source code. Sent from my iPad On 2015-08-11, at 2:38, Haripriya Ayyalasomayajula
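
For context on the difference the thread is about, a sketch of the relevant settings (values illustrative, not recommendations). On YARN the executor count and size are explicit; on Mesos in the Spark 1.x era there was no direct executor-count knob, and the usual lever in coarse-grained mode was a total-core cap:

```properties
# YARN: explicit executor count and size
spark.executor.instances  4
spark.executor.cores      2

# Mesos (coarse-grained): no per-executor count knob in 1.x;
# cap the application's total cores instead
spark.mesos.coarse  true
spark.cores.max     8
```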

Spark on Mesos - Shut down failed while running spark-shell

2015-07-27 Thread Haripriya Ayyalasomayajula
-- Regards, Haripriya Ayyalasomayajula

Re: Job submission API

2015-04-07 Thread HARIPRIYA AYYALASOMAYAJULA
/apache/hadoop/mapreduce/Job.html#submit() to submit Spark jobs to a Yarn cluster? I see in the examples that bin/spark-submit is what's out there, but couldn't find any APIs around it. Thanks, Prashant -- Regards vybs -- Regards, Haripriya Ayyalasomayajula Graduate Student Department

Error while installing Spark 1.3.0 on local machine

2015-03-21 Thread HARIPRIYA AYYALASOMAYAJULA
help. Thank you for your time. -- Regards, Haripriya Ayyalasomayajula Graduate Student Department of Computer Science University of Houston Contact : 650-796-7112

Re: Problem connecting to HBase

2015-03-16 Thread HARIPRIYA AYYALASOMAYAJULA
AM, HARIPRIYA AYYALASOMAYAJULA aharipriy...@gmail.com wrote: Hello all, Thank you for your responses. I did try to include the zookeeper.znode.parent property in the hbase-site.xml. It still continues to give the same error. I am using Spark 1.2.0 and hbase 0.98.9. Could you please

Re: Problem connecting to HBase

2015-03-15 Thread HARIPRIYA AYYALASOMAYAJULA
are you using? I assume it contains SPARK-1297 Cheers On Fri, Mar 13, 2015 at 7:47 PM, HARIPRIYA AYYALASOMAYAJULA aharipriy...@gmail.com wrote: Hello, I am running an HBase test case. I am using the example from the following: https://github.com/apache/spark/blob/master/examples/src
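
The `zookeeper.znode.parent` fix mentioned above goes in the client's `hbase-site.xml`; a minimal fragment (`/hbase` is HBase's default and must match what the cluster actually uses):

```xml
<property>
  <name>zookeeper.znode.parent</name>
  <!-- must match the znode parent configured on the HBase cluster -->
  <value>/hbase</value>
</property>
```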

Re: Installing Spark Standalone to a Cluster

2015-01-23 Thread HARIPRIYA AYYALASOMAYAJULA
-mail: user-unsubscr...@spark.apache.org For additional commands, e-mail: user-h...@spark.apache.org -- Regards, Haripriya Ayyalasomayajula Graduate Student Department of Computer Science University of Houston Contact : 650-796-7112

job works well on small data set but fails on large data set

2014-11-06 Thread HARIPRIYA AYYALASOMAYAJULA
it to Double and on the large file it works till I get the mapOutput. But when I include the remaining part, it fails. Can someone please help me understand where I am going wrong? Thank you for your time. -- Regards, Haripriya Ayyalasomayajula Graduate Student Department of Computer Science

Function returning multiple Values - problem with using if-else

2014-10-24 Thread HARIPRIYA AYYALASOMAYAJULA
the same but I'm still not clear how it works in Scala/Spark. Thank you for your time. -- Regards, Haripriya Ayyalasomayajula Graduate Student Department of Computer Science University of Houston Contact : 650-796-7112

Re: Function returning multiple Values - problem with using if-else

2014-10-24 Thread HARIPRIYA AYYALASOMAYAJULA
,j)) } } On Fri, Oct 24, 2014 at 8:52 PM, HARIPRIYA AYYALASOMAYAJULA aharipriy...@gmail.com wrote: Hello, My map function will call the following function (inc) which should yield multiple values: def inc(x:Int, y:Int) = { if(condition) { for(i <- 0 to 7) yield(x, y+i
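
The truncated `inc` above trips on a common Scala issue: both arms of the if/else must produce the same type. A runnable plain-Scala sketch of the pattern (the condition `x > 0` and the inputs are invented for illustration; on an RDD the same function would be used with `flatMap`):

```scala
object MultiYield {
  // Hypothetical reconstruction of the thread's inc function; the
  // condition (x > 0) is invented. Both branches return a Seq, so
  // the if/else type-checks.
  def inc(x: Int, y: Int): Seq[(Int, Int)] =
    if (x > 0) for (i <- 0 to 7) yield (x, y + i)
    else Seq.empty

  // flatMap flattens the per-element sequences into one collection,
  // which is how a map-side "emit many values" is written in Spark too.
  val out: Seq[(Int, Int)] =
    Seq((1, 10), (-1, 20)).flatMap { case (x, y) => inc(x, y) }
}
```

The key design point: returning an empty `Seq` from the else arm lets one element emit zero results, something a plain `map` cannot express.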

How to emit multiple keys for the same value?

2014-10-20 Thread HARIPRIYA AYYALASOMAYAJULA
if someone can suggest possible ways to do it. Thanks in advance. -- Regards, Haripriya Ayyalasomayajula
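
One common answer to this question is `flatMap`: have each record emit a sequence of (key, value) pairs. A small plain-Scala sketch (the prefix keys are invented purely for illustration; on an RDD the same `flatMap` call applies):

```scala
object MultiKey {
  // Each record emits several (key, value) pairs; here every prefix
  // of a word becomes a key for that word (an invented example).
  def emitKeys(word: String): Seq[(String, String)] =
    (1 to word.length).map(n => (word.take(n), word))

  val pairs: Seq[(String, String)] = Seq("spark", "sun").flatMap(emitKeys)

  // Grouping afterwards mirrors what groupByKey would do on an RDD.
  val grouped: Map[String, Seq[String]] =
    pairs.groupBy(_._1).map { case (k, kvs) => (k, kvs.map(_._2)) }
}
```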

Re: Spark can't find jars

2014-10-13 Thread HARIPRIYA AYYALASOMAYAJULA
BUSINESS.* *E*: ji...@sellpoints.com *M*: *510.303.7751* On Mon, Oct 13, 2014 at 5:39 PM, HARIPRIYA AYYALASOMAYAJULA aharipriy...@gmail.com wrote: Hello, Can you check if the jar

Re: Spark can't find jars

2014-10-13 Thread HARIPRIYA AYYALASOMAYAJULA
Or if it has something to do with the way you package your files - try an alternative packaging method and see if it works On Monday, October 13, 2014, HARIPRIYA AYYALASOMAYAJULA aharipriy...@gmail.com wrote: Well in the cluster, can you try copying the entire folder and then run? For example my
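
For the record, the other common fix when executors cannot find classes is to ship the jars explicitly at submit time; an illustrative `spark-submit` invocation (all paths and the class name are made up):

```shell
spark-submit \
  --class com.example.MyApp \
  --jars /path/to/dep1.jar,/path/to/dep2.jar \
  /path/to/app.jar
```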

Re: Can I run examples on cluster?

2014-10-10 Thread HARIPRIYA AYYALASOMAYAJULA
- To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional commands, e-mail: user-h...@spark.apache.org -- Regards, Haripriya Ayyalasomayajula

Re: How does the Spark Accumulator work under the covers?

2014-10-10 Thread HARIPRIYA AYYALASOMAYAJULA
something like myRdd.map(x => sum += x) is “sum” being accumulated locally in any way, for each element or partition or node? Is “sum” a broadcast variable? Or does it only exist on the driver node? How does the driver node get access to the “sum”? Thanks, Areg -- Regards, Haripriya
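
The semantics being asked about can be modeled without Spark: each task accumulates into a task-local copy, and the driver merges the per-task results as tasks finish; a task never observes the global value. A toy plain-Scala model (not the Spark API; the partition data is invented):

```scala
object AccumulatorModel {
  // Toy model of accumulator semantics: three "partitions", one per task.
  val partitions: Seq[Seq[Int]] = Seq(Seq(1, 2, 3), Seq(4, 5), Seq(6))

  // Per-task local accumulation, as each executor would do; no task
  // sees more than its own partial sum.
  val localSums: Seq[Int] = partitions.map(_.sum)

  // Driver-side merge of the per-task results; only here does the
  // full total exist, which is why only the driver can read it.
  val driverTotal: Int = localSums.sum
}
```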

Re: Help with using combineByKey

2014-10-10 Thread HARIPRIYA AYYALASOMAYAJULA
similar behavior with combineByKey(), will be faster than the groupByKey() version. On Thu, Oct 9, 2014 at 9:28 PM, HARIPRIYA AYYALASOMAYAJULA aharipriy...@gmail.com wrote: Sean, Thank you. It works. But I am still confused about the function. Can you kindly throw some light on it? I was going

Re: Help with using combineByKey

2014-10-09 Thread HARIPRIYA AYYALASOMAYAJULA
of nonzero values, but you initialize the first element of the pair to v, not 1, in v => (v,1). Try v => (1,1) On Thu, Oct 9, 2014 at 4:47 PM, HARIPRIYA AYYALASOMAYAJULA aharipriy...@gmail.com wrote: I am a beginner to Spark and finding it difficult to implement a very simple reduce operation. I

Re: Help with using combineByKey

2014-10-09 Thread HARIPRIYA AYYALASOMAYAJULA
wrote: Oh duh, sorry. The initialization should of course be (v) => (if (v > 0) 1 else 0, 1) This gives the answer you are looking for. I don't see what Part2 is supposed to do differently. On Thu, Oct 9, 2014 at 6:14 PM, HARIPRIYA AYYALASOMAYAJULA aharipriy...@gmail.com wrote: Hello Sean
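
Putting the thread's three functions together: a plain-Scala model of what `combineByKey` computes here, counting nonzero values per key alongside the total count (the data is invented; in Spark this would be `rdd.combineByKey(createCombiner, mergeValue, mergeCombiners)`):

```scala
object CombineByKeyModel {
  // The three functions combineByKey takes, reconstructed from the
  // thread. The pair is (nonzero values seen, total values seen).
  val createCombiner: Int => (Int, Int) =
    v => (if (v > 0) 1 else 0, 1)
  val mergeValue: ((Int, Int), Int) => (Int, Int) =
    (acc, v) => (acc._1 + (if (v > 0) 1 else 0), acc._2 + 1)
  val mergeCombiners: ((Int, Int), (Int, Int)) => (Int, Int) =
    (a, b) => (a._1 + b._1, a._2 + b._2)

  // Invented sample data; folding the same functions over a local
  // collection shows the per-key semantics Spark would apply.
  val data = Seq(("a", 3), ("a", 0), ("b", 5), ("a", 7))
  val combined: Map[String, (Int, Int)] =
    data.groupBy(_._1).map { case (k, kvs) =>
      val vs = kvs.map(_._2)
      k -> vs.tail.foldLeft(createCombiner(vs.head))(mergeValue)
    }
}
```

In a real cluster `mergeCombiners` reconciles the partial pairs produced on different partitions; locally each key fits in one group, so it is only exercised directly.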