Re: Controlling number of executors on Mesos vs YARN

2015-08-11 Thread Haripriya Ayyalasomayajula
…Lam wrote:
> My experience with Mesos + Spark is not great. I saw one executor with 30 CPUs and the other executor with 6. So I don't think you can easily configure it without some tweaking of the source code.
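For context, in coarse-grained Mesos mode of that era, Spark launched one executor per slave and let it grab whatever cores the offer contained, capped only by the job-wide total, which would explain a lopsided 30/6 split like the one above. A minimal sketch of the relevant configuration (master URL and values are assumptions, not from the thread):

    import org.apache.spark.SparkConf

    // spark.cores.max caps the total cores across the whole job;
    // per-executor sizing on Mesos was not directly configurable then.
    val conf = new SparkConf()
      .setAppName("MesosCoreCap")
      .setMaster("mesos://zk://zk1:2181/mesos") // hypothetical master URL
      .set("spark.mesos.coarse", "true")
      .set("spark.cores.max", "36")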

Re: Controlling number of executors on Mesos vs YARN

2015-08-10 Thread Haripriya Ayyalasomayajula
…View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Controlling-number-of-executors-on-Mesos-vs-YARN-tp20966.html

Re: Wish for 1.4: upper bound on # tasks in Mesos

2015-08-10 Thread Haripriya Ayyalasomayajula
On May 19, 2015, at 12:39 PM, Thomas Dudziak wrote:
> I read the other day that there will be a fair number of improvements in 1.4 for Mesos. Could I ask for one more (if it isn't already in there): a configurable limit on the number of tasks for jobs run on Mesos? This would be a very simple yet effective way to prevent a job from dominating the cluster.
> cheers,
> Tom

Spark on Mesos - Shut down failed while running spark-shell

2015-07-27 Thread Haripriya Ayyalasomayajula
…spark-bfd6c444-5346-4315-9501-1baed4d500de

Re: Job submission API

2015-04-07 Thread HARIPRIYA AYYALASOMAYAJULA
…Just had a quick question: is there a job submission API, such as Hadoop's (https://hadoop.apache.org/docs/r2.3.0/api/org/apache/hadoop/mapreduce/Job.html#submit()), for submitting Spark jobs to a YARN cluster? I see in the examples that bin/spark-submit…
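For what it's worth, Spark did gain a programmatic launcher (org.apache.spark.launcher.SparkLauncher, I believe in the 1.4 timeframe, shortly after this thread). A minimal sketch, with paths, class name, and master URL as assumptions:

    import org.apache.spark.launcher.SparkLauncher

    // All values here are hypothetical placeholders.
    val process = new SparkLauncher()
      .setSparkHome("/opt/spark")              // assumed install location
      .setAppResource("/path/to/my-app.jar")   // hypothetical application jar
      .setMainClass("com.example.MyApp")       // hypothetical main class
      .setMaster("yarn-cluster")
      .launch()                                // returns a java.lang.Process
    process.waitFor()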

Error while installing Spark 1.3.0 on local machine

2015-03-21 Thread HARIPRIYA AYYALASOMAYAJULA
…greatly appreciate any help. Thank you for your time.

Re: Problem connecting to HBase

2015-03-16 Thread HARIPRIYA AYYALASOMAYAJULA
…is a classpath issue. On Sun, Mar 15, 2015 at 10:04 AM, HARIPRIYA AYYALASOMAYAJULA wrote:
> Hello all, thank you for your responses. I did try to include the zookeeper.znode.parent property in the hbase…
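For anyone hitting the same thing, a minimal sketch of wiring zookeeper.znode.parent into a Spark-on-HBase read; the quorum, znode path, and table name are assumptions, not from the thread:

    import org.apache.hadoop.hbase.HBaseConfiguration
    import org.apache.hadoop.hbase.client.Result
    import org.apache.hadoop.hbase.io.ImmutableBytesWritable
    import org.apache.hadoop.hbase.mapreduce.TableInputFormat

    // All values are illustrative placeholders.
    val hbaseConf = HBaseConfiguration.create()
    hbaseConf.set("hbase.zookeeper.quorum", "zk-host")          // assumed host
    hbaseConf.set("hbase.zookeeper.property.clientPort", "2181")
    hbaseConf.set("zookeeper.znode.parent", "/hbase-unsecure")  // varies by distro
    hbaseConf.set(TableInputFormat.INPUT_TABLE, "test_table")   // hypothetical table

    val hbaseRDD = sc.newAPIHadoopRDD(hbaseConf, classOf[TableInputFormat],
      classOf[ImmutableBytesWritable], classOf[Result])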

Re: Problem connecting to HBase

2015-03-15 Thread HARIPRIYA AYYALASOMAYAJULA
…Which Spark release are you using? I assume it contains SPARK-1297. Cheers. On Fri, Mar 13, 2015 at 7:47 PM, HARIPRIYA AYYALASOMAYAJULA wrote:
> Hello, I am running a HBase test case. I am using the ex…

Re: Installing Spark Standalone to a Cluster

2015-01-23 Thread HARIPRIYA AYYALASOMAYAJULA

job works well on small data set but fails on large data set

2014-11-06 Thread HARIPRIYA AYYALASOMAYAJULA
…file. I changed it to Double, and on the large file it works until I get the map output. But when I include the remaining part, it fails. Can someone please help me understand where I am going wrong? Thank you for your time.

Re: Function returning multiple Values - problem with using "if-else"

2014-10-24 Thread HARIPRIYA AYYALASOMAYAJULA
…<- 0 to y-16) yield (x+1, j)) } }
On Fri, Oct 24, 2014 at 8:52 PM, HARIPRIYA AYYALASOMAYAJULA wrote:
> Hello, my map function will call the following function (inc), which should yield multiple values:
> def…
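A self-contained sketch of the pattern this thread converges on: a helper returning a sequence of pairs, applied through flatMap. The function body is a hypothetical reconstruction from the fragment above, and the data is made up:

    // Hypothetical reconstruction: emit one pair per offset when y is at
    // least 16, otherwise a single fallback pair.
    def inc(x: Int, y: Int): Seq[(Int, Int)] =
      if (y >= 16) for (j <- 0 to y - 16) yield (x + 1, j)
      else Seq((x, 0))

    val data = sc.parallelize(Seq((1, 18), (2, 10)))
    // flatMap flattens the per-record sequences into one RDD of pairs.
    val expanded = data.flatMap { case (x, y) => inc(x, y) }
    // expanded contains: (2,0), (2,1), (2,2), (2,0)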

Function returning multiple Values - problem with using "if-else"

2014-10-24 Thread HARIPRIYA AYYALASOMAYAJULA
…an array or a list and return the same, but I'm still not clear how it works in Scala/Spark. Thank you for your time.

How to emit multiple keys for the same value?

2014-10-20 Thread HARIPRIYA AYYALASOMAYAJULA
…within it. It would be great if someone could suggest possible ways to do it. Thanks in advance.
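In the absence of the full question, a minimal sketch of the usual answer: flatMap lets one record emit a (key, value) pair for every key it maps to. The data and the comma-split logic are assumptions:

    // Each record carries several comma-separated keys for one value.
    val records = sc.parallelize(Seq(("a,b,c", 1), ("b,d", 2)))
    val keyed = records.flatMap { case (keys, value) =>
      keys.split(",").map(k => (k, value))
    }
    // keyed contains: (a,1), (b,1), (c,1), (b,2), (d,2)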

Re: Spark can't find jars

2014-10-13 Thread HARIPRIYA AYYALASOMAYAJULA
Or, if it has something to do with the way you package your files, try an alternative packaging method and see if it works. On Monday, October 13, 2014, HARIPRIYA AYYALASOMAYAJULA wrote:
> Well, in the cluster, can you try copying the entire folder and then running?

Re: Spark can't find jars

2014-10-13 Thread HARIPRIYA AYYALASOMAYAJULA
On Mon, Oct 13, 2014 at 5:39 PM, HARIPRIYA AYYALASOMAYAJULA wrote:
> Hello, …

Re: Spark can't find jars

2014-10-13 Thread HARIPRIYA AYYALASOMAYAJULA
…FileNotFoundException: ./joda-convert-1.2.jar (Permission denied)
  java.io.FileOutputStream.open(Native Method)
  java.io.FileOutputStream.<init>(FileOutputStream.java:221)
  com.google.common.io.Files$FileByteSink.openStream(Files.java:223)
  com.google.common.io.Files$FileByteSink.openStream(Files.java:211)
Thanks, Andy
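The trace above shows an executor failing to write a fetched dependency jar into its working directory (hence "Permission denied"). For reference, a minimal sketch of shipping a jar with the application so it is fetched per executor; the path is an assumption:

    import org.apache.spark.{SparkConf, SparkContext}

    // The jar listed here is copied into each executor's working
    // directory at startup, so that directory must be writable.
    val conf = new SparkConf()
      .setAppName("JarShippingSketch")
      .setJars(Seq("/home/user/libs/joda-convert-1.2.jar")) // hypothetical path
    val sc = new SparkContext(conf)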

Re: Help with using combineByKey

2014-10-10 Thread HARIPRIYA AYYALASOMAYAJULA
> It has similar behavior to combineByKey() and will be faster than the groupByKey() version.
On Thu, Oct 9, 2014 at 9:28 PM, HARIPRIYA AYYALASOMAYAJULA wrote:
> Sean, thank you. It works. But I am still confused about the function. Can you kindly…
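The suggestion here is presumably reduceByKey, which, like combineByKey, aggregates map-side before shuffling. A minimal sketch of the thread's counting pattern with reduceByKey (data is made up):

    val delays = sc.parallelize(Seq(("ORD", 5), ("ORD", 0), ("SFO", 12)))
    // Pair each value with (nonzeroCount, totalCount), then sum component-wise.
    val counts = delays
      .mapValues(v => (if (v > 0) 1 else 0, 1))
      .reduceByKey((a, b) => (a._1 + b._1, a._2 + b._2))
    val pct = counts.mapValues { case (nonzero, total) => 100.0 * nonzero / total }
    // pct contains: ("ORD", 50.0), ("SFO", 100.0)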

Re: How does the Spark Accumulator work under the covers?

2014-10-10 Thread HARIPRIYA AYYALASOMAYAJULA
Let’s say you call something like myRdd.map(x => sum += x). Is “sum” being accumulated locally in any way, for each element or partition or node? Is “sum” a broadcast variable? Or does it only exist on the driver node? How does the driver node get access to “sum”? Thanks
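A minimal sketch of the 1.x accumulator answer (numbers are made up): each task adds into a per-partition local copy, the driver merges the updates as tasks finish, and only the driver can read the total.

    val sum = sc.accumulator(0)   // Spark 1.x accumulator API
    val rdd = sc.parallelize(1 to 100)
    rdd.foreach(x => sum += x)    // an action; map alone is lazy and
                                  // would never run the increments
    println(sum.value)            // 5050 -- .value is driver-only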

Re: Can I run examples on cluster?

2014-10-10 Thread HARIPRIYA AYYALASOMAYAJULA
…PM, Theodore Si wrote:
> Hi all,
> I want to use two nodes for a test, one as master and the other as worker. Can I submit the example application included in the Spark source tarball on the master to let it run on the worker? What should I do?
> BR, Theo

Re: Help with using combineByKey

2014-10-09 Thread HARIPRIYA AYYALASOMAYAJULA
> Duh, sorry. The initialization should of course be (v) => (if (v > 0) 1 else 0, 1). This gives the answer you are looking for. I don't see what Part2 is supposed to do differently.
On Thu, Oct 9, 2014 at 6:14 PM, HARIPRIYA AYYALASOMAYAJULA wrote: …

Re: Help with using combineByKey

2014-10-09 Thread HARIPRIYA AYYALASOMAYAJULA
…you're just counting... On Thu, Oct 9, 2014 at 11:47 AM, HARIPRIYA AYYALASOMAYAJULA wrote:
> I am a beginner to Spark and am finding it difficult to implement a very simple reduce operation. I read that it is ideal to use combineByKey f…

Re: Help with using combineByKey

2014-10-09 Thread HARIPRIYA AYYALASOMAYAJULA
…isn't the problem, I think. It sounds like you intend the first element of each pair to be a count of nonzero values, but you initialize the first element of the pair to v, not 1, in v => (v,1). Try v => (1,1). On Thu, Oct 9, 2014 at 4:47 PM, HARIPRIYA AYYALASOMAYA…

Help with using combineByKey

2014-10-09 Thread HARIPRIYA AYYALASOMAYAJULA
…throws IOException, InterruptedException {
        int acc1 = 0;
        int acc2 = 0;
        float frac_delay, percentage_delay;
        for (IntWritable val : values) {
            if (val.get() > 0) {
                acc1++;
            }
            acc2++;
        }
        frac_delay = (float) acc1 / acc2;
        percentage_delay = frac_delay * 100;
        pdelay.set(percentage_delay);
        context.write(key, pdelay);
    }
}

Please help. Thank you for your time.
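For reference, a minimal Scala sketch of the same computation with combineByKey, folding in the initializer fix suggested upthread; the data is made up and the pair tracks (nonzeroCount, totalCount):

    val delays = sc.parallelize(Seq(("ORD", 5), ("ORD", 0), ("SFO", 12)))
    val pctDelayed = delays.combineByKey(
      (v: Int) => (if (v > 0) 1 else 0, 1),                         // createCombiner
      (acc: (Int, Int), v: Int) =>
        (acc._1 + (if (v > 0) 1 else 0), acc._2 + 1),               // mergeValue
      (a: (Int, Int), b: (Int, Int)) => (a._1 + b._1, a._2 + b._2)  // mergeCombiners
    ).mapValues { case (nonzero, total) => 100.0f * nonzero / total }
    // pctDelayed contains: ("ORD", 50.0), ("SFO", 100.0)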