Re: s3 vfs on Mesos Slaves

2015-05-13 Thread jay vyas
Might I ask why VFS? I'm new to VFS and not sure whether or not it predates the Hadoop-compatible file system interfaces (HCFS). After all, Spark natively supports any HCFS by leveraging the Hadoop FileSystem API, class loaders, and so on, so simply putting those resources on your classpath should be sufficient.
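A minimal sketch of what "just use the classpath" looks like in practice, assuming the hadoop-aws module (and its AWS SDK dependency) is on the classpath so the s3a scheme is registered; the bucket and path below are hypothetical:

    import org.apache.spark.{SparkConf, SparkContext}

    object HcfsS3Example {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("hcfs-s3-example"))

        // The s3a connector reads credentials from the Hadoop configuration.
        sc.hadoopConfiguration.set("fs.s3a.access.key", sys.env("AWS_ACCESS_KEY_ID"))
        sc.hadoopConfiguration.set("fs.s3a.secret.key", sys.env("AWS_SECRET_ACCESS_KEY"))

        // Any scheme with a registered Hadoop FileSystem implementation works here.
        val lines = sc.textFile("s3a://my-bucket/logs/*.txt")
        println(lines.count())
        sc.stop()
      }
    }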

Re: Keep or remove Debian packaging in Spark?

2015-02-10 Thread jay vyas
…mment-72114226 … and in recent conversations I didn't hear dissent to the idea of removing this. Is this still useful enough to fix up? All else equal, I'd like to start to walk back some of the complexity of the build, but I don't know how all-else-equal it is. Certainly, it sounds like nobody intends these to be used to actually deploy Spark. I don't doubt it's useful to someone, but can they maintain the packaging logic elsewhere? -- jay vyas

spark akka fork : is the source anywhere?

2015-01-28 Thread jay vyas
from, and who is maintaining its release? -- jay vyas PS: I've had some conversations with Will Benton as well about this, and it's clear that some modifications to akka are needed, or else a protobuf error occurs, which amounts to serialization incompatibilities; hence, if one wants to build
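For reference, a build.sbt sketch of depending on the shaded fork that Spark publishes under org.spark-project.akka, so that application code and akka-remote agree on the shaded protobuf classes. The version string is an assumption; check it against the pom of the Spark release you build against:

    libraryDependencies ++= Seq(
      "org.spark-project.akka" %% "akka-actor"  % "2.3.4-spark",
      "org.spark-project.akka" %% "akka-remote" % "2.3.4-spark"
    )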

Re: Standardized Spark dev environment

2015-01-20 Thread jay vyas
> If we use something like Vagrant, we may even be able to make it so that a single Vagrantfile creates equivalent development environments across OS X, Linux, and Windows, without having to do much (or any) OS-specific work. I imagine for committers and regular contributors, this exercise may seem pointless, since y'all are probably already very comfortable with your workflow. I wonder, though, if any of you think this would be worthwhile as an improvement to the "new Spark developer" experience. Nick
-- jay vyas
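A minimal Vagrantfile sketch of the idea; the box name and provisioning steps are assumptions, not a project-endorsed setup:

    Vagrant.configure("2") do |config|
      config.vm.box = "ubuntu/trusty64"
      config.vm.provider "virtualbox" do |vb|
        vb.memory = 4096
      end
      # Install just enough to build Spark: a JDK and git.
      config.vm.provision "shell", inline: <<-SHELL
        apt-get update && apt-get install -y openjdk-7-jdk git
      SHELL
    end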

Re: EndpointWriter : Dropping message failure ReliableDeliverySupervisor errors...

2014-12-20 Thread jay vyas
Hi folks. In the end, I found that the problem was that I was using IP addresses instead of hostnames. I guess, maybe, reverse DNS is a requirement for spark slave -> master communications...? On Fri, Dec 19, 2014 at 7:21 PM, jay vyas wrote: > Hi spark. I'm trying to understand the
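A sketch of the workaround described above: give every node a hostname that resolves both forward and backward, then point workers at the master by name rather than by IP. Addresses and names here are hypothetical:

    # /etc/hosts on every node:
    192.168.1.10  spark-master
    192.168.1.11  spark-worker1

    # start the worker against the master's hostname, not its IP:
    ./bin/spark-class org.apache.spark.deploy.worker.Worker spark://spark-master:7077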

EndpointWriter : Dropping message failure ReliableDeliverySupervisor errors...

2014-12-19 Thread jay vyas
-- jay vyas

Is there a way for scala compiler to catch unserializable app code?

2014-11-16 Thread jay vyas
th a scala singleton, which I guess is readily serializable. So it's clear that Spark needs to serialize objects which carry the driver methods for an app in order to run... but I'm wondering, maybe there is a way to change or update the Spark API to catch unserializable Spark apps at compile
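A minimal Scala sketch of the failure mode in question, using a made-up Helper class: capturing an unserializable object in a closure compiles fine and only fails at runtime with "Task not serializable":

    import org.apache.spark.{SparkConf, SparkContext}

    // Not Serializable, so capturing it in a closure fails only at runtime.
    class Helper { def transform(s: String): String = s.toUpperCase }

    object ClosureExample {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("closure-example"))
        val helper = new Helper // captured by the closure below
        val rdd = sc.parallelize(Seq("a", "b", "c"))

        // Compiles, but throws org.apache.spark.SparkException:
        // Task not serializable -- at runtime, not compile time:
        // rdd.map(s => helper.transform(s)).collect()

        // Works: nothing unserializable is captured by the closure.
        rdd.map(s => s.toUpperCase).collect().foreach(println)
        sc.stop()
      }
    }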

Re: best IDE for scala + spark development?

2014-10-26 Thread Jay Vyas
I tried the Scala Eclipse IDE, but in Scala 2.10 I ran into some weird issues (http://stackoverflow.com/questions/24253084/scalaide-and-cryptic-classnotfound-errors)... So I switched to IntelliJ and was much more satisfied... I've written a post on how I use Fedora, sbt, and IntelliJ for spark apps

Re: OutOfMemoryError when running sbt/sbt test

2014-08-26 Thread Jay Vyas
, Mubarak Seyed wrote: > What is your ulimit value? > On Tue, Aug 26, 2014 at 5:49 PM, jay vyas wrote: >> Hi spark. I've been trying to build spark, but I've been getting lots of OOME exceptions.
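For reference, a quick way to inspect (and, for the current shell, raise) the limits in question; the value below is only an example:

    ulimit -n        # max open file descriptors
    ulimit -u        # max user processes
    ulimit -n 4096   # raise the fd limit for this shell, subject to hard limits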

OutOfMemoryError when running sbt/sbt test

2014-08-26 Thread jay vyas
o hard-code the "get_mem_opts" function, which is in the sbt-launch-lib.bash file, to have various very high parameter sizes (i.e. "-Xms5g") with a high MaxPermSize, etc., to no avail. Any thoughts on this would be appreciated. I know of others having the same problem as well. Thanks! -- jay vyas
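A sketch of a workaround that avoids hard-coding get_mem_opts: the sbt-extras-style launcher accepts a -mem flag, and many versions of the script also honor SBT_OPTS. The sizes below are guesses, and MaxPermSize only matters on the Java 7-era JVMs of the time:

    sbt/sbt -mem 4096 test

    # roughly equivalent via the environment:
    export SBT_OPTS="-Xmx4g -XX:MaxPermSize=512m -XX:ReservedCodeCacheSize=256m"
    sbt/sbt test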