Hey Nick,

I did something similar with a Docker image last summer; I haven't updated the images to cache the dependencies for the current Spark master, but it would be trivial to do so:
http://chapeau.freevariable.com/2014/08/jvm-test-docker.html

best,
wb

----- Original Message -----
> From: "Nicholas Chammas" <nicholas.cham...@gmail.com>
> To: "Spark dev list" <dev@spark.apache.org>
> Sent: Tuesday, January 20, 2015 6:13:31 PM
> Subject: Standardized Spark dev environment
>
> What do y'all think of creating a standardized Spark development
> environment, perhaps encoded as a Vagrantfile, and publishing it under
> `dev/`?
>
> The goal would be to make it easier for new developers to get started
> with all the right configs and tools pre-installed.
>
> If we use something like Vagrant, we may even be able to make it so that
> a single Vagrantfile creates equivalent development environments across
> OS X, Linux, and Windows, without having to do much (or any) OS-specific
> work.
>
> I imagine for committers and regular contributors, this exercise may
> seem pointless, since y'all are probably already very comfortable with
> your workflow.
>
> I wonder, though, if any of you think this would be worthwhile as an
> improvement to the "new Spark developer" experience.
>
> Nick
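For anyone curious what "caching the dependencies" in an image like wb's looks like in practice, here is a minimal, hypothetical Dockerfile sketch. The base image, packages, repo URL, and build command are assumptions for illustration, not the contents of the image behind the link above:

    # Hypothetical sketch: bake Spark's build dependencies into an image
    # layer so repeated test runs don't have to re-download them.
    FROM ubuntu:14.04

    # A JDK, git, and Maven are needed to check out and build Spark.
    RUN apt-get update && \
        apt-get install -y openjdk-7-jdk git maven

    # Spark's Maven build needs extra heap, per the Spark build docs.
    ENV MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512m"

    # Build Spark once at image-build time; the Maven/Ivy caches produced
    # here are stored in this layer, so a later build of a fresh checkout
    # only fetches artifacts that have changed since.
    RUN git clone https://github.com/apache/spark.git /opt/spark && \
        cd /opt/spark && \
        mvn -DskipTests package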
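Along the same lines, here is a minimal sketch of what a dev/Vagrantfile could look like. The box name, memory setting, and provisioned packages below are assumptions for illustration, not a configuration anyone has agreed on:

    # Hypothetical dev/Vagrantfile sketch for a standardized Spark dev VM.
    Vagrant.configure("2") do |config|
      # A stock 64-bit Ubuntu box; behaves the same on OS X, Linux, and
      # Windows hosts running VirtualBox.
      config.vm.box = "ubuntu/trusty64"

      config.vm.provider "virtualbox" do |vb|
        # Spark builds are memory-hungry; give the guest 4 GB.
        vb.memory = 4096
      end

      # Make the Spark checkout on the host visible inside the guest.
      config.vm.synced_folder ".", "/home/vagrant/spark"

      # Install the JDK and build tools a new contributor needs.
      config.vm.provision "shell", inline: <<-SHELL
        apt-get update
        apt-get install -y openjdk-7-jdk git maven
      SHELL
    end

With a file like that checked in, a new contributor could run "vagrant up" from their Spark checkout and then "vagrant ssh" into a VM that already has the right JDK and build tools, regardless of their host OS.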