I'm also a fan of the Docker approach; done right, it can provide a 
long-lived container image that makes it easy for folks to get up and 
running, and subsequently to contribute.

I poked at a couple of other projects I'm familiar with that depend on 
Spark (Iceberg and Hudi); Hudi seems to assume the user already has a 
Spark installation present, but I couldn't tell for Iceberg.
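
On the local-mode suggestion further down the thread: if we do put
together a dev quick start, a tiny sanity check along these lines might
be worth including. This is only a rough sketch in Scala, assuming a
plain local Spark installation; the object and app names are just
placeholders, not anything we have in the repo today.

  // Confirms Spark local mode works, with no YARN/HDFS required.
  import org.apache.spark.sql.SparkSession

  object LocalSparkCheck {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder()
        .master("local[*]")              // run Spark in-process on all cores
        .appName("livy-dev-sanity-check")
        .getOrCreate()
      // Trivial job: prints 10 if the local installation is healthy.
      println(spark.range(10).count())
      spark.stop()
    }
  }

Running that with spark-submit against the same SPARK_HOME the Livy
server points at would give devs a cheap way to confirm the Spark side
of the environment before they start debugging Livy itself.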

On 2022/11/22 21:09:57 Sumit Kumar wrote:
> Folks have experimented with Docker containers at my workplace, and it has 
> helped give every developer a consistent environment (all the 
> dependencies installed, so no setup quirks) for testing. We do remote 
> debugging, which has its own quirks, but we have found it to be useful. 
> Would the community be interested in that?
> 
> On 2022/11/21 17:37:08 larry mccay wrote:
> > Yes, from a Livy perspective it is best to assume Spark local mode; for
> > that, it may be best to point to an existing Spark page.
> > We need a deterministic approach to setting up a dev environment, with
> > enough description that it can be expanded to a full cluster deployment
> > as well.
> > 
> > 
> > On Mon, Nov 21, 2022 at 5:10 AM 严细浪 <xilang....@gmail.com> wrote:
> > 
> > > I agree with Jeff on this topic.
> > > Users usually set up a Spark environment before using Livy, and that
> > > environment depends on the cluster environment (YARN/HDFS) and has many
> > > configuration variants that are outside Livy's scope.
> > > For dev purposes, we can include a guide to set up Spark in local mode.
> > >
> > > Jeff Zhang <zjf...@gmail.com> wrote on Mon, Nov 21, 2022 at 13:22:
> > >
> > > > Here's the get started page: https://livy.apache.org/get-started/
> > > > I suppose it is enough for new users, but I agree we need to update
> > > > the README.md to include this content.
> > > >
> > > >
> > > >
> > > > On Mon, Nov 21, 2022 at 2:55 AM larry mccay <lmc...@apache.org> wrote:
> > > >
> > > > > Considering there is no download for anything older than 3.2.x on the
> > > > > referenced download page, we likely need to change the README.md to
> > > > > reflect a more modern version.
> > > > > We also need more explicit instructions for installing Spark than just
> > > > > the download; whether we detail this ourselves or point to sufficient
> > > > > Spark docs is certainly a consideration.
> > > > >
> > > > > At the end of the day, we are missing any sort of quick start guide
> > > > > for devs to be able to successfully build and/or run tests.
> > > > >
> > > > > Thoughts?
> > > > >
> > > > > On Sat, Nov 19, 2022 at 6:23 PM larry mccay <larry.mc...@gmail.com>
> > > > > wrote:
> > > > >
> > > > > > Hey Folks -
> > > > > >
> > > > > > Our Livy README.md indicates the following:
> > > > > >
> > > > > > To run Livy, you will also need a Spark installation. You can get
> > > > > > Spark releases at https://spark.apache.org/downloads.html.
> > > > > >
> > > > > > Livy requires Spark 2.4+. You can switch to a different version of
> > > > > > Spark by setting the SPARK_HOME environment variable in the Livy
> > > > > > server process, without needing to rebuild Livy.
> > > > > >
> > > > > > Do we have any variation on this setup at this point in the real
> > > > > > world?
> > > > > >
> > > > > > What do your dev environments actually look like and how are you
> > > > > > installing what versions of Spark as a dependency?
> > > > > >
> > > > > > Thanks!
> > > > > >
> > > > > > --larry
> > > > > >
> > > > >
> > > >
> > > >
> > > > --
> > > > Best Regards
> > > >
> > > > Jeff Zhang
> > > >
> > >
> > 
> 
