@Saisai, is there a design doc for the Python/R/SQL interpreters that I can
use as a template? The new interpreter will be very similar in design (most
similar to the R interpreter, in fact).


Regards,
-Steve

On Tue, Jan 14, 2020 at 6:13 PM Steve Suh <suhst...@gmail.com> wrote:

> Jeff, you are correct.  .NET for Spark behaves similarly to how pyspark and
> sparkR interact with Spark.  Much like its counterparts, .NET for Spark
> acts as a wrapper around the underlying Spark APIs.  Besides adding a
> SparkDotnetInterpreter to the Livy codebase, a user will need to install a
> REPL on their system and set either an ENV variable or a Spark conf to
> point to the binary (similar to Python and R).  In our case, we are
> focusing on supporting the dotnet-try <https://github.com/dotnet/try>
> REPL.
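>
> To make that concrete, here is a rough sketch of the wiring. The dotnet
> conf key and variable below are illustrative assumptions only (the real
> names would be settled in the design doc); the Python line shows the
> existing Spark conf we would mirror:
>
>     # Python today: Spark is pointed at the interpreter binary
>     spark.pyspark.python=/usr/bin/python3
>
>     # Hypothetical .NET analogue: point at the installed dotnet-try REPL
>     spark.dotnet.repl.command=/usr/local/bin/dotnet-try
>
> The environment-variable route would work the same way, analogous to
> PYSPARK_PYTHON for Python.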
>
>
> Regards,
> -Steve
>
> On Mon, Jan 13, 2020 at 8:09 PM Jeff Zhang <zjf...@gmail.com> wrote:
>
>> IIUC, dotnet spark is just a wrapper around Spark. Besides adding a
>> DotNetSparkInterpreter, what other things does a user need to do? For
>> now, a user also needs to specify SPARK_HOME at least, but to support
>> dotnet, what else would a user need to do?
>>
>>
>> On Tue, Jan 14, 2020 at 11:01 AM Saisai Shao <sai.sai.s...@gmail.com> wrote:
>>
>> > I see your point. Personally I don't have a strong opinion on this; I'm
>> > not sure what others think about it. Why don't you start a design doc and
>> > call for a vote on this feature?
>> >
>> > Thanks
>> > Saisai
>> >
>> > On Tue, Jan 14, 2020 at 10:12 AM Steve Suh <suhst...@gmail.com> wrote:
>> >
>> > > @Saisai
>> > >
>> > > This will not be a wrapper around the REST API.  The plan is to
>> > > support a Livy POST /sessions request where kind == "sparkdotnet".
>> > > Livy will eventually push this request down to the ReplDriver, and in
>> > > turn the ReplDriver will create a new SparkDotnetInterpreter class.
>> > > This will be similar to how the PythonInterpreter, SparkRInterpreter,
>> > > and SQLInterpreter classes get instantiated and used.
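>> > >
>> > > As a concrete illustration (the exact session kind string is of course
>> > > still up for discussion), creating a session would look like any other
>> > > interactive session request, e.g.:
>> > >
>> > >     POST /sessions
>> > >     {"kind": "sparkdotnet"}
>> > >
>> > > and the ReplDriver would map that kind to the new interpreter the same
>> > > way it maps "pyspark" to PythonInterpreter today.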
>> > >
>> > >
>> > > Regards,
>> > > -Steve
>> > >
>> > > On Sun, Jan 12, 2020 at 5:45 PM Saisai Shao <sai.sai.s...@gmail.com>
>> > > wrote:
>> > >
>> > > > Is this just a wrapper of the Livy REST API, or could it support the
>> > > > Livy Job API?
>> > > >
>> > > > In my opinion, if it is just a wrapper of the REST API, then it would
>> > > > be better to maintain it outside of Livy, since the REST API is
>> > > > language independent; if we're going to support all languages inside
>> > > > Livy, then it becomes hard to maintain.
>> > > >
>> > > > Just my two cents.
>> > > >
>> > > > Thanks
>> > > > Saisai
>> > > >
>> > > > On Sun, Jan 12, 2020 at 3:49 PM Steve Suh <suhst...@gmail.com> wrote:
>> > > >
>> > > > > Hi,
>> > > > >
>> > > > > I contribute to the .NET for Apache Spark
>> > > > > <https://github.com/dotnet/spark> project, and we have seen *a lot*
>> > > > > of *user interest* in providing first class notebook support for
>> > > > > *.NET for Apache Spark*.
>> > > > >
>> > > > > Livy currently supports *Scala*, *Python*, and *R* *interactive
>> > > > > sessions*. We have a working prototype
>> > > > > <https://github.com/dotnet/spark/tree/master/deployment/HDI-Spark/Notebooks>
>> > > > > available that adds support for a Spark *Dotnet* *interactive
>> > > > > session*, and I would like to know if the Livy community would be
>> > > > > interested in adding this feature to the main code base.  If so,
>> > > > > please let me know and I can work on creating this PR.
>> > > > >
>> > > > > For now, I've created a Jira item
>> > > > > <https://issues.apache.org/jira/browse/LIVY-742> to track this.
>> > > > >
>> > > > >
>> > > > > Regards,
>> > > > > -Steve
>> > > > >
>> > > >
>> > >
>> >
>>
>>
>> --
>> Best Regards
>>
>> Jeff Zhang
>>
>
