On May 4, 2018 at 17:55, "Lukasz Cwik" <lc...@google.com> wrote:

I did take a look at Graal a while back when thinking about how execution
environments could be defined; my concern was that it doesn't support all
of the features of a language.
For example, it's typical for Python code to load and call native libraries,
and Graal can only execute C/C++ code that has been compiled to LLVM bitcode.
Also, a good number of people interested in using ML libraries will want
access to GPUs to improve performance, which I believe Graal can't
support.
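
For reference, a minimal sketch of what embedding Graal in the JVM looks
like through the org.graalvm.polyglot API (the caveats above about native
libraries and GPUs still apply, since everything runs in-process):

    import org.graalvm.polyglot.Context;
    import org.graalvm.polyglot.Value;

    public class GraalEval {
        public static void main(String[] args) {
            // a guest-language snippet evaluated inside the host JVM process
            try (Context context = Context.create()) {
                Value result = context.eval("js", "21 * 2");
                System.out.println(result.asInt()); // 42
            }
        }
    }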

It can be a very useful way to run simple lambda functions written in some
language directly, without needing a Docker environment, but you could
probably use something even lighter weight than Graal that is language
specific, like Jython.
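
For example, a minimal sketch of running a Python "lambda" through the
standard JSR-223 javax.script API, assuming jython-standalone is on the
classpath (Jython registers its engine under the name "python"):

    import javax.script.Invocable;
    import javax.script.ScriptEngine;
    import javax.script.ScriptEngineManager;

    public class JythonLambda {
        public static void main(String[] args) throws Exception {
            // Jython's JSR-223 engine registers as "python"
            ScriptEngine engine = new ScriptEngineManager().getEngineByName("python");
            engine.eval("def apply(x):\n    return x * 2\n");
            Object result = ((Invocable) engine).invokeFunction("apply", 21);
            System.out.println(result); // 42
        }
    }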



Right, the JSR-223 impl works very well, but you can also get a performance
boost by going native (e.g., a V8 Java binding for JS). It is way more
efficient than Docker most of the time and not at all code intrusive in
runners, so it would likely be easier to adopt and maintain. That said, all
of it is doable behind JSR-223, so maybe it's not a big deal in terms of API.
We just need to ensure the portability work stays clean and actually
portable, and doesn't impact runners the way the PoCs done so far have.
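
To make the "native binding" option concrete, a sketch using J2V8
(com.eclipsesource.v8), one such V8 Java binding; picking J2V8
specifically is my assumption, any equivalent binding would do:

    import com.eclipsesource.v8.V8;

    public class V8Lambda {
        public static void main(String[] args) {
            // J2V8 is assumed here as the V8 binding; others exist
            V8 runtime = V8.createV8Runtime();
            try {
                // a tiny JS function evaluated in-process, no container involved
                int result = runtime.executeIntegerScript("(function(x) { return x * 2; })(21);");
                System.out.println(result); // 42
            } finally {
                runtime.release();
            }
        }
    }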

Works for me.


On Thu, May 3, 2018 at 10:05 PM Romain Manni-Bucau <rmannibu...@gmail.com>
wrote:

> Hi guys
>
> For some time there have been efforts to add language-portability support
> to Beam, but I can't really find a case where it "works" based on Docker,
> except for some vendor-specific infra.
>
> The current solution:
>
> 1. Is runner intrusive (which is bad for Beam and prevents adoption by big
> data vendors)
> 2. Is based on Docker (which assumes a runtime environment, is very
> ops/infra intrusive, and is likely too $$ quite often for what it brings)
>
> Has anyone had a look at Graal, which seems like a way to make the feature
> doable in a lighter and more optimized manner than the default JSR-223
> impls?
>
