Hi,
Looking for comments/your view:
Would it be possible to:
1. patch the DataFusion DataFrame to make df.state public
2. patch DataFusion, adding a method to DataFrame, i.e.:
df.transform_logical_plan(mut self, new_plan) -> df, where the
original plan could be modified / injected with a NewPlanNode
(UserDe
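The shape of the proposed transform_logical_plan method can be mocked up in plain Rust. Everything below (MockPlan, MockDf, the UserNode variant) is a hypothetical stand-in, not real DataFusion types; it only illustrates the consuming-builder pattern where a closure rewrites the owned plan and a user-defined node is injected above the original:

```rust
// Hypothetical stand-ins for LogicalPlan and DataFrame -- not DataFusion types.
#[derive(Debug, Clone, PartialEq)]
enum MockPlan {
    Scan(String),
    // Stand-in for an injected user-defined node wrapping the original plan.
    UserNode { name: String, input: Box<MockPlan> },
}

struct MockDf {
    plan: MockPlan,
}

impl MockDf {
    // The proposed API shape: consume the DataFrame, let the caller
    // rewrite the plan, and return a new DataFrame around the result.
    fn transform_logical_plan<F>(mut self, f: F) -> Self
    where
        F: FnOnce(MockPlan) -> MockPlan,
    {
        self.plan = f(self.plan);
        self
    }
}

fn main() {
    let df = MockDf { plan: MockPlan::Scan("t".into()) };
    // Inject a user-defined node above the original scan.
    let df = df.transform_logical_plan(|p| MockPlan::UserNode {
        name: "NewPlanNode".into(),
        input: Box::new(p),
    });
    println!("{:?}", df.plan);
}
```

A closure-based signature sidesteps making df.state public, since the rewrite happens inside the method.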
Hi,
Quick question: do UDFs/UDAFs work in Ballista?
I saw a "TODO" in the executor code:
```rust
// TODO add logic to dynamically load UDF/UDAFs libs from files
scalar_functions: HashMap::new(),
aggregate_functions: HashMap::new(),
```
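Just to illustrate what filling those maps might eventually look like, here is a stdlib-only sketch. ScalarFn and my_add are simplified stand-ins (not DataFusion's ScalarUDF/AggregateUDF types), and a real loader would pull the functions from shared libraries discovered on disk rather than registering them statically:

```rust
use std::collections::HashMap;

// Simplified stand-in signature -- not DataFusion's ScalarUDF.
type ScalarFn = fn(&[i64]) -> i64;

fn load_udfs() -> HashMap<String, ScalarFn> {
    let mut scalar_functions: HashMap<String, ScalarFn> = HashMap::new();
    // In the real executor these entries would come from dynamically
    // loaded UDF libraries; here we register one statically.
    let my_add: ScalarFn = |args| args.iter().sum::<i64>();
    scalar_functions.insert("my_add".to_string(), my_add);
    scalar_functions
}

fn main() {
    let udfs = load_udfs();
    let f = udfs["my_add"];
    println!("{}", f(&[1, 2, 3])); // 6
}
```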
To create an example library and add reading functionality he
> king related to listing tables. If you are looking for support
> beyond this, I'd like to hear the use case so I can offer more help.
>
> Mete.
>
> On Sat, Apr 1, 2023 at 11:07 PM Jaroslaw Nowosad wrote:
>
Hi,
Looking for advice:
I'm looking into creating a writer part for Ballista.
There is a data source but not a sink.
I started looking into the object store's put/put_multipart.
But it looks like a simple context extension is not enough - do I need to
extend the logical/physical plan?
If you have any pointers
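For what it's worth, the write-side shape being asked about can be sketched with an in-memory stand-in for the object store. MemStore, put, and write_batches below are all hypothetical simplifications (the real ObjectStore::put/put_multipart are async and take typed paths); this only shows a write operator handing serialized batches to a store, not how to wire it into a physical plan:

```rust
use std::collections::HashMap;

// Stand-in for an object store: path -> bytes.
struct MemStore {
    objects: HashMap<String, Vec<u8>>,
}

impl MemStore {
    fn new() -> Self {
        Self { objects: HashMap::new() }
    }
    // Analogous in spirit to ObjectStore::put (the real one is async).
    fn put(&mut self, path: &str, bytes: Vec<u8>) {
        self.objects.insert(path.to_string(), bytes);
    }
}

// Stand-in for a write-side plan node: serialize "batches" and put them.
// Returns the number of bytes written.
fn write_batches(store: &mut MemStore, path: &str, batches: &[&str]) -> usize {
    let bytes: Vec<u8> = batches.join("\n").into_bytes();
    let n = bytes.len();
    store.put(path, bytes);
    n
}

fn main() {
    let mut store = MemStore::new();
    let written = write_batches(&mut store, "out/part-0", &["a,1", "b,2"]);
    println!("wrote {written} bytes");
}
```

The open design question from the email still stands: whether such a node belongs in the logical/physical plan (as a sink operator) or can live behind a context extension.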
> [1] - https://github.com/dask-contrib/dask-sql
> [2] - https://github.com/dask-contrib/dask-sql/blob/main/dask_planner/src/parser.rs#L385
>
> On Thu, Jan 12, 2023 at 10:36 AM Jaroslaw Nowosad
> wrote:
>
Hi all,
I had a task to investigate how to extend DataFusion to add UDFs written in
plain SQL.
The reason: there is quite a big bunch of SQL UDFs in existing Java
(Spark) solutions; however, we are starting to move into the Rust ecosystem,
and DataFusion/Arrow/Ballista looks like the proper way forward.
Hi,
I am just trying to integrate DataFusion with Kafka; the final goal is to have
end-to-end streaming. But I started from a "different side" -> step 1 is to
publish the output to Kafka, so I copied code and created a Kafka publisher:
https://github.com/yarenty/arrow-datafusion/tree/master/datafusion/core/src