Hi all!
Spark 1.6.1.
Does anyone know how to implement a custom DataFrame filter that is later
pushed down to a custom data source?
In short, I've managed to create a custom Expression and to implicitly add
methods using it to the Column class, but I am stuck at the point where the
Expression must be converted to a sources.Filter by the planner:
DataSourceStrategy only translates the built-in predicate expressions, so a
custom Expression never reaches the source.
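One escape hatch in 1.6, if it helps: the experimental
org.apache.spark.sql.sources.CatalystScan trait hands the relation the raw
Catalyst Expressions before any Filter translation, so custom expressions
survive. A minimal sketch (the schema and scan internals are placeholders,
not a real source):

import org.apache.spark.rdd.RDD
import org.apache.spark.sql.{Row, SQLContext}
import org.apache.spark.sql.catalyst.expressions.{Attribute, Expression}
import org.apache.spark.sql.sources.{BaseRelation, CatalystScan}
import org.apache.spark.sql.types.{StringType, StructField, StructType}

// Hypothetical relation: CatalystScan receives raw Catalyst Expressions
// instead of the lossy Expression -> sources.Filter translation.
class MyRelation(override val sqlContext: SQLContext)
  extends BaseRelation with CatalystScan {

  override def schema: StructType =
    StructType(StructField("value", StringType, nullable = true) :: Nil)

  override def buildScan(
      requiredColumns: Seq[Attribute],
      filters: Seq[Expression]): RDD[Row] = {
    // `filters` still contains any custom Expression subclasses here;
    // inspect them and push down the ones the underlying store understands.
    filters.foreach(f => println(s"got expression: $f"))
    sqlContext.sparkContext.emptyRDD[Row] // placeholder scan
  }
}

The caveat is that CatalystScan is marked experimental precisely because
Expression is not a stable API, so this couples the source to a specific
Spark version. On the stable PrunedFilteredScan path, the planner only
translates the built-in predicates and evaluates everything else on the
Spark side after the scan.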
For your last point, spark-submit has:
if [ -z "${SPARK_HOME}" ]; then
  export SPARK_HOME="$(cd "`dirname "$0"`"/..; pwd)"
fi

Meaning the script determines the proper SPARK_HOME variable on its own.

FYI
ior...@granturing.com wrote:
> Could you publish it as a library (to an internal repo)? Then you can
> simply use the "--packages" option. It will also help with versioning as
> you make changes; that way you're not having to manually ship JARs around
> to your machines and users.
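For reference, a submit against a published package could look like this
(the coordinates, repository URL, and class name are made up):

spark-submit \
  --packages com.example:my-framework_2.10:1.0.0 \
  --repositories http://repo.example.com/releases \
  --class com.example.app.Main \
  my-app.jar

Spark resolves the coordinates through Ivy and puts the jars on both the
driver and executor classpaths, so nothing has to be copied around by hand.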
Hello, guys!
I've been developing a kind of framework on top of Spark, and my idea is to
bundle the framework jars and some extra configs with the Spark distribution
and pass it to other developers for their needs, so that devs can use this
bundle and run the usual Spark stuff, but with the extra flavor that the
framework provides.
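If you do ship a bundled distribution, one way to wire the framework in is
the bundle's conf/spark-defaults.conf, which spark-submit picks up
automatically; all paths and values below are hypothetical:

# conf/spark-defaults.conf shipped inside the bundle
spark.jars                       /opt/mybundle/lib/framework-core.jar
spark.driver.extraJavaOptions    -Dframework.conf=/opt/mybundle/conf/framework.conf
spark.executor.extraJavaOptions  -Dframework.conf=/opt/mybundle/conf/framework.conf

That said, the --packages route suggested in the reply above avoids having
to fork the distribution at all.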