1. In DataSource API V1, we were able to create persistent tables over custom
data sources using SQL DDL, via "createRelation", "buildScan", "schema",
etc. Is there a way to achieve this in DataSource API V2?
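As a starting point, here is a minimal sketch of the V2 counterparts, assuming Spark 3.x and the `org.apache.spark.sql.connector` interfaces; the class names `MySource`/`MyTable` and the package in the DDL are hypothetical. Roughly, `TableProvider.inferSchema` plays the role of V1 `schema`, and `SupportsRead.newScanBuilder` plays the role of V1 `buildScan`; once the provider class is on the classpath, SQL DDL such as `CREATE TABLE t USING com.example.MySource` can create a table over it.

```scala
import java.util
import org.apache.spark.sql.connector.catalog.{SupportsRead, Table, TableCapability, TableProvider}
import org.apache.spark.sql.connector.expressions.Transform
import org.apache.spark.sql.connector.read.ScanBuilder
import org.apache.spark.sql.types.{StringType, StructType}
import org.apache.spark.sql.util.CaseInsensitiveStringMap

// Hypothetical minimal DataSource V2 provider (Spark 3.x).
class MySource extends TableProvider {
  // Analogous to V1 schema(): report the schema of the underlying source.
  override def inferSchema(options: CaseInsensitiveStringMap): StructType =
    new StructType().add("value", StringType)

  override def getTable(schema: StructType,
                        partitioning: Array[Transform],
                        properties: util.Map[String, String]): Table =
    new MyTable(schema)
}

class MyTable(tableSchema: StructType) extends Table with SupportsRead {
  override def name(): String = "my_table"
  override def schema(): StructType = tableSchema
  override def capabilities(): util.Set[TableCapability] =
    util.EnumSet.of(TableCapability.BATCH_READ)

  // Analogous to V1 buildScan(): a real source would return a ScanBuilder
  // whose Scan produces a Batch and a PartitionReaderFactory.
  override def newScanBuilder(options: CaseInsensitiveStringMap): ScanBuilder =
    throw new UnsupportedOperationException("scan not implemented in this sketch")
}
```

This is only an outline; a working connector also needs the read path (`Scan`, `Batch`, `PartitionReaderFactory`) filled in.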
2. In DataSource API V1, any schema changes in the underlying custom data source are
It depends a bit on the data as well, but have you investigated in the Spark UI
which executor/task becomes slow?
Could it also be the database from which you load the data?
> On 18.07.2020 at 17:00, Yong Yuan wrote:
The Spark job has the correct functions and logic. However, after several
hours of running, it becomes slower and slower. Are there any pitfalls in the
code below? Thanks!
import org.apache.spark.sql.types.{StructType, BooleanType}

// JDBC subquery pushed down as the "table" to read
val query = "(select * from meta_table) as meta_data"
val meta_schema = new StructType()
  .add("config_id", BooleanType)