TranHuyTiep commented on issue #8340:
URL: https://github.com/apache/hudi/issues/8340#issuecomment-1538193791
> @TranHuyTiep Were you able to resolve this issue or still facing the same?
I solved the above problem by building a new image that copies all the packages from `.ivy2/jars/*` into `/opt/spark/jars`.
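The workaround above can be sketched as a small Dockerfile layer. This is a hypothetical example, not the exact image used in the issue: the base image tag and the local `jars/` directory (pre-populated from `~/.ivy2/jars/`) are assumptions.

```dockerfile
# Assumed base image; replace with the Spark image actually in use.
FROM apache/spark:3.3.2

# Copy the dependency jars that spark-submit resolved into ~/.ivy2/jars/
# (hypothetically staged into ./jars/ in the build context) so they are
# on the classpath of both driver and executor pods on Kubernetes.
COPY jars/ /opt/spark/jars/
```

Baking the jars into the image avoids relying on `--packages`/Ivy resolution at runtime, which can fail inside executor pods that lack network access or a shared Ivy cache.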
TranHuyTiep commented on issue #8340:
URL: https://github.com/apache/hudi/issues/8340#issuecomment-1493224689
> is it can work in your local ? not k8s ?
Yes, it works locally.
If I set `spark_conf.setMaster("local[*]")` it also runs on k8s, but then no executors are created and everything runs inside a single driver.
TranHuyTiep commented on issue #8340:
URL: https://github.com/apache/hudi/issues/8340#issuecomment-1492861891
> can you provide a simple reproduce step, like `code` or `sql`
Here is my code:

```python
# -*- coding: utf-8 -*-
from pyspark.conf import SparkConf
from pyspark.sql import
```