[GitHub] [hudi] TranHuyTiep commented on issue #8340: [SUPPORT] cannot assign instance of java.lang.invoke.SerializedLambda

2023-05-08 Thread via GitHub
TranHuyTiep commented on issue #8340: URL: https://github.com/apache/hudi/issues/8340#issuecomment-1538193791 > @TranHuyTiep Were you able to resolve this issue or are you still facing the same? I solved the above problem by building a new image and copying all the packages in .ivy2/jars/* to
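The comment is cut off before the destination path, but the fix it describes is baking the dependency jars into the image so driver and executor pods share one classpath. A minimal sketch of the Spark-side configuration under that assumption follows; the jar path and bundle version are placeholders, not values from the issue:

```python
# Hypothetical sketch: point Spark at a Hudi bundle jar baked into the custom image,
# so driver and executors load identical classes and lambda deserialization
# (java.lang.invoke.SerializedLambda) does not hit a classpath mismatch.
from pyspark.conf import SparkConf
from pyspark.sql import SparkSession

# Placeholder path and version; in the issue the jars were copied from ~/.ivy2/jars/*.
HUDI_BUNDLE = "/opt/spark/jars/hudi-spark3.3-bundle_2.12-0.13.0.jar"

conf = (
    SparkConf()
    .set("spark.jars", HUDI_BUNDLE)
    .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    .set("spark.sql.extensions", "org.apache.spark.sql.hudi.HoodieSparkSessionExtension")
)

spark = SparkSession.builder.config(conf=conf).getOrCreate()
```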

[GitHub] [hudi] TranHuyTiep commented on issue #8340: [SUPPORT] cannot assign instance of java.lang.invoke.SerializedLambda

2023-04-01 Thread via GitHub
TranHuyTiep commented on issue #8340: URL: https://github.com/apache/hudi/issues/8340#issuecomment-1493224689 > Does it work in your local environment, but not in k8s? Yes, it works locally. If I set spark_conf.setMaster("local[*]") it can also work in k8s, but then it does not create executors and everything runs in a single driver
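A short sketch contrasting the two setups the comment describes; the k8s master URL, container image, and executor count are illustrative placeholders, not taken from the issue:

```python
# Hypothetical sketch of the two configurations mentioned above.
from pyspark.conf import SparkConf

# Workaround described in the comment: local[*] runs everything in one JVM
# (a single driver, no separate executor pods), so the SerializedLambda
# classpath mismatch cannot occur, but nothing is actually distributed.
local_conf = SparkConf().setMaster("local[*]")

# Intended setup: a k8s master spawns executor pods from the container image.
# The error appears here if the executors' image lacks jars the driver loaded.
k8s_conf = (
    SparkConf()
    .setMaster("k8s://https://kubernetes.default.svc:443")            # placeholder
    .set("spark.kubernetes.container.image", "my-spark-hudi:latest")  # placeholder
    .set("spark.executor.instances", "2")
)
```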

[GitHub] [hudi] TranHuyTiep commented on issue #8340: [SUPPORT] cannot assign instance of java.lang.invoke.SerializedLambda

2023-04-01 Thread via GitHub
TranHuyTiep commented on issue #8340: URL: https://github.com/apache/hudi/issues/8340#issuecomment-1492861891 > Can you provide a simple reproduction step, like `code` or `sql`? Here is my code: `# -*- coding: utf-8 -*- from pyspark.conf import SparkConf from pyspark.sql`
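The pasted snippet is cut off right after the imports. A hedged reconstruction of how such a PySpark-on-Hudi job typically begins is shown below; the table name, record key, write options, and base path are illustrative assumptions, not the user's actual code:

```python
# -*- coding: utf-8 -*-
# Hypothetical reconstruction; all names, paths, and options are assumptions.
from pyspark.conf import SparkConf
from pyspark.sql import SparkSession

conf = (
    SparkConf()
    .setAppName("hudi-repro")
    .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    .set("spark.sql.extensions", "org.apache.spark.sql.hudi.HoodieSparkSessionExtension")
)
spark = SparkSession.builder.config(conf=conf).getOrCreate()

# Tiny DataFrame standing in for the real input data.
df = spark.createDataFrame([(1, "a", 1000), (2, "b", 1001)], ["id", "name", "ts"])

hudi_options = {
    "hoodie.table.name": "demo_table",                   # assumed table name
    "hoodie.datasource.write.recordkey.field": "id",
    "hoodie.datasource.write.precombine.field": "ts",
    "hoodie.datasource.write.operation": "upsert",
}

(df.write.format("hudi")
   .options(**hudi_options)
   .mode("overwrite")
   .save("/tmp/demo_table"))                             # assumed base path
```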