For example, I have a custom table source/sink built as an independent jar, and my main code depends on it. But I don't want to package the custom connector jar together with the main code into one jar file; in other words, I want a thin jar, not a fat jar. So I put the custom connector jar into *flink/lib* before running my job, and it does work. My jobmanager yaml looks like this:

containers:
  ...
  volumeMounts:
  - mountPath: /opt/flink/conf
    name: flink-config-volume
  - mountPath: /opt/flink/lib
    name: volume-1618910657181
  - mountPath: /opt/flink/flink-uploadjar
    name: volume-1618911748381
  - mountPath: /opt/flink/plugins/oss-fs-hadoop/flink-oss-fs-hadoop-1.12.2.jar
    name: volume-1618916463815
volumes:
- configMap:
    defaultMode: 420
    items:
    - key: flink-conf.yaml
      path: flink-conf.yaml
    - key: log4j-console.properties
      path: log4j-console.properties
    name: flink-config
  name: flink-config-volume
- hostPath:
    path: /data/volumes/flink/volume-for-session/cxylib-common-jar
    type: ''
  name: volume-1618910657181
- hostPath:
    path: /home/uploadjar
    type: ''
  name: volume-1618911748381
- hostPath:
    path: /data/volumes/flink/volume-for-session/plugins/oss-fs-hadoop/flink-oss-fs-hadoop-1.12.2.jar
    type: ''
  name: volume-1618916463815

As the yaml shows, I have to mount host machine paths into the container. I deploy Flink on a k8s cluster with three nodes, so I have to put all my jars on all three nodes. And whenever I change some code, I have to repackage the jars and copy them to all three nodes again. If Flink supported configuring a custom lib path, I could use an Aliyun OSS PV and PVC to mount the OSS path directly, like in another yaml of mine:

containers:
  ...
  volumeMounts:
  - mountPath: /data
    name: volume-trino-volume
  ...
volumes:
- name: volume-trino-volume
  persistentVolumeClaim:
    claimName: trino-volume
...

So if Flink supported a setting like "flink.lib.path: /data/myself/lib", it would be very convenient. I hope that explains what I mean.
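For reference, the OSS-backed PV/PVC I have in mind could be declared roughly as below. This is only a sketch, assuming the Alibaba Cloud OSS CSI plugin (driver `ossplugin.csi.alibabacloud.com`) is installed in the cluster; the bucket name, endpoint url, path, and credential values are illustrative placeholders, and the exact volumeAttributes keys should be checked against the plugin's documentation:

```yaml
# Sketch only: an OSS bucket exposed as a read-only PV via the
# Alibaba Cloud OSS CSI plugin, plus a PVC to bind it.
# All names/values below are hypothetical examples, not my real setup.
apiVersion: v1
kind: PersistentVolume
metadata:
  name: oss-flink-lib-pv
spec:
  capacity:
    storage: 5Gi
  accessModes:
  - ReadOnlyMany
  persistentVolumeReclaimPolicy: Retain
  csi:
    driver: ossplugin.csi.alibabacloud.com
    volumeHandle: oss-flink-lib-pv
    volumeAttributes:
      bucket: my-flink-jars            # placeholder bucket
      url: oss-cn-hangzhou-internal.aliyuncs.com  # placeholder endpoint
      path: /myself/lib                # placeholder path inside the bucket
---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: oss-flink-lib-pvc
spec:
  accessModes:
  - ReadOnlyMany
  storageClassName: ""
  volumeName: oss-flink-lib-pv
  resources:
    requests:
      storage: 5Gi
```

A jobmanager pod could then mount this claim at a path such as /data/myself/lib, which is where a lib-path option like the one I am suggesting would point, instead of copying jars to every node's hostPath.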
-- Sent from: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/