[ https://issues.apache.org/jira/browse/SPARK-45115?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sumanto Pal updated SPARK-45115:
--------------------------------
    Issue Type: New Feature  (was: Bug)

> No way to exclude jars setting to classpath while doing spark-submit
> --------------------------------------------------------------------
>
>                 Key: SPARK-45115
>                 URL: https://issues.apache.org/jira/browse/SPARK-45115
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Submit
>    Affects Versions: 3.4.1
>            Reporter: Sumanto Pal
>            Priority: Blocker
>   Original Estimate: 336h
>  Remaining Estimate: 336h
>
> The challenge is that whenever you run spark-submit to start an application, the jars present in the Spark home directory are added to the classpath automatically, and there is no way to exclude specific jars. For example, we don't want the slf4j jars in the Spark home directory to be put on the classpath, because slf4j is already present in our codebase; this causes jar conflicts. Users are forced either to change their codebase to support spark-submit or to manually remove jars from the Spark home directory. I believe this is not good practice, since it deviates from using Spark as intended and causes hard-to-diagnose failures; linkage errors, for example, are common with jar conflicts.
>
> There is a detailed Stack Overflow question on this issue:
> https://stackoverflow.com/questions/76476618/linkageerror-facing-while-doing-spark-submit

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
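Until an exclusion mechanism exists, a partial workaround (a sketch, not a full solution) is Spark's existing `userClassPathFirst` settings, which make the application's own jars win when the same classes also ship in the Spark home directory. The class name and jar paths below are placeholders:

```shell
# Sketch of a partial workaround: prefer the application's jars over the
# versions bundled in the Spark home directory. This changes resolution
# order only; it does not remove any jar from the classpath.
# com.example.MyApp and the jar paths are placeholder values.
spark-submit \
  --class com.example.MyApp \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  --jars /path/to/app-slf4j.jar \
  /path/to/my-app.jar
```

Note that this only reorders class resolution, which is why it does not fully address the feature requested here: conflicting jars still remain on the classpath and can surface in other ways.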