[ 
https://issues.apache.org/jira/browse/SPARK-45115?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17764198#comment-17764198
 ] 

Sumanto Pal edited comment on SPARK-45115 at 9/12/23 1:55 PM:
--------------------------------------------------------------

The challenge is that we are using a firm-recommended fat jar which contains 
slf4j and logging jars that are somewhat customised, and we cannot drop it as 
that is not allowed. So we were hoping Spark could provide a way to exclude 
jars from spark-submit. [~yao]
I believe this Jira will help users avoid conflicts and onboard Spark onto 
their systems more conveniently, as version issues are very common with Spark.

I can work on the code fix if we agree it would be helpful.
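
For illustration only, a hypothetical spark-submit invocation with such an 
exclusion option could look like the following; the property name 
spark.submit.excludeJarsRegex, the class name and the jar name are invented 
placeholders for this sketch and do not exist in Spark 3.4.1:

    spark-submit \
      --class com.example.Main \
      --conf spark.submit.excludeJarsRegex="slf4j-.*\.jar|log4j-.*\.jar" \
      app-fat.jar

The idea would be that any jar under $SPARK_HOME/jars whose file name matches 
the pattern is skipped when the driver and executor classpaths are assembled.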


> No way to exclude jars setting to classpath while doing spark-submit
> --------------------------------------------------------------------
>
>                 Key: SPARK-45115
>                 URL: https://issues.apache.org/jira/browse/SPARK-45115
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Submit
>    Affects Versions: 3.4.1
>            Reporter: Sumanto Pal
>            Priority: Blocker
>   Original Estimate: 336h
>  Remaining Estimate: 336h
>
> The challenge is that whenever you run spark-submit to start the 
> application, the jars present in the Spark home directory get added to the 
> classpath automatically, and there is no way to exclude specific jars from 
> it. For example, we don't want the slf4j jars in the Spark home directory to 
> be put on the classpath, because slf4j is already present in our codebase; 
> this causes jar conflicts. That forces users to change their codebase to 
> support spark-submit or to manually remove the jars from the Spark home 
> directory, which I believe is not the right practice, as we deviate from 
> using Spark as it is supposed to be used, and it leads to hard-to-diagnose 
> failures; for example, LinkageErrors are common with jar conflicts. 
>  
> There is a detailed Stack Overflow question on this issue; refer to: 
> https://stackoverflow.com/questions/76476618/linkageerror-facing-while-doing-spark-submit
>  
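
As an interim workaround sketch, assuming the goal is only that classes from 
the application's fat jar win over Spark's bundled copies (the Spark-provided 
jars are not actually removed from the classpath), the existing experimental 
settings spark.driver.userClassPathFirst and spark.executor.userClassPathFirst 
could be tried; the class name and jar name below are placeholders:

    spark-submit \
      --class com.example.Main \
      --conf spark.driver.userClassPathFirst=true \
      --conf spark.executor.userClassPathFirst=true \
      app-fat.jar

This only changes classloading precedence in favour of the user jars rather 
than excluding the Spark-provided slf4j jars, so it may or may not avoid the 
LinkageError described above.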



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
