GitHub user vanzin commented on the pull request:

    https://github.com/apache/spark/pull/1845#issuecomment-52818953
  
    @pwendell my proposal is that the scala code should take care of figuring
out all the arguments to be passed to the JVM / SparkSubmit. In my example, the
python program stands in for the JVM invocation (i.e. replace "python -c ...
${x[@]}" with "java ${x[@]}", and the "test" function would be an invocation of
the "SparkSubmitBootstrap" class). Note there is no string parsing; I'm using
"IFS" to say "each line is an element in the array", so the bootstrap code
would print one argument per line (not a single line with all the arguments).
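    
    As a minimal sketch of what I mean (names here are hypothetical, not the
actual patch): the "bootstrap" function below stands in for running a
SparkSubmitBootstrap-style class on the JVM, and the caller uses IFS to turn
its output into an argument array, one line per argument:
    
        #!/usr/bin/env bash
        # Stand-in for the JVM bootstrap: prints one argument per line.
        bootstrap() {
          printf '%s\n' --class org.example.Main --name 'my app with spaces'
        }
        
        # With IFS set to a newline, each printed line becomes exactly one
        # array element, so arguments containing spaces survive unquoted.
        old_ifs=$IFS
        IFS=$'\n'
        args=($(bootstrap))
        IFS=$old_ifs
        
        # In the real script this would be: exec java "${args[@]}"
        printf 'arg: [%s]\n' "${args[@]}"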
    
    I was just trying to show that it's possible to handle quoting, since 
@andrewor14 did not think it was possible.
    
    I'm ok with you guys just going with the current approach, but I think all 
this logic that is now split between bash and scala is getting way 
overcomplicated, and it would be much better to have everything in scala.

