[ 
https://issues.apache.org/jira/browse/SPARK-41119?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Lingaraj Gowdar updated SPARK-41119:
------------------------------------
    Description: 
*Background -*

I was trying to run multiple tests by passing a script / file to the 
different Spark shells. All shells except PySpark accept a script as a 
command-line argument and process it. The only alternative for PySpark is to 
redirect a file to the shell's stdin with the (<) operator, like so:

_# pyspark --master yarn --deploy-mode client < python-example-script.py_

 

*Feature suggested -*

PySpark can be used only in client mode, so if the same functionality were 
added to PySpark, it would help anyone who wants to try out PySpark directly 
instead of running scripts via spark-submit.

spark-submit supports everything the Spark shells can do _(including 
accepting scripts with the -i or -f option)_, but if a script / file can be 
passed as an argument to the other Spark shells (spark-shell, spark-sql), 
the same could be provided for PySpark.
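For reference, the stdin-redirection workaround described above behaves the same with any interactive interpreter. A minimal, runnable sketch using plain python3 as a stand-in for the pyspark shell (the script path and its contents are illustrative):

```shell
# Write a tiny stand-in script (illustrative name and contents).
cat > /tmp/python-example-script.py <<'EOF'
print("rows:", 2 + 3)
EOF

# Redirect the script to the interpreter's stdin -- the same workaround
# the report uses with:
#   pyspark --master yarn --deploy-mode client < python-example-script.py
# The interpreter reads the file as if its lines were typed interactively.
python3 < /tmp/python-example-script.py
# prints: rows: 5
```

The limitation the report raises is that, unlike `spark-shell -i file.scala` or `spark-sql -f file.sql`, there is no equivalent command-line option for pyspark, so stdin redirection is the only way to feed it a script without going through spark-submit.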

 

 

  was:
*Background -*

I was trying to run multiple tests by passing a script / file to the 
different Spark shells. All shells except PySpark accept a script as a 
command-line argument and process it. The only alternative for PySpark is to 
redirect a file to the shell's stdin with the (<) operator, like so:

_# pyspark --master yarn --deploy-mode client < python-example-script.py_

 

*Improvement suggested -*

PySpark can be used only in client mode, so if the same functionality were 
added to PySpark, it would help anyone who wants to try out PySpark directly 
instead of running scripts via spark-submit.

spark-submit supports everything the Spark shells can do _(including 
accepting scripts with the -i or -f option)_, but if a script / file can be 
passed as an argument to the other Spark shells (spark-shell, spark-sql), 
the same could be provided for PySpark.

 

 


> Python file to be passed as argument using command-line option for PySpark
> --------------------------------------------------------------------------
>
>                 Key: SPARK-41119
>                 URL: https://issues.apache.org/jira/browse/SPARK-41119
>             Project: Spark
>          Issue Type: New Feature
>          Components: PySpark
>    Affects Versions: 3.2.2
>            Reporter: Lingaraj Gowdar
>            Priority: Major
>
> *Background -*
> I was trying to run multiple tests by passing a script / file to the 
> different Spark shells. All shells except PySpark accept a script as a 
> command-line argument and process it. The only alternative for PySpark is 
> to redirect a file to the shell's stdin with the (<) operator, like so:
> _# pyspark --master yarn --deploy-mode client < python-example-script.py_
>  
> *Feature suggested -*
> PySpark can be used only in client mode, so if the same functionality were 
> added to PySpark, it would help anyone who wants to try out PySpark 
> directly instead of running scripts via spark-submit.
> spark-submit supports everything the Spark shells can do _(including 
> accepting scripts with the -i or -f option)_, but if a script / file can 
> be passed as an argument to the other Spark shells (spark-shell, 
> spark-sql), the same could be provided for PySpark.
>  
>  



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
