zhongjiajie commented on issue #6881: [AIRFLOW-6326] Sort cli commands and arg
URL: https://github.com/apache/airflow/pull/6881#issuecomment-570850316
 
 
   > What do you think about not ignoring leading dash (key=lambda d: d.lstrip('-'))?

   I think we should keep the dash. When we use a single dash, the option is just a flag that turns some behaviour on or off, but a double dash means the option takes a parameter from the command line. IMO they're different, and I think we should sort them separately:
   ```text
   usage: airflow tasks run [-h] [--cfg_path CFG_PATH] [--pool POOL] [--ship_dag]
                            [-A] [-f] [-i] [-I] [-int] [-l] [-m] [-p PICKLE]
                            [-sd SUBDIR]
                            dag_id task_id execution_date

   positional arguments:
     dag_id                The id of the dag
     task_id               The id of the task
     execution_date        The execution date of the DAG

   optional arguments:
     -h, --help            show this help message and exit
     --cfg_path CFG_PATH   Path to config file to use instead of airflow.cfg
     --pool POOL           Resource pool to use
     --ship_dag            Pickles (serializes) the DAG and ships it to the
                           worker
     -A, --ignore_all_dependencies
                           Ignores all non-critical dependencies, including
                           ignore_ti_state and ignore_task_deps
     -f, --force           Ignore previous task instance state, rerun
                           regardless if task already succeeded/failed
     -i, --ignore_dependencies
                           Ignore task-specific dependencies, e.g. upstream,
                           depends_on_past, and retry delay dependencies
     -I, --ignore_depends_on_past
                           Ignore depends_on_past dependencies (but respect
                           upstream dependencies)
     -int, --interactive   Do not capture standard output and error streams
                           (useful for interactive debugging)
     -l, --local           Run the task using the LocalExecutor
     -m, --mark_success    Mark jobs as succeeded without running them
     -p PICKLE, --pickle PICKLE
                           Serialized pickle object of the entire dag (used
                           internally)
     -sd SUBDIR, --subdir SUBDIR
                           File location or directory from which to look for
                           the dag. Defaults to '[AIRFLOW_HOME]/dags' where
                           [AIRFLOW_HOME] is the value you set for
                           'AIRFLOW_HOME' config you set in 'airflow.cfg'
   ```
   
   I think the output above is better than the one below:
   
   ```text
   usage: airflow tasks run [-h] [-A] [--cfg_path CFG_PATH] [-f] [-i] [-I] [-int]
                            [-l] [-m] [-p PICKLE] [--pool POOL] [-sd SUBDIR]
                            [--ship_dag]
                            dag_id task_id execution_date

   positional arguments:
     dag_id                The id of the dag
     task_id               The id of the task
     execution_date        The execution date of the DAG

   optional arguments:
     -h, --help            show this help message and exit
     -A, --ignore_all_dependencies
                           Ignores all non-critical dependencies, including
                           ignore_ti_state and ignore_task_deps
     --cfg_path CFG_PATH   Path to config file to use instead of airflow.cfg
     -f, --force           Ignore previous task instance state, rerun
                           regardless if task already succeeded/failed
     -i, --ignore_dependencies
                           Ignore task-specific dependencies, e.g. upstream,
                           depends_on_past, and retry delay dependencies
     -I, --ignore_depends_on_past
                           Ignore depends_on_past dependencies (but respect
                           upstream dependencies)
     -int, --interactive   Do not capture standard output and error streams
                           (useful for interactive debugging)
     -l, --local           Run the task using the LocalExecutor
     -m, --mark_success    Mark jobs as succeeded without running them
     -p PICKLE, --pickle PICKLE
                           Serialized pickle object of the entire dag (used
                           internally)
     --pool POOL           Resource pool to use
     -sd SUBDIR, --subdir SUBDIR
                           File location or directory from which to look for
                           the dag. Defaults to '[AIRFLOW_HOME]/dags' where
                           [AIRFLOW_HOME] is the value you set for
                           'AIRFLOW_HOME' config you set in 'airflow.cfg'
     --ship_dag            Pickles (serializes) the DAG and ships it to the
                           worker
   ```
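
   For reference, here is a minimal sketch (not the code in this PR) of how the two sort keys under discussion would order a handful of the option strings above; the lowercase folding in both keys is just an assumption to keep the illustration readable:

   ```py
   # Hypothetical example only, not the PR implementation.
   flags = ["--cfg_path", "-A", "--pool", "-f", "--ship_dag", "-sd"]

   # Keeping the leading dashes: '--' sorts before '-<letter>', so the
   # double-dash options group together first (like the first help output).
   print(sorted(flags, key=str.lower))
   # ['--cfg_path', '--pool', '--ship_dag', '-A', '-f', '-sd']

   # Stripping the leading dashes (the suggested key): single- and double-dash
   # options interleave purely by name (like the second help output).
   print(sorted(flags, key=lambda d: d.lstrip('-').lower()))
   # ['-A', '--cfg_path', '-f', '--pool', '-sd', '--ship_dag']
   ```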
   
   What do you think? @mik-laj
