zhouyun0242 opened a new issue, #15519:
URL: https://github.com/apache/dolphinscheduler/issues/15519

   ### Search before asking
   
   - [X] I had searched in the 
[issues](https://github.com/apache/dolphinscheduler/issues?q=is%3Aissue) and 
found no similar issues.
   
   
   ### What happened
   
   First, create a workflow and add a DataX task to it. The SQL defined in the DataX task is as follows:
   
   `select * from table where to_char(create_time, 'yyyy-mm-dd hh24:mi') >= '${biz_start_time}' and to_char(create_time, 'yyyy-mm-dd hh24:mi') <= '${biz_end_time}'`
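   For context, DolphinScheduler resolves the `${...}` placeholders in the SQL from the workflow's parameters before the task runs. A minimal sketch of that substitution (a hypothetical helper for illustration, not the project's actual code):
   
   ```python
   import re
   
   def substitute(sql: str, params: dict) -> str:
       """Replace ${name} placeholders with values from params."""
       return re.sub(r"\$\{(\w+)\}", lambda m: params[m.group(1)], sql)
   
   sql = ("select * from t where to_char(create_time, 'yyyy-mm-dd hh24:mi') "
          ">= '${biz_start_time}' and to_char(create_time, 'yyyy-mm-dd hh24:mi') "
          "<= '${biz_end_time}'")
   print(substitute(sql, {"biz_start_time": "2024-01-03 18:56",
                          "biz_end_time": "2024-01-03 18:57"}))
   ```
   
   Note that the substituted values contain a space between the date and the time, which matters once they reach the generated shell command line.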
   
   Finally, save the workflow and add biz_start_time and biz_end_time as global variables of the workflow.
   
   Run the workflow, assigning values to biz_start_time and biz_end_time. The run failed, and the log indicates a problem with the DataX launch command. The log is as follows:
   
   > ${PYTHON_LAUNCHER} ${DATAX_LAUNCHER} --jvm="-Xms1G -Xmx1G" -p 
"-Dsystem.task.definition.name='datax-test' -Dsystem.project.name='null' 
-Dsystem.biz.curdate='20240122' -Dsystem.task.instance.id='1' 
-DStartParams='{"biz_start_time":"2024-01-03 18:56","biz_end_time":"2024-01-03 
18:57"}' -Dsystem.task.definition.code='12350595021984' 
-Dsystem.datetime='20240122183205' -Dbiz_end_time='2024-01-03 18:57' 
-Dbiz_start_time='2024-01-03 18:56' -Dsystem.project.code='12350591796128' 
-Dsystem.workflow.instance.id='1' -Dsystem.biz.date='20240121' 
-Dsystem.workflow.definition.name='test工作流' 
-Dsystem.workflow.definition.code='12350605116448'" 
/tmp/dolphinscheduler/exec/process/default/12350591796128/12350605116448_2/1/1/1_1_job.json
   [INFO] 2024-01-22 18:32:05.696 +0800 - Executing shell command : sudo -u 
root -i 
/tmp/dolphinscheduler/exec/process/default/12350591796128/12350605116448_2/1/1/1_1.sh
   [INFO] 2024-01-22 18:32:05.701 +0800 - process start, process id is: 14195
   [INFO] 2024-01-22 18:32:06.701 +0800 -  -> 
        
        DataX (DATAX-OPENSOURCE-3.0), From Alibaba !
        Copyright (C) 2010-2017, Alibaba Group. All Rights Reserved.
        
        
        Usage: datax.py [options] job-url-or-path
        
        Options:
          -h, --help            show this help message and exit
        
          Product Env Options:
            Normal user use these options to set jvm parameters, job runtime 
mode
            etc. Make sure these options can be used in Product Env.
        
            -j <jvm parameters>, --jvm=<jvm parameters>
                                Set jvm parameters if necessary.
            --jobid=<job unique id>
                                Set job unique id when running by 
Distribute/Local
                                Mode.
            -m <job runtime mode>, --mode=<job runtime mode>
                                Set job runtime mode such as: standalone, local,
                                distribute. Default mode is standalone.
            -p <parameter used in job config>, --params=<parameter used in job 
config>
                                Set job parameter, eg: the source tableName you 
want
                                to set it by command, then you can use like 
this:
                                -p"-DtableName=your-table-name", if you have 
mutiple
                                parameters: -p"-DtableName=your-table-name
                                -DcolumnName=your-column-name".Note: you should 
config
                                in you job tableName with ${tableName}.
            -r <parameter used in view job config[reader] template>, 
--reader=<parameter used in view job config[reader] template>
                                View job config[reader] template, eg:
                                mysqlreader,streamreader
            -w <parameter used in view job config[writer] template>, 
--writer=<parameter used in view job config[writer] template>
                                View job config[writer] template, eg:
                                mysqlwriter,streamwriter
        
          Develop/Debug Options:
            Developer use these options to trace more details of DataX.
        
            -d, --debug         Set to remote debug mode.
            --loglevel=<log level>
                                Set log level such as: debug, info, all etc.
   [INFO] 2024-01-22 18:32:06.703 +0800 - process has exited. execute 
path:/tmp/dolphinscheduler/exec/process/default/12350591796128/12350605116448_2/1/1,
 
   processId:14195 ,exitStatusCode:255 ,processWaitForStatus:true 
,processExitValue:255
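   A plausible explanation (my assumption from the log, not confirmed against the 3.2.0 source) is that the parameter values containing spaces (e.g. `2024-01-03 18:56`) end up unquoted in the generated shell script, so the value splits into two argv tokens, `datax.py` sees an unexpected extra positional argument, and it prints its usage text and exits non-zero. A minimal sketch of the tokenization difference:
   
   ```python
   import shlex
   
   # If the generated shell script drops the quotes around a value that
   # contains a space, the date splits into two argv tokens:
   broken = shlex.split("datax.py -p -Dbiz_start_time=2024-01-03 18:56 job.json")
   print(broken)  # ['datax.py', '-p', '-Dbiz_start_time=2024-01-03', '18:56', 'job.json']
   
   # With the value properly quoted, it survives as a single token:
   quoted = shlex.split("datax.py -p '-Dbiz_start_time=2024-01-03 18:56' job.json")
   print(quoted)  # ['datax.py', '-p', '-Dbiz_start_time=2024-01-03 18:56', 'job.json']
   ```
   
   This would also be consistent with v3.1.9 working if its script generator quoted the `-p` values differently.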
   
   
   
   ### What you expected to happen
   
   I expect the workflow to run successfully and DataX to extract the data normally.
   
   Important note: after testing, the stable version v3.1.9 does not have this issue.
   
   ### How to reproduce
   
   Deploy the DolphinScheduler 3.2.0 standalone version, create a new workflow, add a DataX task to it, define global variables for the workflow, and then run the workflow to reproduce the problem.
   
   ### Anything else
   
   _No response_
   
   ### Version
   
   3.2.x
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://www.apache.org/foundation/policies/conduct)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: 
[email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

Reply via email to