Hi,

I’m starting to use the HCP Pipelines on the WashU CHPC cluster, and I’m a bit 
confused about how jobs are submitted to it, so I’d like to confirm a few 
details.
As far as I can tell, since the cluster takes PBS jobs, and because fsl_sub 
only works with Sun Grid Engine(?), the way to run a Pipelines script 
would be to create a PBS script that executes the Pipelines script with the 
'--runlocal' command line option.
For example, to run the PreFreeSurferPipelineBatch.sh script, the PBS script 
command would be './PreFreeSurferPipelineBatch.sh --runlocal'
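
Concretely, the PBS wrapper script I have in mind would look something like 
this (the paths, environment setup, and resource requests below are just 
placeholders I made up, not values tested on CHPC):

#!/bin/bash
#PBS -N PreFreeSurfer
#PBS -l nodes=1:ppn=1,walltime=24:00:00,mem=8gb
# Resource requests above are placeholders; adjust for your data and cluster.

# Hypothetical paths -- set these for your own installation.
export HCPPIPEDIR=/path/to/HCPpipelines
source ${HCPPIPEDIR}/Examples/Scripts/SetUpHCPPipeline.sh

# Run the batch script on the allocated node instead of queuing via fsl_sub.
cd ${PBS_O_WORKDIR}
./PreFreeSurferPipelineBatch.sh --runlocal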

From the code(*) I assume that, given this option, the fsl_sub queuing is 
bypassed, and when the job is submitted to the cluster it will use the 
resources specified in the PBS script.
Is that correct? 

Thanks,
Carolina.

______________________
(*) For example, these lines in PreFreeSurferPipelineBatch.sh:

if [ -n "${command_line_specified_run_local}" ] ; then
    echo "About to run ${HCPPIPEDIR}/PreFreeSurfer/PreFreeSurferPipeline.sh"
    queuing_command=""
else
    echo "About to use fsl_sub to queue or run ${HCPPIPEDIR}/PreFreeSurfer/PreFreeSurferPipeline.sh"
    queuing_command="${FSLDIR}/bin/fsl_sub ${QUEUE}"
fi





_______________________________________________
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users
