It turns out that part of my .tcshrc contains a block of commands that is basically:

if in interactive shell
  set useful stuff
endif

So I think that if I move the 'useful stuff' block out of the interactive guard
into the rest of the .tcshrc, those variables will get set when the shell is
started to run my scripts.
Alternatively, I can just copy those variable commands into the PBS scripts
themselves, which I think will be easier to maintain anyway. Eventually I'm
going to write a Perl script to interactively generate these scripts for me and
execute them.
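A minimal sketch of what that restructured .tcshrc could look like; the variable
names and paths below are illustrative assumptions, not taken from the actual file:

```shell
# ~/.tcshrc -- sketch only; variable names and paths are hypothetical
# Environment needed by batch jobs goes OUTSIDE the interactive guard,
# so non-interactive shells (such as those PBS starts) also pick it up.
setenv PATH "${PATH}:/usr/local/myapp/bin"      # hypothetical app path
setenv LD_LIBRARY_PATH "/usr/local/myapp/lib"   # hypothetical lib path

# Interactive-only settings stay inside the guard.
# In tcsh, $?prompt is set only for interactive shells.
if ($?prompt) then
    alias ll 'ls -l'
    set prompt = '%n@%m:%~%# '
endif
```

Since this is a shell startup file rather than a standalone program, it only
takes effect in shells started after the change.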
Thanks for your help,
Brian

-----Original Message-----
From: Jeremy Enos [mailto:[EMAIL PROTECTED]]
Sent: Thursday, January 09, 2003 7:19 PM
To: Brian Williams
Cc: [EMAIL PROTECTED]
Subject: RE: [Oscar-users] Running a single job on multiple machines


Yes, it starts new shells on each machine, and doesn't necessarily carry
all the env vars with it.  I suggest setting them in the script which you
feed to qsub, although that is also only run on a single compute node.  If
you want the environment propagated to every node in your job, it gets worse:
you have to write a script which is called by your parallel launcher
(pbsdsh, mpirun, or whatever you choose), which is called by your qsub
target script, which is called by qsub.  Hope you followed that; it's ugly,
but it's the reality.
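The layering described above can be sketched as two files; the filenames,
node count, and environment variables here are hypothetical, assuming pbsdsh
as the parallel launcher:

```shell
#!/bin/csh
# job.pbs -- hypothetical qsub target script, submitted with: qsub job.pbs
#PBS -l nodes=4
# This script runs only on the first allocated node, so the per-node
# environment must be set again inside the wrapper that pbsdsh launches.
pbsdsh /home/brian/wrapper.csh    # runs wrapper.csh once on every node

# --- wrapper.csh (separate file, also hypothetical) ---
# #!/bin/csh
# setenv MYAPP_HOME /usr/local/myapp    # re-set the environment on each node
# $MYAPP_HOME/bin/myprog                # the actual per-node work
```

So the chain is: qsub starts job.pbs on one node, job.pbs calls pbsdsh, and
pbsdsh starts wrapper.csh on every node, where the environment is finally set.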

         Jeremy


_______________________________________________
Oscar-users mailing list
[EMAIL PROTECTED]
https://lists.sourceforge.net/lists/listinfo/oscar-users