Hi Jan and Ioan,
  I have studied some of your papers. Thank you both; your papers and the
information in them have been a great help!
  We have indeed implemented a system designed for EST bioinformatics computing.
  We used two methods to handle a large number of small jobs: first, packaging
the small jobs into "big" jobs; second, using a streamlined dispatcher (or
submitter). Together these two methods reduce the overhead greatly.
  We wrote a special client program to reduce the number of tasks. It simply
compresses several small EST files into one tar archive with the Linux tar
command, then submits the resulting "big" job to the target site via a web
service running in the GT4 container. Admittedly, this method places a burden
on the user, but in the future we will design some simple, general-purpose
APIs to solve this problem.
  A notification mechanism is used to support the streamlined submitter. When a
"big" job completes, the execution program on the target site pushes a
"finished" message to the client program, so the client can immediately submit
a new job to the freed resource.
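The submitter logic described above can be sketched as follows (a simplified, single-process simulation with invented names; the real system delivers notifications over the GT4 web-service layer rather than a local callback):

```python
import queue
import threading

class StreamlinedSubmitter:
    """Notification-driven submitter sketch: keep a fixed number of
    "big" jobs in flight and submit the next one as soon as a
    completion notification arrives, instead of polling the site."""

    def __init__(self, jobs, slots, submit):
        self.pending = queue.Queue()
        for job in jobs:
            self.pending.put(job)
        self.slots = slots
        self.submit = submit            # function that sends a job to a site
        self.done = threading.Event()   # set once all jobs have finished
        self.in_flight = 0
        self.lock = threading.Lock()

    def start(self):
        # Fill every free resource slot with an initial job.
        for _ in range(self.slots):
            self._submit_next()

    def on_finished(self, job_id):
        # Called when the target site pushes a "finished" notification;
        # the freed slot is immediately reused for the next pending job.
        with self.lock:
            self.in_flight -= 1
        self._submit_next()

    def _submit_next(self):
        try:
            job = self.pending.get_nowait()
        except queue.Empty:
            with self.lock:
                if self.in_flight == 0:
                    self.done.set()
            return
        with self.lock:
            self.in_flight += 1
        self.submit(job)
```

The point of the design is that a resource never sits idle waiting for the client to notice a completed job: the push notification triggers the next submission right away.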
  More research is needed on handling large numbers of small jobs, so
discussion and debate will be necessary and welcome. Thank you both again :-)
  Kind regards,
Li Hui