Hi Jose,
The Exec actor has a waitForProcess parameter:
"If true, then actor will wait until subprocess completes. The default
is a boolean of value true."
Maybe try creating a subprocess that reads data and writes data, but
does not exit. I tried using "ping source.eecs.berkeley.edu" as the
command with waitForProcess set to true but I did not get any output,
which seems wrong.
One tip here is to right click on the External Execution (aka Exec)
actor and select "Listen to Actor". This brings up a window that
displays information about what the actor is doing.
In general, the issue here is that it is difficult to know what
constitutes a firing of the External Execution actor. In your case, it
sounds like the Dakota process reads some data and then writes
some data. How would downstream actors know when the Dakota actor is
done writing data?
I could see modifying the Exec actor so that it reads data from its
input port, sends that data to the subprocess and then waits until it
gets a line of data back and sends the data to the output port. The
subprocess would not be terminated and would wait for fresh input. This
would require a little Java programming.
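A minimal sketch of that idea, using plain ProcessBuilder rather than
the actual Exec actor code (the class name is invented here, and "cat"
stands in for a Dakota-like process that reads a line and writes a line
without exiting):

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;

/**
 * Hypothetical sketch: keep one subprocess alive and exchange one
 * line per firing.  The real Exec actor would wrap this kind of
 * logic in Ptolemy's actor framework.
 */
public class LineExchanger {
    private final Process process;
    private final BufferedWriter toProcess;
    private final BufferedReader fromProcess;

    public LineExchanger(String... command) throws Exception {
        process = new ProcessBuilder(command).start();
        toProcess = new BufferedWriter(
                new OutputStreamWriter(process.getOutputStream()));
        fromProcess = new BufferedReader(
                new InputStreamReader(process.getInputStream()));
    }

    /** Send one line to the subprocess and block for one line back. */
    public String exchange(String line) throws Exception {
        toProcess.write(line);
        toProcess.newLine();
        toProcess.flush(); // don't let the line sit in the buffer
        return fromProcess.readLine();
    }

    /** Terminate the subprocess when the workflow is done. */
    public void close() {
        process.destroy();
    }

    public static void main(String[] args) throws Exception {
        // "cat" echoes each line back, standing in for a process
        // that reads input and writes output without exiting.
        LineExchanger exchanger = new LineExchanger("cat");
        System.out.println(exchanger.exchange("Foo"));
        System.out.println(exchanger.exchange("Bar"));
        exchanger.close();
    }
}
```

Each call to exchange() would correspond to one firing of the modified
actor; the subprocess survives between calls, which is the behavior
you are after.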
In terms of interfacing to external programs, there are several ways.
Kepler uses Ptolemy as its runtime execution engine and I'm more
familiar with Ptolemy, so I'll discuss this from the Ptolemy viewpoint.
Kepler has an interface to R. I'm not that familiar with it, so the R
documentation is probably the place to start.
Kepler and Ptolemy have a JNI-based interface to Matlab that uses a
Matlab C-based shared library to execute code. Developing such an
interface is fairly complex. These days, rather than using JNI, most
people use JNA, which is easier to use than JNI.
Ptolemy includes BCVTB in $PTII/lbnl. BCVTB uses sockets to connect to
subprocesses. There is a small C example there.
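As a rough illustration of the socket approach (this is not the actual
BCVTB protocol; the class name and the echoing stand-in for the external
process are invented for the sketch): the workflow side listens on a
port, the external process connects back, and the two exchange one line
per iteration.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.io.UncheckedIOException;
import java.net.ServerSocket;
import java.net.Socket;

/** Hypothetical sketch of socket-based coupling to a subprocess. */
public class SocketCoupler {

    /** Send one line to an "external" process over a socket; return its reply. */
    static String roundTrip(String message) throws Exception {
        try (ServerSocket server = new ServerSocket(0)) { // ephemeral port
            int port = server.getLocalPort();
            // Stand-in for the external simulator: connects back and
            // echoes every line it receives, with a prefix.
            Thread external = new Thread(() -> {
                try (Socket s = new Socket("localhost", port);
                     BufferedReader in = new BufferedReader(
                             new InputStreamReader(s.getInputStream()));
                     PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
                    String line;
                    while ((line = in.readLine()) != null) {
                        out.println("echo:" + line);
                    }
                } catch (IOException e) {
                    throw new UncheckedIOException(e);
                }
            });
            external.start();
            String reply;
            try (Socket conn = server.accept();
                 PrintWriter out = new PrintWriter(conn.getOutputStream(), true);
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(conn.getInputStream()))) {
                out.println(message);
                reply = in.readLine();
            } // closing the connection lets the external thread exit
            external.join();
            return reply;
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(roundTrip("step1"));
    }
}
```

In a real setup the external side would be a separate C or Fortran
program (as in the small C example in $PTII/lbnl) rather than a thread
in the same JVM.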
We are working on implementing the Functional Mock-up Interface (FMI) in
Ptolemy. This work is not complete. FMI is an emerging standard that
is primarily for use with Modelica, but could be useful for situations
like yours.
The way to proceed would be to check the documentation and mailing lists
for R, the Exec actor, Matlab and BCVTB.
On 12/7/12 10:42 AM, Jose Borreguero wrote:
Thanks for the reply!
In your example workflow, it seems that kepler creates a new
subprocess every time the loop is executed, because sed s/Foo/Bar/
exits after execution. Unfortunately, I need the subprocess owned
by the external actor to persist across all loop iterations.
Every iteration it should receive the input and create an output, but
the process should not die.
Jose
On Fri, Dec 7, 2012 at 1:34 PM, Christopher Brooks
<[email protected]> wrote:
I'm not that familiar with Kepler's ExternalExecution actor, but
it looks like it uses ptolemy.actor.lib.Exec.
The Exec actor reads data from an input port, executes a
subprocess and places the output on the output port.
You would need p1 to pass parameters to Dakota in a format that
Dakota would understand.
You would need to process the Dakota output into a format that
your p3 process would understand.
In Synchronous Dataflow, you would need to put a SampleDelay in
the loop.
I've attached an XML file where the initial value of the
SampleDelay is the string "Foo"
The output of the SampleDelay is connected to the input of the
ExternalExecution actor.
The ExternalExecution actor executes the command
sed s/Foo/Bar/
During the first iteration, the value of the input is "Foo", so
the ExternalExecution actor substitutes "Bar".
For the second and subsequent iterations, the output is "Bar".
I'm not sure how Dakota takes parameters. Typically, these sorts
of tools can take parameters via a file, via the command line or
via an input.
You would need to write code that would take the value of Kepler
parameters and generate Dakota parameters.
See the ExpressionReader, ExpressionWriter, Token To Expression
and Expression To Token actors.
_Christopher
On 12/5/12 9:25 AM, Jose Borreguero wrote:
Dear kepler users,
I'm new to Kepler and trying to figure out how to embed a call to
the Dakota optimization package within a workflow.
When I start the workflow, I spawn a new process (process ID p1).
Using the ExternalExecution actor will create a new process for
Dakota (p2). The problem is that Dakota in turn creates a new
process (p3) in order to evaluate the cost function for a given
set of values of the parameters to be optimized. This cost
function is a multi-step calculation which I wanted to model as
part of the overall workflow. The problem is that the cost
function has process ID p3, but the workflow has process ID p1.
How can I connect all these different processes in a single
workflow? Is this possible?
Below is a very simplified workflow. Because it is an
optimization process, there is a loop between Dakota and
CostFunction evaluations.
InitParams(p1) ---> ExternalActorDakota(p2) ---> CompositeActorCostFunction(p3) ---+
       ^                                                                           |
       +---------------------------------------------------------------------------+
Any help is much appreciated
Jose
_______________________________________________
Kepler-users mailing list
[email protected] <mailto:[email protected]>
http://lists.nceas.ucsb.edu/kepler/mailman/listinfo/kepler-users
--
Christopher Brooks, PMP                      University of California
CHESS Executive Director                     US Mail: 337 Cory Hall
Programmer/Analyst CHESS/Ptolemy/Trust       Berkeley, CA 94720-1774
ph: 510.643.9841 (Office: 545Q Cory)
home: (F-Tu) 707.665.0131  cell: 707.332.0670