Re: [Factor-talk] how to run-pipeline

2015-09-24 Thread Björn Lindqvist
2015-09-23 5:46 GMT+02:00 HP Wei :
> My original issue was to construct the strings “cmd1 -a A -b B” and “cmd2 -c C”
> in a flexible way so that I can choose to supply (or not supply) those
> arguments A, B, C.
> And find a clean way to put everything together into { … } for run-pipeline.

Maybe with some helper words:

: render-command ( cmd args -- string )
    [ dup array? [ " " join ] when ] map " " join " " glue ;

: make-pipeline ( seq -- seq' )
    [ first2 render-command ] map ;

:: special-pipeline ( a-arg b-arg -- pipeline )
    a-arg "def-a" or :> real-a
    b-arg "def-b" or :> real-b
    { { "ls" { } } { "grep" { { "-a" real-a } { "-b" real-b } "--verbose" } } }
    make-pipeline ;

Or simpler:

: special-pipeline ( a-arg b-arg -- pipeline )
    [ "def-a" or ] [ "def-b" or ] bi* "grep -a %s -b %s" sprintf
    "ls" swap 2array ;

IN: scratchpad f f special-pipeline
{ "ls" "grep -a def-a -b def-b" }

Factor doesn't have words with a variable number of arguments, so you
supply f instead and then the default is picked.
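
The defaulting-and-rendering idea above can be sketched in Python for
comparison (the names mirror the Factor words but are hypothetical; this is
a rough analogy, not a translation):

```python
def render_command(cmd, args):
    # Flatten nested flag/value pairs with spaces, then join the
    # command name with its rendered arguments -- the same shape
    # as the Factor render-command word.
    parts = [" ".join(a) if isinstance(a, list) else a for a in args]
    return " ".join([cmd] + parts)

def special_pipeline(a_arg=None, b_arg=None):
    # Passing None plays the role of passing f in Factor:
    # the default value is picked instead.
    real_a = a_arg or "def-a"
    real_b = b_arg or "def-b"
    return [
        render_command("ls", []),
        render_command("grep", [["-a", real_a], ["-b", real_b], "--verbose"]),
    ]
```

So `special_pipeline(None, "B")` keeps the default for `-a` but uses `"B"`
for `-b`.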


-- 
mvh/best regards Björn Lindqvist

___
Factor-talk mailing list
Factor-talk@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/factor-talk


Re: [Factor-talk] how to run-pipeline

2015-09-23 Thread Alex Vondrak
>
> Your suggestion in another email of implementing >process looks
> interesting to explore.
> Any example usage in this case ?
>

Not really, I'm afraid. The only two implemented methods are here:
https://github.com/slavapestov/factor/blob/master/basis/io/launcher/launcher.factor#L118-L122
It shouldn't be too hard to define your own >process method for your special
class that would serialize to a string/array, then call >process on that
string/array. In fact, it could probably just look like

M: cmd >process cmd>string >process ;

where you implement cmd>string accordingly.
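
The serialize-then-delegate pattern behind that method can be sketched in
Python (the `Cmd` class and `to_process` helper are hypothetical names, not
part of any real API):

```python
import shlex

class Cmd:
    """A command object; to_string plays the role of cmd>string
    in the Factor sketch above."""
    def __init__(self, name, **flags):
        self.name = name
        self.flags = flags

    def to_string(self):
        parts = [self.name]
        for flag, value in self.flags.items():
            parts += [f"-{flag}", value]
        return " ".join(parts)

def to_process(desc):
    # Like the >process sketch: a Cmd serializes itself to a string,
    # then is handled exactly like a plain string descriptor.
    if isinstance(desc, Cmd):
        return to_process(desc.to_string())
    return shlex.split(desc)
```

The point is that the custom object only needs to know how to render itself;
everything downstream keeps working on strings.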

You might still be able to go the `make` route, since strings and arrays
are already launch descriptors (
http://docs.factorcode.org/content/article-io.launcher.descriptors.html)
that will work when wrapped in processes. Since strings are sequences,
`make` can build them too. For example,
https://github.com/slavapestov/factor/blob/master/basis/csv/csv.factor#L33-L34
Notice from that example that `,` and `%` are called from separate words:

https://github.com/slavapestov/factor/blob/master/basis/csv/csv.factor#L23
https://github.com/slavapestov/factor/blob/master/basis/csv/csv.factor#L26
https://github.com/slavapestov/factor/blob/master/basis/csv/csv.factor#L31

Since `make` builds the sequence in a dynamically scoped variable, you can
split up your calls to `,` and `%` across your own factored out words, and
even have nested calls to `make`:

IN: scratchpad USE: make
IN: scratchpad : a, ( -- ) CHAR: a , ;
IN: scratchpad : b, ( -- ) CHAR: b , ;
IN: scratchpad : c, ( -- ) CHAR: c , ;
IN: scratchpad : make-string ( -- str ) [ a, b, c, ] "" make ;
IN: scratchpad : string, ( -- ) make-string , ;
IN: scratchpad [ string, a, string, b, string, c, ] { } make .
{ "abc" 97 "abc" 98 "abc" 99 }

> (1) invoke custom command in shell:  cmd1 -a A -b B
>  The result is printed out to stdout.
> (2) Take the output of (1) and select (or manipulate) the lines
>   And print them to stdout
> (3) invoke another custom command: cmd2 -c C…
>

Ah, that makes more sense, thanks. I'm glad you found out that you can use
quotations in `run-pipeline` - I didn't even know that! So I learned
something. :)

> My original issue was to construct the strings “cmd1 -a A -b B” and “cmd2 -c C”
> in a flexible way so that I can choose to supply (or not supply) those
> arguments A, B, C.
> And find a clean way to put everything together into { … } for
> run-pipeline.
>

Ah, yes, optional arguments are an interesting case. Given those
constraints, your <cmd1> objects may indeed be the best way of constructing
the individual "cmd -flag1 value1 ..." strings, rather than `make`-ing raw
strings. Although `make` might still be useful for building the array
that's passed to `run-pipeline`.

Let us know how it goes!



Re: [Factor-talk] how to run-pipeline

2015-09-22 Thread HP Wei
Thanks, Alex, for pointing out the ‘make’ word, which is something new to
me.
I will study the example usages that you listed.



I realize that I did not make my original intention clear enough.
Here is what I want to do in factor:

(1) invoke custom command in shell:  cmd1 -a A -b B
 The result is printed out to stdout.
(2) Take the output of (1) and select (or manipulate) the lines
  And print them to stdout
(3) invoke another custom command: cmd2 -c C…

So, in factor, from what I learned so far, to accomplish the above I can do

{ "cmd1 -a A -b B" [ quot ] "cmd2 -c C" } run-pipeline

My original issue was to construct the strings "cmd1 -a A -b B" and "cmd2 -c C"
in a flexible way so that I can choose to supply (or not supply) those
arguments A, B, C.
And find a clean way to put everything together into { … } for run-pipeline.

By the way,
Your suggestion in another email of implementing >process looks interesting to 
explore.
Any example usage in this case ?

Thanks
HP
 



Re: [Factor-talk] how to run-pipeline

2015-09-22 Thread Alex Vondrak
>
> Ultimately, I may also insert some factor quot in betweeen
> str1 and str2 to do some processing before handing the
> result to cmd2.


Do you mean you want to take the output of running cmd1, manipulate it,
then pass *that* to cmd2? Because that sounds rather different from what
your example code looks like it's actually trying to do.

It seems like your example is trying to construct launch descriptors
independently, then pass those entire results to run-pipeline at once.
Which is altogether easier: if I understand right, you're basically there
already, but your main concern is more about how to build the array in a
prettier way? If that's it, I suggest the `make` vocabulary:
http://docs.factorcode.org/content/article-namespaces-make.html

Some examples of `make` usage in the wild:
https://github.com/slavapestov/factor/blob/master/basis/io/backend/unix/unix-tests.factor#L142-L147
https://github.com/slavapestov/factor/blob/master/basis/bootstrap/image/upload/upload.factor#L47-L51
https://github.com/slavapestov/factor/blob/master/extra/graphviz/render/render.factor#L62-L67

Granted, all of those are building a single process, not a pipeline. But
the same principles apply:

: cmd1 ( -- str ) ... ;
: cmd2 ( -- str ) ... ;

[ cmd1 , cmd2 , ] { } make run-pipeline
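
For readers more familiar with other languages, the same shape (build the
descriptor list first, then run it) can be sketched in Python with a
hypothetical `run_pipeline` helper; this is a naive sketch of what a
pipeline runner does with string descriptors, not Factor's actual word:

```python
import subprocess

def run_pipeline(commands):
    """Chain the commands with OS pipes. Splits naively on
    whitespace; the last stage's stdout is captured and returned."""
    procs = []
    prev = None
    for command in commands:
        p = subprocess.Popen(
            command.split(),
            stdin=prev.stdout if prev else None,
            stdout=subprocess.PIPE,
        )
        if prev:
            prev.stdout.close()  # let the earlier stage see EOF/SIGPIPE
        procs.append(p)
        prev = p
    out, _ = procs[-1].communicate()
    for p in procs:
        p.wait()
    return out
```

Usage: `run_pipeline(["echo hello world", "tr a-z A-Z"])` wires the two
commands together just like `echo hello world | tr a-z A-Z` in a shell.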



Re: [Factor-talk] how to run-pipeline

2015-09-22 Thread Alex Vondrak
You might also consider implementing the >process method for your custom
<cmd1> / <cmd2> objects, which will be called automatically when invoking
run-pipeline.
http://docs.factorcode.org/content/word-__gt__process,io.launcher.html Or
just do away with the <cmd1> / <cmd2> objects altogether and build those as
strings/arrays with `make`, too.



[Factor-talk] how to run-pipeline

2015-09-21 Thread HP Wei
I want to run binary codes (C++) under linux using run-pipeline

In linux shell, the task is 

cmd1 -a arg1 -b arg2 | cmd2 -c arg3

I know in general, in factor, I need to construct

{ str1 str2 } run-pipeline
where str1 = "cmd1 -a arg1 -b arg2"
      str2 = "cmd2 -c arg3"
Ultimately, I may also insert some factor quot in between
str1 and str2 to do some processing before handing the
result to cmd2.


Here is what I envision:

TUPLE: cmd1 a b ;

: <cmd1> ( -- cmd1 )
    cmd1 new
    "default a" >>a
    "default b" >>b ;

: get-cmd1 ( cmd1 -- str1 )
    [ a>> ] [ b>> ] bi
    "cmd1 -a %s -b %s" sprintf ;

so now, I can write

<cmd1>
    my_b >>b
get-cmd1

Similarly for cmd2.

But I bump into a mental block when trying to 
put things together for run-pipeline

If there were just one cmd1 (without cmd2),
I thought I could do

${ <cmd1> my_b >>b get-cmd1 } run-pipeline

Adding cmd2, I could write

${ <cmd1> my_b >>b get-cmd1 <cmd2> my_c >>c get-cmd2 } run-pipeline

But this looks ugly.
Is there a simpler way?

Thanks
HP Wei


