Re: bash question.

2012-04-13 Thread Hendrik Brueckner
Hi John,

On Thu, Apr 12, 2012 at 10:42:33PM -0500, John McKown wrote:
 bash has variables, such as $PATH and $HOME and maybe even $i. If a
 variable has been the subject of an export command, you find all of them
 which are export'd using the printenv command. But is there some way to
 find the ones which exist, but have not been export'd?

 No, I guess I don't have a real need for this. But I'm curious.

You can use declare -p.  Exported variables are displayed with -x,
for example: declare -x EDITOR=vim

Take a look at help declare for other options like if a variable is an
array...
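
For instance, one way to list only the variables that have not been
exported is to filter on the flags column of the declare -p output (an
untested sketch; it assumes the usual declare -- NAME=value vs.
declare -x NAME=value output format, and that awk is available):

declare -p | awk '$1 == "declare" && $2 !~ /x/'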



Re: bash question.

2012-04-13 Thread McKown, John
I must learn how to read the manuals. Yes, I did do an info bash. It's nice 
and easy for finding out what a command does, if you already know the command. But it is 
difficult, at least for me, to find a command based on what it does.

-- 
John McKown 
Systems Engineer IV
IT


 -Original Message-
 From: Linux on 390 Port [mailto:LINUX-390@VM.MARIST.EDU] On 
 Behalf Of Mark Post
 Sent: Thursday, April 12, 2012 10:48 PM
 To: LINUX-390@VM.MARIST.EDU
 Subject: Re: bash question.
 
  On 4/12/2012 at 11:42 PM, John McKown joa...@swbell.net wrote: 
  bash has variables, such as $PATH and $HOME and maybe even $i. If a
  variable has been the subject of an export command, you 
 find all of them
  which are export'd using the printenv command. But is there 
 some way to
  find the ones which exist, but have not been export'd?
 
 I use env or set.
 
 
 Mark Post
 


Re: bash question.

2012-04-13 Thread McKown, John
Very nice! Thanks. I guess that I'm going to end up dedicating a weekend day to 
just read the entire output from info bash. Luckily, I can create a text file 
from it, convert it to PDF format, then read the PDF directly on my Kindle DX 
or Android tablet. I really don't like reading manuals on my PC screen 
because it is landscape instead of portrait. 
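
In case it helps, the text-file step can likely be done in one go with
the standalone info reader (a sketch; check info's own documentation for
the exact flags on your system):

info --subnodes -o bash.txt bash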

-- 
John McKown 
Systems Engineer IV
IT


 

 -Original Message-
 From: Linux on 390 Port [mailto:LINUX-390@VM.MARIST.EDU] On 
 Behalf Of Hendrik Brueckner
 Sent: Friday, April 13, 2012 3:47 AM
 To: LINUX-390@VM.MARIST.EDU
 Subject: Re: bash question.
 
 Hi John,
 
 On Thu, Apr 12, 2012 at 10:42:33PM -0500, John McKown wrote:
  bash has variables, such as $PATH and $HOME and maybe even $i. If a
  variable has been the subject of an export command, you 
 find all of them
  which are export'd using the printenv command. But is there 
 some way to
  find the ones which exist, but have not been export'd?
 
  No, I guess I don't have a real need for this. But I'm curious.
 
 You can use declare -p.  Exported variables are displayed with -x,
 for example: declare -x EDITOR=vim
 
 Take a look at help declare for other options like if a 
 variable is an
 array...
 


Re: bash question.

2012-04-13 Thread Malcolm Beattie
McKown, John writes:
 Very nice! Thanks. I guess that I'm going to end up dedicating a
 weekend day to just read the entire output from info bash. Luckily,
 I can create a text file from it, convert it to PDF format, then
 read the PDF directly on my Kindle DX or Android tablet.

In case you weren't aware of it already, the utilities used to
process the *roff macros used in man pages support typesetting to
PostScript as well as generating simple text output. So typing
man -t bash > bash-man.ps
will generate you a nicely formatted PostScript version of the
man page in bash-man.ps, fancy fonts and all, instead of what you'd
get from just taking the text version. That's suitable for direct
printing, but you can instead just
  ps2pdf bash-man.ps
to produce your bash-man.pdf PDF version.
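
If your ps2pdf accepts - for standard input (recent Ghostscript
versions do), you can probably combine the two steps into a single
pipeline; treat this as an untested sketch:

man -t bash | ps2pdf - bash-man.pdf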

Using info bash instead of man bash uses a slightly different
source of documentation (the FSF document their own programs in their
own GNU info format instead of man pages) but you'll nearly always
find that some nice people have already ensured that your distro has
man pages for the programs as well and that they have either exactly
the same content or are close enough for most purposes. There are
ways of generating various typeset-like formats from info format too
but I forget what they are and I don't think they are as simple as
just adding -t to your man command invocation.

--Malcolm

--
Malcolm Beattie
Mainframe Systems and Software Business, Europe
IBM UK



Re: bash question.

2012-04-13 Thread McKown, John
Oh, that is beautiful! Thank you very much.

--
John McKown 
Systems Engineer IV
IT


 -Original Message-
 From: Linux on 390 Port [mailto:LINUX-390@VM.MARIST.EDU] On 
 Behalf Of Malcolm Beattie
 Sent: Friday, April 13, 2012 11:01 AM
 To: LINUX-390@VM.MARIST.EDU
 Subject: Re: bash question.
snip
 In case you weren't aware of it already, the utilities used to
 process the *roff macros used in man pages support typesetting to
 PostScript as well as generating simple text output. So typing
 man -t bash > bash-man.ps
 will generate you a nicely formatted PostScript version of the
 man page in bash-man.ps, fancy fonts and all, instead of what you'd
 get from just taking the text version. That's suitable for direct
 printing, but you can instead just
   ps2pdf bash-man.ps
 to produce your bash-man.pdf PDF version.
 
 Using info bash instead of man bash uses a slightly different
 source of documentation (the FSF document their own programs in their
 own GNU info format instead of man pages) but you'll nearly always
 find that some nice people have already ensured that your distro has
 man pages for the programs as well and that they have either exactly
 the same content or are close enough for most purposes. There are
 ways of generating various typeset-like formats from info format too
 but I forget what they are and I don't think they are as simple as
 just adding -t to your man command invocation.
 
 --Malcolm
 
 --
 Malcolm Beattie



bash question.

2012-04-12 Thread John McKown
bash has variables, such as $PATH and $HOME and maybe even $i. If a
variable has been the subject of an export command, you find all of them
which are export'd using the printenv command. But is there some way to
find the ones which exist, but have not been export'd?

No, I guess I don't have a real need for this. But I'm curious.

--
John McKown
Maranatha! 



Re: bash question.

2012-04-12 Thread Mark Post
 On 4/12/2012 at 11:42 PM, John McKown joa...@swbell.net wrote: 
 bash has variables, such as $PATH and $HOME and maybe even $i. If a
 variable has been the subject of an export command, you find all of them
 which are export'd using the printenv command. But is there some way to
 find the ones which exist, but have not been export'd?

I use env or set.
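
For the curious: set lists everything the shell knows (in POSIX mode it
omits function bodies), while env lists only the exported part, so one
rough way to see just the non-exported names is something like this
sketch (the two listings quote values differently, so expect a little
noise in the output):

comm -23 <(set -o posix; set | sort) <(env | sort)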


Mark Post



BASH question - may even be advanced - pipe stdout to 2 or more processes.

2011-02-09 Thread McKown, John
Yeah, it sounds weird. What I have is 72 files containing a lot of security data 
from our z/OS RACF system. To save space, all these files are bzip2'ed - each 
individually. I am writing some Perl scripts to process this data. The Perl 
script basically reformats the data in such a way that I can put it easily into 
a PostgreSQL database. If I want to run each Perl script individually, it is 
simple:

bzcat data*bz2 | perl script1.pl | psql database
bzcat data*bz2 | perl script2.pl | psql database

and so on. I don't want to try to merge the scripts together into a single, 
complicated, script. I like what I have in that regard. But I don't like 
running the bzcat twice to feed into each Perl script. Is something like the 
following possible?

mkfifo script1.fifo
mkfifo script2.fifo
bzcat data*bz2 | tee script1.fifo script2.fifo &
perl script1.pl <script1.fifo &
perl script2.pl <script2.fifo &

???

What about more than two scripts concurrently? What about n scripts?

John McKown
Systems Engineer IV
IT





Re: BASH question - may even be advanced - pipe stdout to 2 or more processes.

2011-02-09 Thread Larry Ploetz
If you have a new enough bash, you can:

bzcat data*bz2 | tee >(process1) >(process2) >(process3) ... | processn

but the stdout streams from process1..n get intermixed unless redirected to files.
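
Applied to the pipeline in question, that might look like this (a
sketch; database stands in for the real psql arguments):

bzcat data*bz2 | tee >(perl script1.pl | psql database) >(perl script2.pl | psql database) | perl script3.pl | psql database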

(Tom Meyer taught me that!)

- Larry

On 2/9/11 12:19 PM, McKown, John wrote:
 Yeah, it sounds weird. What I have is 72 files containing a lot of security 
 data from our z/OS RACF system. To save space, all these files are bzip2'ed - 
 each individually. I am writing some Perl scripts to process this data. The 
 Perl script basically reformats the data in such a way that I can put it 
 easily into a PostgreSQL database. If I want to run each Perl script 
 individually, it is simple:

 bzcat data*bz2 | perl script1.pl | psql database
 bzcat data*bz2 | perl script2.pl | psql database

 and so on. I don't want to try to merge the scripts together into a single, 
 complicated, script. I like what I have in that regard. But I don't like 
 running the bzcat twice to feed into each Perl script. Is something like the 
 following possible?

 mkfifo script1.fifo
 mkfifo script2.fifo
 bzcat data*bz2 | tee script1.fifo script2.fifo &
 perl script1.pl <script1.fifo &
 perl script2.pl <script2.fifo &

 ???

 What about more than two scripts concurrently? What about n scripts?

 John McKown
 Systems Engineer IV
 IT




Re: BASH question - may even be advanced - pipe stdout to 2 or more processes.

2011-02-09 Thread McKown, John
tee can output to multiple files? The man page implies only a single file. So I 
guess I could:

bzcat data*bz2| tee >(perl script1.pl|psql database)|perl script2.pl|psql 
database

bzcat data*bz2| tee >(perl script1.pl|psql database)|tee >(perl script2.pl|psql 
database)|perl script3.pl|psql database

for the two-script case. I also guess that I might be able to cascade tee processes. But 
that's getting nasty, too.

--
John McKown 
Systems Engineer IV
IT


 

 -Original Message-
 From: Larry Ploetz [mailto:la...@stanford.edu] 
 Sent: Wednesday, February 09, 2011 2:31 PM
 To: Linux on 390 Port
 Cc: McKown, John
 Subject: Re: BASH question - may even be advanced - pipe 
 stdout to 2 or more processes.
 
 If you have a new enough bash, you can:
 
 bzcat data*bz2 | tee >(process1) >(process2) >(process3) ... 
 | processn
 
 but the stdout streams from process1..n get intermixed unless 
 redirected to files.
 
 (Tom Meyer taught me that!)
 
 - Larry
 
 On 2/9/11 12:19 PM, McKown, John wrote:
  Yeah, it sounds weird. What I have is 72 files containing a 
 lot of security data from our z/OS RACF system. To save space, 
 all these files are bzip2'ed - each individually. I am 
 writing some Perl scripts to process this data. The Perl 
 script basically reformats the data in such a way that I can 
 put it easily into a PostgreSQL database. If I want to run 
 each Perl script individually, it is simple:
 
  bzcat data*bz2 | perl script1.pl | psql database
  bzcat data*bz2 | perl script2.pl | psql database
 
  and so on. I don't want to try to merge the scripts 
 together into a single, complicated, script. I like what I 
 have in that regard. But I don't like running the bzcat twice 
 to feed into each Perl script. Is something like the following 
 possible?
 
  mkfifo script1.fifo
  mkfifo script2.fifo
  bzcat data*bz2 | tee script1.fifo script2.fifo &
  perl script1.pl <script1.fifo &
  perl script2.pl <script2.fifo &
 
  ???
 
  What about more than two scripts concurrently? What about 
 n scripts?
 
  John McKown
  Systems Engineer IV
  IT
 
 
 
 


Re: BASH question - may even be advanced - pipe stdout to 2 or more processes.

2011-02-09 Thread Larry Ploetz
On 2/9/11 12:40 PM, McKown, John wrote:
 tee can output to multiple files? The man page implies only a single file.

Hmmm...maybe you need a new enough tee also:

SYNOPSIS
   tee [OPTION]... [FILE]...

DESCRIPTION
   Copy standard input to each FILE, and also to standard output.


 So I guess I could:

 bzcat data*bz2| tee >(perl script1.pl|psql database)|perl script2.pl|psql 
 database

 bzcat data*bz2| tee >(perl script1.pl|psql database)|tee >(perl 
 script2.pl|psql database)|perl script3.pl|psql database

Yep, those should work also. The named pipe would work too, if you did 
something like:

mkfifo p1 p2 p3

bzcat data*bz2 | tee p1 | tee p2 > p3 &

perl script1.pl < p1 &
perl script2.pl < p2 &
perl script3.pl < p3 &

(Make sure to background the bzcat or run it in another terminal.)
 for the two-script case. I also guess that I might be able to cascade tee processes. 
 But that's getting nasty, too.

 --
 John McKown
 Systems Engineer IV
 IT

- Larry



Re: BASH question - may even be advanced - pipe stdout to 2 or more processes.

2011-02-09 Thread Edmund R. MacKenty
On Wednesday, February 09, 2011 03:19:03 pm you wrote:
 Yeah, it sounds weird. What I have is 72 files containing a lot of security
 data from our z/OS RACF system. To save space, all these files are
 bzip2'ed - each individually. I am writing some Perl scripts to process
 this data. The Perl script basically reformats the data in such a way that
 I can put it easily into a PostgreSQL database. If I want to run each Perl
 script individually, it is simple:

 bzcat data*bz2 | perl script1.pl | psql database
 bzcat data*bz2 | perl script2.pl | psql database

 and so on. I don't want to try to merge the scripts together into a single,
 complicated, script. I like what I have in that regard. But I don't like
 running the bzcat twice to feed into each Perl script. Is something like
 the following possible?

 mkfifo script1.fifo
 mkfifo script2.fifo
 bzcat data*bz2 | tee script1.fifo script2.fifo &
 perl script1.pl <script1.fifo &
 perl script2.pl <script2.fifo &

 ???

 What about more than two scripts concurrently? What about n scripts?

Using tee is the right approach, and the above should work OK.  Solving this
problem for N outputs is a bit trickier, because you have to have something
that copies its input N times.  That could be done with a shell loop.  Here's
a function that copies its stdin to each of the files named on its command
line:

Ntee() {
    while read line; do
        for file; do
            echo $line >> $file
        done
    done
}

Well, that does it, but it is opening each file and seeking to its end for
each line of input, and that's pretty inefficient.  What we'd like to  do is
keep the files open.  Something like this might do it, but I haven't tested
it:

Ntee() {
    fd=3
    for file; do
        eval $fd'>$file'
        fd=$((fd + 1))
    done
    while read line; do
        fd=3
        for file; do
            eval 'echo $line 1>&'$fd
            fd=$((fd + 1))
        done
    done
}

The first for-loop opens all the files and assigns file descriptors to them,
and the second for-loop writes to those open file descriptors.  The eval is
used to expand the $fd (the rest of the command is protected from evaluation
by single-quotes) because the file-redirection syntax requires a number.  So,
for example, the first time around the first loop, the command:
3>$file
is what gets executed.

I haven't tried to run this, but the idea might help.
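
For what it's worth, a variant that should actually keep the
descriptors open wraps an exec inside the eval (still a sketch in the
same spirit, untested on the list; assumes bash, and the names are just
illustrative):

Ntee() {
    local fd=3 file line
    for file; do
        eval "exec $fd>\"\$file\""         # open $file for writing on descriptor $fd
        fd=$((fd + 1))
    done
    while IFS= read -r line; do
        fd=3
        for file; do
            printf '%s\n' "$line" >&"$fd"  # write to the already-open descriptor
            fd=$((fd + 1))
        done
    done
}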
- MacK.
-
Edmund R. MacKenty
Software Architect
Rocket Software
275 Grove Street  -  Newton, MA 02466-2272  -  USA
Tel: +1.617.614.4321
Email: m...@rs.com
Web: www.rocketsoftware.com



Re: BASH question - may even be advanced - pipe stdout to 2 or more processes.

2011-02-09 Thread McKown, John
No, it's me. I didn't interpret [FILE]... as more than one file. My bad.

I think this will handle what I want to do. Many thanks!

--
John McKown 
Systems Engineer IV
IT


 

 -Original Message-
 From: Linux on 390 Port [mailto:LINUX-390@VM.MARIST.EDU] On 
 Behalf Of Larry Ploetz
 Sent: Wednesday, February 09, 2011 2:48 PM
 To: LINUX-390@VM.MARIST.EDU
 Subject: Re: BASH question - may even be advanced - pipe 
 stdout to 2 or more processes.
 
 On 2/9/11 12:40 PM, McKown, John wrote:
  tee can output to multiple files? The man page implies only 
 a single file.
 
 Hmmm...maybe you need a new enough tee also:
 
 SYNOPSIS
tee [OPTION]... [FILE]...
 
 DESCRIPTION
Copy standard input to each FILE, and also to standard output.
 
 
  So I guess I could:
 
  bzcat data*bz2| tee >(perl script1.pl|psql database)|perl 
 script2.pl|psql database
 
  bzcat data*bz2| tee >(perl script1.pl|psql database)|tee 
 >(perl script2.pl|psql database)|perl script3.pl|psql database
 
 Yep, those should work also. The named pipe would work too, 
 if you did something like:
 
 mkfifo p1 p2 p3
 
 bzcat data*bz2 | tee p1 | tee p2 > p3 &
 
 perl script1.pl < p1 &
 perl script2.pl < p2 &
 perl script3.pl < p3 &
 
 (Make sure to background the bzcat or run it in another terminal.)
  for the two-script case. I also guess that I might be able to 
 cascade tee processes. But that's getting nasty, too.
 
  --
  John McKown
  Systems Engineer IV
  IT
 
 - Larry
 


Re: BASH question - may even be advanced - pipe stdout to 2 or more processes.

2011-02-09 Thread Edmund R. MacKenty
On Wednesday, February 09, 2011 03:47:38 pm you wrote:
 On 2/9/11 12:40 PM, McKown, John wrote:
  tee can output to multiple files? The man page implies only a single
  file.

 Hmmm...maybe you need a new enough tee also:

 SYNOPSIS
tee [OPTION]... [FILE]...

 DESCRIPTION
Copy standard input to each FILE, and also to standard output.

Doh!  I should have remembered that.  So the functions I wrote could have been
implemented as:

Ntee() {
    tee "$@" >/dev/null
}

Just goes to show that there's usually several ways to do anything in Linux.
I focused on doing it entirely in bash.
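
With that version, the fifo setup from earlier in the thread could
plausibly be driven as (a sketch; start the readers first so the fifo
writes don't block):

mkfifo script1.fifo script2.fifo
perl script1.pl <script1.fifo &
perl script2.pl <script2.fifo &
bzcat data*bz2 | Ntee script1.fifo script2.fifo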
- MacK.
-
Edmund R. MacKenty
Software Architect
Rocket Software
275 Grove Street  -  Newton, MA 02466-2272  -  USA
Tel: +1.617.614.4321
Email: m...@rs.com
Web: www.rocketsoftware.com



Re: BASH question - may even be advanced - pipe stdout to 2 or more processes.

2011-02-09 Thread Larry Ploetz
On 2/9/11 1:08 PM, Edmund R. MacKenty wrote:

 Doh!  I should have remembered that.  So the functions I wrote could have been
 implemented as:

  Ntee() {
      tee "$@" >/dev/null
  }

 Just goes to show that there's usually several ways to do anything in Linux.
 I focused on doing it entirely in bash.
   - MacK.

Which is vaguely related to the perl mentality of never making a system call 
and always finding
and using, or writing your own, module (that also doesn't make a system call). 
I'm sure perl has
tee-like functions (actually, I remember a perl script named tee2 that did 
write to multiple
output files), versus the more traditional (?) unix mentality of using many 
small,
does-one-function-and-does-it-well utilities available in the environment, tied 
together perhaps
with a shell script. I prefer the latter, but I can understand the portability 
issues and the
perl-only solution to them.

- Larry



Re: bash question.

2009-01-11 Thread John Summerfield

John McKown wrote:

Is there any better way, in a bash script, to pipe both stdout and stderr
from an application other than using a subshell? So far the only way that
I've thought of to do it is:

(command parm1 ... 2>&1) | othercommand


The parentheses don't do anything useful. Here, stderr is discarded (by
cat) in the first example, but not in the second.

01:46 [sum...@numbat ~]$ find ~ -size +400c 2>&1 | cat >/dev/null
01:47 [sum...@numbat ~]$ find ~ -size +400c | cat >/dev/null
find: /home/summer/newstocks: Permission denied
find: /home/summer/mystocks: Permission denied
find: /home/summer/stocks: Permission denied
01:47 [sum...@numbat ~]$



--

Cheers
John

-- spambait
1...@coco.merseine.nu  z1...@coco.merseine.nu
-- Advice
http://webfoot.com/advice/email.top.php
http://www.catb.org/~esr/faqs/smart-questions.html
http://support.microsoft.com/kb/555375

You cannot reply off-list:-)



bash question.

2009-01-08 Thread John McKown
Is there any better way, in a bash script, to pipe both stdout and stderr
from an application other than using a subshell? So far the only way that
I've thought of to do it is:

(command parm1 ... 2>&1) | othercommand

--
Q: What do theoretical physicists drink beer from?
A: Ein Stein.

Maranatha!
John McKown



Re: bash question.

2009-01-08 Thread Mark Post
 On 1/8/2009 at 12:36 PM, John McKown joa...@swbell.net wrote: 
 Is there any better way, in a bash script, to pipe both stdout and stderr
 from an application other than using a subshell? So far the only way that
 I've thought of to do it is:
 
 (command parm1 ... 2>&1) | othercommand

Just:
command parm1 2>&1 | othercommand

should work.  If it doesn't for you, can you provide a specific example?
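
(Newer versions of bash, 4.0 and later, also accept |& as a shorthand
for 2>&1 |, so command parm1 |& othercommand should behave the same.)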


Mark Post



Re: bash question.

2009-01-08 Thread John McKown
On Thu, 8 Jan 2009, Mark Post wrote:

  On 1/8/2009 at 12:36 PM, John McKown joa...@swbell.net wrote:
  Is there any better way, in a bash script, to pipe both stdout and stderr
  from an application other than using a subshell? So far the only way that
  I've thought of to do it is:
 
  (command parm1 ... 2>&1) | othercommand

 Just:
 command parm1 2>&1 | othercommand

 should work.  If it doesn't for you, can you provide a specific example?

 Mark Post

Well, shoot. That never even occurred to me. What I thought that would do
was:

Change stderr to go where stdout currently goes, then change stdout to go
into the pipe. I based this on the fact that if I do:

command 2>&1 1>x.tmp

Then stderr still comes to my terminal. It does not go to x.tmp.  I guess
there is some special code in bash to recognize the redirection & piping
as special.

--
Q: What do theoretical physicists drink beer from?
A: Ein Stein.

Maranatha!
John McKown



Re: bash question.

2009-01-08 Thread Edmund R. MacKenty
On Thursday 08 January 2009 14:13, John McKown wrote:
Well, shoot. That never even occurred to me. What I thought that would do
was:

Change stderr to go where stdout currently goes, then change stdout to go
into the pipe. I based this on the fact that if I do:

command 2>&1 1>x.tmp

Then stderr still comes to my terminal. It does not go to x.tmp.  I guess
there is some special code in bash to recognize the redirection & piping
as special.

Actually, there's no special case for this.  The rule is that the shell 
processes I/O redirections left-to-right.  The 2>&1 syntax just means make 
file descriptor 2 (stderr) refer to whatever file descriptor 1 (stdout) 
refers to.  It doesn't change stdout at all.  File descriptor 1 already 
refers to the pipe because the shell creates the pipes as it is parsing the 
pipeline, before it parses the simple commands within the pipeline.
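
A quick way to see the ordering rule in action (a sketch; ls and the
paths are just placeholders):

ls . /nonexistent 2>&1 | wc -l       # stderr joins stdout inside the pipe
ls . /nonexistent 2>&1 1>out.txt     # stderr stays on the terminal; the listing goes to out.txt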

I hope that makes sense! :-)
- MacK.
-
Edmund R. MacKenty
Software Architect
Rocket Software
275 Grove Street · Newton, MA 02466-2272 · USA
Tel: +1.617.614.4321
Email: m...@rs.com
Web: www.rocketsoftware.com  



Re: bash question.

2009-01-08 Thread Erik N Johnson
Thus, if you want the behaviour you described in the above command,
you could do the following:

command parms 2>&1 | cat > file.tmp

which would put everything in the file file.tmp

Erik Johnson

On Thu, Jan 8, 2009 at 1:30 PM, Edmund R. MacKenty
ed.macke...@rocketsoftware.com wrote:
 On Thursday 08 January 2009 14:13, John McKown wrote:
Well, shoot. That never even occurred to me. What I thought that would do
was:

Change stderr to go where stdout currently goes, then change stdout to go
into the pipe. I based this on the fact that if I do:

 command 2>&1 1>x.tmp

Then stderr still comes to my terminal. It does not go to x.tmp.  I guess
 there is some special code in bash to recognize the redirection & piping
as special.

 Actually, there's no special case for this.  The rule is that the shell
  processes I/O redirections left-to-right.  The 2>&1 syntax just means make
 file descriptor 2 (stderr) refer to whatever file descriptor 1 (stdout)
 refers to.  It doesn't change stdout at all.  File descriptor 1 already
 refers to the pipe because the shell creates the pipes as it is parsing the
 pipeline, before it parses the simple commands within the pipeline.

 I hope that makes sense! :-)
- MacK.
 -
 Edmund R. MacKenty
 Software Architect
 Rocket Software
 275 Grove Street · Newton, MA 02466-2272 · USA
 Tel: +1.617.614.4321
 Email: m...@rs.com
 Web: www.rocketsoftware.com





Re: bash question.

2009-01-08 Thread Larry Ploetz

On 1/8/09 9:36 AM, John McKown wrote:

Is there any better way, in a bash script, to pipe both stdout and stderr
from an application other than using a subshell? So far the only way that
I've thought of to do it is:

(command parm1 ... 2>&1) | othercommand



Others have answered the question, but in case you want to know -- the
technique (not the syntax!) you're using with the subshell is required
in csh and derivative shells (e.g., tcsh). From the tcsh man page:

    The shell cannot presently redirect diagnostic output without also
    redirecting standard output, but `(command > output-file) >& error-
    file' is often an acceptable workaround.  Either output-file or error-
    file may be `/dev/tty' to send output to the terminal.

You may have picked up the habit from too much exposure to Solaris? ;-)

- Larry



Re: bash question.

2009-01-08 Thread John McKown
On Thu, 8 Jan 2009, Edmund R. MacKenty wrote:

 On Thursday 08 January 2009 14:13, John McKown wrote:
 Well, shoot. That never even occurred to me. What I thought that would do
 was:
 
 Change stderr to go where stdout currently goes, then change stdout to go
 into the pipe. I based this on the fact that if I do:
 
  command 2>&1 1>x.tmp
 
 Then stderr still comes to my terminal. It does not go to x.tmp.  I guess
  there is some special code in bash to recognize the redirection & piping
 as special.

 Actually, there's no special case for this.  The rule is that the shell
 processes I/O redirections left-to-right.  The 21 syntax just means make
 file descriptor 2 (stderr) refer to whatever file descriptor 1 (stdout)
 refers to.  It doesn't change stdout at all.  File descriptor 1 already
 refers to the pipe because the shell creates the pipes as it is parsing the
 pipeline, before it parses the simple commands within the pipeline.

 I hope that makes sense! :-)
   - MacK.

Ah! The little light goes on. That makes perfect sense. I was simply
parsing the command from left to right, so that the redirection happened
before the pipe, in my mind.

--
Q: What do theoretical physicists drink beer from?
A: Ein Stein.

Maranatha!
John McKown
