I believe this should work (disclaimer: I haven't tried it).

Assuming you are running JES2, set up the DD with SPIN=(UNALLOC,CMNDONLY)
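
As a rough sketch, the DD in the started task's JCL might look like this
(the DD name here is just a placeholder, not from your system):

        //* Hypothetical sysout DD - spin only at unallocation or by command
        //STCLOG   DD SYSOUT=*,SPIN=(UNALLOC,CMNDONLY)

With CMNDONLY, the output is not spun on a timer; it stays with the
address space until an operator (or automation) spins it explicitly.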

Then, with automation, at the time you want to spin the output, issue the command:

        $TS ####,SPIN,DDNAME=ddname

Then have automation initiate the process to extract the spool data.

Note that this does not work with JES3.
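
For the extraction step, an untested sketch using the SDSF REXX
interface (the job-name prefix is a placeholder; see the SDSF Operation
and Customization manual for the full interface):

        /* REXX - hedged sketch: read spun sysout via the SDSF interface */
        rc = isfcalls('ON')                /* add SDSF host environment  */
        isfprefix = 'MYSTC*'               /* placeholder jobname filter */
        Address SDSF "ISFEXEC ST"          /* run the ST (status) panel  */
        do ix = 1 to isfrows               /* one row per job/STC        */
          /* SA action: allocate the job's sysout datasets for reading   */
          Address SDSF "ISFACT ST TOKEN('"TOKEN.ix"') PARM(NP SA)"
          do jx = 1 to isfddname.0         /* one DD per sysout dataset  */
            "EXECIO * DISKR" isfddname.jx "(STEM line. FINIS"
            /* ... append line. records to the target dataset here ...   */
          end
        end
        rc = isfcalls('OFF')

The append itself would be an ordinary ALLOC with DISP=MOD plus EXECIO
DISKW against the target dataset for that address space.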

--------------------------------------------------------------------------
Lionel B. Dyck 
Mainframe Systems Programmer - TRA
Enterprise Operations (Station 200) (005OP6.3.10)
Information and Technology, IT Operations and Services


-----Original Message-----
From: IBM Mainframe Discussion List [mailto:IBM-MAIN@LISTSERV.UA.EDU] On Behalf 
Of venkat kulkarni
Sent: Thursday, May 04, 2017 7:58 AM
To: IBM-MAIN@LISTSERV.UA.EDU
Subject: [EXTERNAL] Re: job output into dataset

Hello,

Thanks for the reply. You mentioned that you have a program which can be used
to extract data from various jobs. There are a couple of points I would like to make:



1)     Our requirement is to avoid space issues by cutting the records from
continuously running address spaces and putting them in a dataset.

2)     This process should run once a day; for whatever address spaces we
specify, it should cut the records from each address space and keep
appending them to the datasets we specify.

3)     For every address space, we will have a separate dataset to be
used or reviewed later.


On 03-May-2017 6:07 PM, "Barkow, Eileen" <ebar...@doitt.nyc.gov> wrote:

> SDSF has a REXX interface which is documented in the SDSF OPERATION 
> and CUSTOMIZATION manual.
>
> I have a clist that extracts certain SYSOUT datasets from various
> jobs on the output queue and writes them out to datasets.
> I can send you the routine if that would help.
>
> -----Original Message-----
> From: IBM Mainframe Discussion List [mailto:IBM-MAIN@LISTSERV.UA.EDU] 
> On Behalf Of venkat kulkarni
> Sent: Wednesday, May 03, 2017 7:14 AM
> To: IBM-MAIN@LISTSERV.UA.EDU
> Subject: Re: job output into dataset
>
> Thanks for all the suggestions. Currently, we don’t have any third-party
> product for doing this task. So my idea is to create a REXX program
> which keeps checking (once a day) particular address spaces, like IMS,
> CICS, or any others which run continuously in the system, and finds
> whether the lines in the job exceed the limit we set in the REXX
> program. If so, the program should cut those messages from that address
> space and save them in a separate dataset we specify in the REXX
> program. The next time the program finds more messages in the address
> space, it should again cut the messages from the address space and
> append them to the dataset used earlier.
>
>
>
> So, for the different address spaces (CICS, IMS, DB2, etc.), I would
> like to use a different dataset for each and keep appending messages at
> regular intervals.
>
>
>
> As I am not an expert in REXX, can anybody help me with this task?
>
>
>
> Regards
>
> On 03-May-2017 12:40 AM, "Lizette Koehler" <stars...@mindspring.com>
> wrote:
>
> > So it might be possible that you are seeing data that JES2 has buffered.
> > That is why you can see all of the data on a job in the ST panel 
> > even if all that is left in O or H panels is the output from an IEBGENER.
> >
> > Lizette
> >
> > -----Original Message-----
> > >From: "van der Grijn, Bart (B)" <bvandergr...@dow.com>
> > >Sent: May 2, 2017 2:26 PM
> > >To: IBM-MAIN@LISTSERV.UA.EDU
> > >Subject: Re: job output into dataset
> > >
> > >It works pretty well except for one (minor) issue. When you do SE
> > >against a DDNAME in SDSF it doesn't work as expected. It seems to
> > >display the first of the outputs with the same DDNAME, independent of
> > >which one you select. At least, that's what it does here (z/OS 2.1).
> > >Bart
> > >
> > >-----Original Message-----
> > >From: IBM Mainframe Discussion List [mailto:IBM-MAIN@LISTSERV.UA.EDU]
> > >On Behalf Of Phil Sidler
> > >Sent: Tuesday, May 02, 2017 4:16 PM
> > >To: IBM-MAIN@LISTSERV.UA.EDU
> > >Subject: Re: job output into dataset
> > >
> > >On Tue, 2 May 2017 12:54:35 -0700, Lizette Koehler
> > ><stars...@mindspring.com> wrote:
> > >
> > >>I have this in place on several Products I support. The JES Messages
> > >>all go to our output repository. No issues.
> > >
> > >Oh, I see it now: $HASP138 and a new DD with the same name shows up.
> > >Cool.
> > >
> >
> > ----------------------------------------------------------------------
> > For IBM-MAIN subscribe / signoff / archive access instructions,
> > send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
> >
>
>
>
>
>
>
>


