Re: Large SVCDUMPS

2011-11-01 Thread Longnecker, Dennis
Thanks, all. That will teach me for not keeping up with JCL parameters; I didn't
know DSNTYPE=LARGE was actually a JCL option!

Putting
//  DSNTYPE=LARGE, 
in my allocation did the trick!
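For anyone searching the archives later, an allocation along these lines should
do it (the dataset name, volume, and space numbers below are only illustrative):

//ALLOC    EXEC PGM=IEFBR14
//NEWDUMP  DD  DSN=SYS1.DUMP01,            NAME IS ILLUSTRATIVE
//             DISP=(NEW,CATLG),
//             UNIT=3390,VOL=SER=DMP001,   VOLSER IS ILLUSTRATIVE
//             SPACE=(CYL,(9000),,CONTIG), SIZE IS ILLUSTRATIVE
//             DSNTYPE=LARGE

If you pre-allocate SYS1.DUMPxx data sets this way, a DUMPDS ADD,DSN=01
afterwards should make the new data set available to dump processing.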

Dennis

-----Original Message-----
From: IBM Mainframe Discussion List [mailto:IBM-MAIN@bama.ua.edu] On Behalf Of 
Longnecker, Dennis
Sent: Monday, October 31, 2011 10:04 AM
To: IBM-MAIN@bama.ua.edu
Subject: Large SVCDUMPS

Can someone point me to the right place to find the documentation on how to
create large SVCDUMP datasets? I need to create a large dump dataset so I can
take a console dump of DB2. I keep getting partial dumps. I see in the messages
that the maximum size of a dataset is 65,535 tracks, which for a 3390 is 4,369
cylinders. In the z/OS MVS Diagnosis: Tools and Service Aids manual they say
"IBM recommends using extended format sequential data sets as dump data sets
for SVC dumps ... and they can hold 128 gigabytes." But the manual doesn't give
any clues on how to create them.

IBM APAR II06335 also talks about the subject: "Note: With DFSMS120 and Dynamic
Dump Allocation, multi-volume EFDS format datasets can be created for your
SVCDUMP." But it doesn't give any clues on how to create such a beast either.

I've tried creating the SYS1.DUMP dataset on a mod 9 with the CONTIG option,
but it stops at 65,535 tracks.

Any pointers appreciated.

Dennis




Re: Large SVCDUMPS

2011-11-01 Thread Shmuel Metz (Seymour J.)
In <351fdc6ace57d14b94beb5529e85df8e91b04b7...@exchmail1.courts.wa.gov>,
on 10/31/2011 at 10:04 AM, "Longnecker, Dennis" said:

>Can someone point me to the right place to find the documentation on
>how to create large SVCDUMP datasets?

SMS. I don't know whether SDUMP supports EAV, but old-fashioned
extended format is good enough for a mod 9.
 
-- 
 Shmuel (Seymour J.) Metz, SysProg and JOAT
 ISO position; see  
We don't care. We don't have to care, we're Congress.
(S877: The Shut up and Eat Your spam act of 2003)



Re: Large SVCDUMPS

2011-10-31 Thread Jim Mulder
> On 10/31/11 13:04, Longnecker, Dennis wrote:
> > Can someone point me to the right place to find the documentation
> > on how to create large SVCDUMP datasets? [snip]
> >
> You specify whether a dataset is allocated in extended format in its
> data class. I don't know offhand if SVCDUMP processing supports
> DSNTYPE=LARGE, but if so that would be another option.
> 
> -- 
> Mark Jacobs
> Time Customer Service
> Tampa, FL

  In z/OS 1.13, SDUMP specifies the DSNTYPE=LARGE text unit on
its SVC 99 parameters if the dump data set needs to be 
larger than 64K tracks. 

  Prior to z/OS 1.13, you can use Extended Format Sequential 
data sets for large dumps.  Such data sets must be SMS managed,
and you would make the request for Extended Format in your 
SMS ACS routines. 
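To make that concrete, the data class piece of the ACS routine might look
roughly like this sketch; the filter patterns and the DCDUMP class name are
made up, and DCDUMP itself would be defined in ISMF with a Data Set Name Type
of EXTENDED:

PROC DATACLAS
  FILTLIST DUMPDSN INCLUDE(SYS1.DUMP*.**,SYS0.DUMP*.**)  /* illustrative */
  IF &DSN = &DUMPDSN THEN
    SET &DATACLAS = 'DCDUMP'   /* data class defined as extended format */
END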
 
Jim Mulder   z/OS System Test   IBM Corp.  Poughkeepsie,  NY



Re: Large SVCDUMPS

2011-10-31 Thread Bobbie Justice
I take it you've seen this already as well. 

from: II14016 


  Another message to be aware of is MSGIEA043I MAXSPACE REACHED.
  This indicates a PARTIAL dump. At minimum, set MAXSPACE on the DB2
  using system to a reasonable level; in MVS commands:
     DISPLAY  : D D,OPTIONS
     CHNGDUMP : CD SET,SDUMP,TYPE=XMEME,MAXSPACE=8000M

  MAXSPACE=8000M is the minimum for DB2 for z/OS V8 and later.

  Note: See II06471: DUMPSRV uses AUX storage for dumping; you
        may need to add an extra PAGE dataset when dumping DBM1.
  Note: Allocate a hi-capacity device like a 3390 mod 9 for dumps.
        *** use ACS routines for size ***
  Note: With DFSMS120 and Dynamic Dump Allocation, multi-volume
        EFDS format datasets can be created for your SVCDUMP.
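On the extra PAGE dataset point, the mechanics are roughly as follows; the
dataset name, size, and volser are only placeholders:

//DEFPAGE  EXEC PGM=IDCAMS
//SYSPRINT DD  SYSOUT=*
//SYSIN    DD  *
  /* name, size, and volume below are illustrative */
  DEFINE PAGESPACE(NAME(SYS1.PAGEL03) -
         CYLINDERS(3000) -
         VOLUME(PAGE03))
/*

Then bring it online with PAGEADD PAGE=SYS1.PAGEL03, and add it to the PAGE=
list in IEASYSxx if it should stay defined across IPLs.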



Re: Large SVCDUMPS

2011-10-31 Thread Jim Thomas
Sir,

Try DSNTYPE=LARGE.


Kind Regards

Jim Thomas
617-233-4130 (mobile)
636-294-1014(res)
j...@thethomasresidence.us (Email)

-----Original Message-----
From: IBM Mainframe Discussion List [mailto:IBM-MAIN@bama.ua.edu] On Behalf
Of Norbert Friemel
Sent: Monday, October 31, 2011 12:29 PM
To: IBM-MAIN@bama.ua.edu
Subject: Re: Large SVCDUMPS

On Mon, 31 Oct 2011 10:04:01 -0700, Longnecker, Dennis wrote:

>Can someone point me to the right place to find the documentation on how to
>create large SVCDUMP datasets? [snip]
>


http://publibz.boulder.ibm.com/cgi-bin/bookmgr_OS390/BOOKS/DGT2D490/3.6.10.2

DSNTYPE=LARGE is allowed for SVC Dump (z/OS 1.7 or later)
http://publibz.boulder.ibm.com/cgi-bin/bookmgr_OS390/BOOKS/IEA2V1B0/2.1.3
http://publibz.boulder.ibm.com/cgi-bin/bookmgr_OS390/BOOKS/DGT2D490/3.6.11


Norbert Friemel







Re: Large SVCDUMPS

2011-10-31 Thread Norbert Friemel
On Mon, 31 Oct 2011 10:04:01 -0700, Longnecker, Dennis wrote:

>Can someone point me to the right place to find the documentation on how to
>create large SVCDUMP datasets? [snip]
>


http://publibz.boulder.ibm.com/cgi-bin/bookmgr_OS390/BOOKS/DGT2D490/3.6.10.2

DSNTYPE=LARGE is allowed for SVC Dump (z/OS 1.7 or later)
http://publibz.boulder.ibm.com/cgi-bin/bookmgr_OS390/BOOKS/IEA2V1B0/2.1.3
http://publibz.boulder.ibm.com/cgi-bin/bookmgr_OS390/BOOKS/DGT2D490/3.6.11


Norbert Friemel



Re: Large SVCDUMPS

2011-10-31 Thread Eatherly, John D
Dennis,
  We set up an SMS pool and do dynamic dumps to that pool. This command is what
controls the size allowed for the dump.

CHNGDUMP SET,SDUMP,MAXSPACE=6000M

You can display by doing D D,S

This is an example of the commands that set up the SMS-managed part.

DUMPDS NAME=SYS0.DUMP&SYSNAME..D&YYMMDD..S&SEQ  
DUMPDS ADD,SMS=(MGMT=MCDUMP,STOR=SCDUMP,DATA=DCDUMP)
DUMPDS ALLOC=ACTIVE
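If you want those settings to come back after an IPL, the same statements can
go into a COMMNDxx parmlib member, something like this (the MAXSPACE value and
class names are just the ones from above):

COM='DUMPDS NAME=SYS0.DUMP&SYSNAME..D&YYMMDD..S&SEQ'
COM='DUMPDS ADD,SMS=(MGMT=MCDUMP,STOR=SCDUMP,DATA=DCDUMP)'
COM='DUMPDS ALLOC=ACTIVE'
COM='CD SET,SDUMP,MAXSPACE=6000M'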

Hope it helps. 


John Eatherly



Re: Large SVCDUMPS

2011-10-31 Thread Mark Jacobs

On 10/31/11 13:04, Longnecker, Dennis wrote:

Can someone point me to the right place to find the documentation on how to
create large SVCDUMP datasets? [snip]

You specify whether a dataset is allocated in extended format in its
data class. I don't know offhand if SVCDUMP processing supports
DSNTYPE=LARGE, but if so that would be another option.


--
Mark Jacobs
Time Customer Service
Tampa, FL


One of life's greatest mysteries is how the boy who
wasn't good enough to marry your daughter can be the
father of the smartest grandchild in the world.

Yiddish Proverb



Large SVCDUMPS

2011-10-31 Thread Longnecker, Dennis
Can someone point me to the right place to find the documentation on how to
create large SVCDUMP datasets? I need to create a large dump dataset so I can
take a console dump of DB2. I keep getting partial dumps. I see in the messages
that the maximum size of a dataset is 65,535 tracks, which for a 3390 is 4,369
cylinders. In the z/OS MVS Diagnosis: Tools and Service Aids manual they say
"IBM recommends using extended format sequential data sets as dump data sets
for SVC dumps ... and they can hold 128 gigabytes." But the manual doesn't give
any clues on how to create them.

IBM APAR II06335 also talks about the subject: "Note: With DFSMS120 and Dynamic
Dump Allocation, multi-volume EFDS format datasets can be created for your
SVCDUMP." But it doesn't give any clues on how to create such a beast either.

I've tried creating the SYS1.DUMP dataset on a mod 9 with the CONTIG option,
but it stops at 65,535 tracks.

Any pointers appreciated.

Dennis

--
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to lists...@bama.ua.edu with the message: GET IBM-MAIN INFO
Search the archives at http://bama.ua.edu/archives/ibm-main.html