This is just happening now. However, Sri offered to help me set this up with
DFSORT if I send him the DSNames. Unfortunately, since I work for the IRS,
I don't know if I can do that; the agency is very big on privacy issues.

> On Oct 8, 2020, at 12:09 PM, Michael Stein <m...@zlvfc.com> wrote:
> 
> On Wed, Oct 07, 2020 at 07:41:58AM -0400, Joseph Reichman wrote:
>> The average number of records in the file could be 240,000; each record
>> could be close to 10,000 bytes; there are 4,644 files
> 
> So each file could be up to: 10000 * 240000 -> 2.4 GB
> 
> And the maximum total data is: 2.4 GB * 4644 -> 11.14 TB
> 
> Reading 11 TB in 4 hours (?) is:
> 
>  11e12/(4*3600.) -> 763 MB/second
> 
> Most likely all the records are less than the maximum (10K bytes) size
> and there is less data than this.
> 
> That's a lot of records too: 240000*4644 -> 1114 million records.
> 
> How long do a billion (1000 million) QSAM GETs in locate mode take?
> 
> I have no idea how long a QSAM GET takes.  But a billion of them
> add up and it's all CPU time.
> 
> What's the blocksize of these datasets?  The CPU for each GET
> is just bumping a pointer and checking for the end of the current
> block.  There's more CPU involved in switching blocks.
> 
> I'm still wondering:
> 
>  Is this a one time job or recurring every year, month, week, day?
> 
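As a rough cross-check of the arithmetic quoted above, here is a short Python
sketch (not part of the original thread) that recomputes the data volume, the
required read rate, and the GET / block-switch counts. The 27,998-byte BLKSIZE
is an assumption (half-track blocking on 3390 DASD); the thread asks for but
never states the real blocksize, and treating every record as the full 10,000
bytes is the same worst-case simplification used above. (The quoted 763 MB/s
figure starts from the rounded 11 TB.)

  # Back-of-envelope sizing, using the figures quoted in the thread plus an
  # ASSUMED half-track 3390 blocksize (27,998 bytes) -- not a stated value.
  RECORDS_PER_FILE = 240_000      # from the quoted note
  MAX_RECORD_BYTES = 10_000       # upper bound; the real average is likely lower
  FILES            = 4_644
  ASSUMED_BLKSIZE  = 27_998       # assumption: half-track blocking on 3390
  TARGET_HOURS     = 4            # the "4 hours (?)" window mentioned above

  bytes_per_file = RECORDS_PER_FILE * MAX_RECORD_BYTES    # ~2.4 GB per file
  total_bytes    = bytes_per_file * FILES                 # ~11.1 TB overall
  total_records  = RECORDS_PER_FILE * FILES               # ~1.1 billion GETs

  # Sustained rate needed to read everything inside the target window.
  mb_per_sec = total_bytes / (TARGET_HOURS * 3600) / 1e6  # ~774 MB/second

  # Each GET mostly just bumps a pointer and checks for end of block;
  # switching blocks costs more CPU, so count those separately.
  recs_per_block = ASSUMED_BLKSIZE // MAX_RECORD_BYTES    # 2 at 10 KB records
  block_switches = total_records // recs_per_block        # ~557 million blocks

  print(f"total data     : {total_bytes / 1e12:.2f} TB")
  print(f"needed rate    : {mb_per_sec:.0f} MB/s over {TARGET_HOURS} h")
  print(f"QSAM GETs      : {total_records / 1e6:.0f} million")
  print(f"block switches : {block_switches / 1e6:.0f} million")

With 10 KB records only two fit per 27,998-byte block, so a billion GETs means
roughly half a billion block switches. If the records are on average much
shorter than 10 KB, more of them fit per block and both the data volume and
the block-switch count drop accordingly.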

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
