I have a couple of questions that someone who has been down this 
road before might be able to help with.  I have an application with both a 
mainframe and a desktop component.  The z/OS side is the data-generating side, 
creating statistical data; the desktop provides a GUI to display 
the data.  In the current state of the application, the z/OS side creates 
summary data in CSV format, and the desktop picks it up using FTP.  I want 
to get away from summary data and FTP, and give the desktop application a 
'closer look' at the data before it is summarized.  The z/OS component already 
produces all of the data the desktop would need, but it is in character format. 
 I could just open a pipe between the desktop and z/OS and shove the data down 
it; the desktop application is smart enough to parse out the data it needs 
from the records.  But the volume of records could get fairly large in some 
instances.  Since it is character data, it probably lends itself well to some 
type of compression.  Does anyone know of a compression product that runs on 
both platforms?  
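For what it's worth, zlib is one library commonly available on both z/OS and desktop platforms, and repetitive character records tend to compress well with it. A minimal round-trip sketch (Python used only for illustration; the record layout is made up, not the application's actual format):

```python
import zlib

# Hypothetical sample of CSV-style character records, standing in for
# the statistical data the z/OS job produces.
records = "".join(f"REC{i:06d},STAT,{i * 3.14:.2f}\n" for i in range(1000))
raw = records.encode("ascii")

# Compress before sending down the pipe, decompress on the desktop side.
compressed = zlib.compress(raw, level=6)
restored = zlib.decompress(compressed).decode("ascii")

assert restored == records        # round trip is lossless
print(f"raw={len(raw)} bytes, compressed={len(compressed)} bytes")
```

The ratio will depend on how repetitive the real records are, but structured character data usually shrinks substantially.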

    Second question: the z/OS application could compute the amount of data to 
be transmitted over the pipe before creating it.  What affects the transfer 
speed besides the size of the pipe?  How important is the size of the 
data block on each PUT?  Thanks in advance for any help on this.
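On the block-size question: one common approach is to coalesce many small records into larger blocks so each PUT carries fewer, bigger writes, reducing per-call and per-packet overhead. A sketch of that batching idea (the 32 KB block size and helper name are just assumptions for illustration, not any product's API):

```python
BLOCK_SIZE = 32 * 1024  # hypothetical target block size; tune for the network


def batch_records(records, block_size=BLOCK_SIZE):
    """Coalesce small records into blocks of at most block_size bytes,
    so each send/PUT moves as much data per call as possible."""
    block = bytearray()
    for rec in records:
        # Flush the current block if adding this record would overflow it.
        if block and len(block) + len(rec) > block_size:
            yield bytes(block)
            block.clear()
        block.extend(rec)
    if block:
        yield bytes(block)  # final partial block


# Ten thousand small records collapse into a handful of large blocks.
records = [f"REC{i:06d},DATA\n".encode("ascii") for i in range(10_000)]
blocks = list(batch_records(records))
assert b"".join(blocks) == b"".join(records)   # nothing lost or reordered
print(f"{len(records)} records -> {len(blocks)} blocks")
```

Each yielded block would then be one PUT, instead of one PUT per record.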

    --Dave Day

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to lists...@bama.ua.edu with the message: GET IBM-MAIN INFO
Search the archives at http://bama.ua.edu/archives/ibm-main.html
