On Mon, 16 Feb 2026 12:17:59 -0500, David Clark <[email protected]> wrote:
>I agree marketing doesn't care when something is hidden in a macro.
>But in this case, the demand is coming from the COBOL side.

Ask yourself how you hide logic in COBOL. I suspect COBOL OOP would be my
first choice. Alternatively, you could use COPY REPLACING in the procedure
division. Not great, but far better than requiring the user to deal with
migration changes. I don't know COBOL, but if this were C or C++, I would
be thinking OOP or a macro.

>But that does not stop them from coding their own parameter block for the
>very reason that they *want* to use the DEPENDING ON phrase.

How and what you expose to the user is your choice. IBM typically documents
its programming interfaces, and anything else in that list is use at your
own risk. When programming interfaces use a macro, you have full control.
If there is a need for DEPENDING ON, you can add it when needed instead of
risking compatibility problems in the future. Will user redefinitions cause
you future headaches? You should be the person deciding how best to solve
each requirement, because with complete flexibility comes chaos.

>> because you have no control how this macro is implemented.
>What macro?

If access is solely through COBOL, then it's an OOP object, a copybook, or ????.

>> The correct solution is to code a second DSECT instead of ORG.
>How is that going to work when the existing code needs to continue
>referencing the original field names for length and string value?

From the user/COBOL caller side, if the old implementation simply
disappears when compiled, then you only need the new DSECT. If both
implementations are needed in the same program, then the object uses either
the new class or the old class. In fact, with OOP, you can select methods
based on the arguments.

>> Not re-assembling code is the basis for this question
>Well, at least not recompiling the COBOL code (that uses a dynamic CALL).

Peter's post gave you the simple solution for statically linked calls.
As I said before, not re-assembling / recompiling is the root requirement
for backwards compatibility. As for re-linking the load module, it is only
necessary if your old interface program is incompatible with your new
interface program. If both are compatible, then a re-link is unnecessary.
Your compatibility design may require the new interface program to have a
new name, but that won't be a problem because the COBOL object has the same
method using 2 different modules. Or maybe you specified different names in
each copybook for the call. Or however your design deals with the alternate
name.

>COBOL programmer is free to CALL it any way they like.

In the professional world, we typically don't give complete freedom.
Instead, we provide documented programming interfaces and add new ones as
needed.

>But they have been told that they can code their own parameter blocks
>for any of the three parameters as fits their needs.

This is a fallacy of Unix programming (you can do anything you want that
meets your needs) that often results in headaches. All IBM and OEM
vendor-supplied programming interfaces are very well defined.

>The calling statement is *not* in a copybook because it has a varying
>parameter list length (even in the legacy version).

OOP realized long ago that public data was a bad idea, and it recommends
using methods instead of accessing data publicly.

>The root issue you are having is that the subroutine must be able to
>identify which calling parm block was used by the caller.

Tony is correct. His solution works easily, as does Peter's solution using
alternate module names.
