<snip>
> We're currently using Gentran:Server NT, and we have reached a limit with
> this tool. Some of our incoming batches are too large (20-30 MB) for this
> translator to handle efficiently. It will translate them eventually, if
> they are 100% compliant; however, if your batch has a syntax error or a
> standards violation, Gentran can't effectively diagnose the error. We find
> that all of the database activity really slows down this translator as
> well.
>
> Thanks
> Anthony Beecher
> EDI Analyst
<snip>
Possible temporary fix until you get different software:
I regularly translate batches of up to 50 MB using Gentran, and this
is a problem. One characteristic of Gentran is that it translates at a
faster rate when the batches are smaller. Using the same transactions and
maps, I found the following:
-- a 450 KB file took (n) minutes to translate
-- a 4500 KB file took 55 x (n) minutes to translate
I can't explain why a file 10x larger took 55x longer to
translate. Neither could Gentran support. It doesn't appear to be a RAM
limit or other hardware issue.
Since my applications allow it, I wrote scripts that feed Gentran
incremental batches of no more than 1 MB at a time. Some of my
transactions may be 2 or 3 MB each, so I cannot split those, but feeding
Gentran bite-size chunks speeds things up. In the example above, I
translate 10 batches of 450 KB instead of 1 batch of 4500 KB. It takes
about 10 x (n) instead of 55 x (n) to translate.
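The batching logic above can be sketched roughly as follows. This is a
hypothetical Python helper, not Bob's actual scripts; it assumes each batch
has already been split into whole transactions (as byte strings) that must
never be divided, and it packs them into chunks of at most 1 MB, emitting an
oversized transaction as a chunk by itself:

```python
def chunk_records(records, max_bytes=1_000_000):
    """Group records into batches of at most max_bytes each.

    A single record larger than max_bytes is emitted alone, since it
    cannot be split (like the 2-3 MB transactions mentioned above).
    """
    batches = []
    current, size = [], 0
    for rec in records:
        # Flush the current batch if adding this record would overflow it.
        if current and size + len(rec) > max_bytes:
            batches.append(current)
            current, size = [], 0
        current.append(rec)
        size += len(rec)
    if current:
        batches.append(current)
    return batches


# Example: ten 400 KB transactions plus one unsplittable 2.5 MB transaction.
records = [b"a" * 400_000] * 10 + [b"b" * 2_500_000]
batches = chunk_records(records)
```

Each batch would then be written to its own file and fed to the translator
in sequence, which is where the 10 x (n) versus 55 x (n) savings comes from.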
This is a pain, but it is the only way I can keep traffic flowing.
...Bob
====================
Bob Garbowitz
Director of Technology
Beverage Data Network
110 Fairview Ave.
Verona, NJ 07044
Tel - (973) 239-1400 x104
Fax - (973) 239-1437
====================
=======================================================================
To signoff the EDI-L list, mailto:[EMAIL PROTECTED]
To subscribe, mailto:[EMAIL PROTECTED]
To contact the list owner: mailto:[EMAIL PROTECTED]
Archives at http://www.mail-archive.com/edi-l%40listserv.ucop.edu/