Check out note 199416.1, which discusses export corruption on certain releases.

>>> [EMAIL PROTECTED] 1/8/03 2:44:15 PM >>>
Tracy Rahmlow wrote:
>
> We attempted to reorg a table and data corruption resulted.  We have isolated
> the issue and currently have a tar open with Oracle.  But basically, when we
> export the table with a buffer=10485760 the import process corrupts the data.
> The row count matches, however some fields that initially were null now contain
> data (there were some other odd data issues in addition).  No errors were
> produced during either the export or import process.  When we used a
> buffer=65536 the table data was correct.  This is reproducible.  Has anybody
> seen this before?  (AIX 4.3.3 / Oracle 8.1.7.3)  Can the buffer size be set too
> high?

This is just an idea based on past problems with OCI, but the array size
(as in 'array fetch') is limited to 32K rows (which smells of a signed
short index). Your buffer size indirectly determines the number of rows
returned in a single batch (roughly buffer / avg_row_len). If your rows
are short, it is not impossible that the resulting array size exceeds
that limit and that Oracle doesn't handle it properly.
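
To put rough numbers on that idea, here is a small sketch; the 32,767-row
ceiling and the ~200-byte average row length are assumptions for
illustration, not figures from Oracle documentation:

    # Rough sketch: does a given export BUFFER imply a batch with more rows
    # than a signed 16-bit counter can hold?  The 32,767-row ceiling and the
    # 200-byte average row length are assumptions, not documented limits.

    SIGNED_SHORT_MAX = 32767  # hypothetical per-batch row limit

    def rows_per_batch(buffer_bytes: int, avg_row_len: int) -> int:
        """Approximate rows fetched per batch: buffer / avg_row_len."""
        return buffer_bytes // avg_row_len

    for buf in (65536, 10485760):  # the two BUFFER values from the report
        rows = rows_per_batch(buf, avg_row_len=200)
        flag = "past the limit" if rows > SIGNED_SHORT_MAX else "within the limit"
        print(f"buffer={buf:>9}: ~{rows} rows per batch ({flag})")

With ~200-byte rows, buffer=65536 works out to roughly 327 rows per batch
(well within range), while buffer=10485760 works out to roughly 52,428 rows,
past what a signed short can index.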

--
Regards,

Stephane Faroult
Oriole Software
--