So, I'm writing a database population script that parses CSV
files, breaks the input into fields, and then pops it into an RDBMS
(MSSQL, Oracle, Sybase) or Access.
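
Roughly, the guts of the loop look like the sketch below (the connect string,
file, table, and column names are all faked for the example):

    use DBI;
    use Text::CSV_XS;

    my $csv = Text::CSV_XS->new ();
    my $dbh = DBI->connect ("dbi:ODBC:target", "user", "pass",
                            { RaiseError => 1 });
    my $sth = $dbh->prepare ("INSERT INTO widgets (id, notes) VALUES (?, ?)");

    open my $fh, "<", "widgets.csv" or die $!;
    while (my $line = <$fh>) {
        # split the CSV line into fields, then bind them into the INSERT
        $csv->parse ($line) or die $csv->error_input;
        $sth->execute ($csv->fields);
    }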

Everything works fine, provided that my input data is not the exact
length of the destination field while also containing non-printable chars.
For example, I have a varchar(255) field into which I want to put a
string that contains 253 chars (including spaces) plus two bell
characters.

The bell characters are interpreted by the apps that use the RDBMSs as
carriage return/newlines.  Why not use actual \r\n chars?  Uhhmmm..... I
dunno...  The apps aren't mine, and they're not going to change just because I
asked them to...

When I use any other data loader (BCP, for example), the chars go straight in.
When I use my script, the bell chars get changed, I'm guessing, to Unicode.
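
A quick check I could run to see what actually lands in the column would be
something like this (table and column names are made up):

    # pull the row back and dump the raw bytes in hex, to see what the
    # bell chars (0x07) turned into
    my ($val) = $dbh->selectrow_array
        ("SELECT notes FROM widgets WHERE id = 1");
    printf "%d chars: %s\n", length ($val), unpack ("H*", $val);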

Now, I saw that the Text::CSV_XS module has a "binary" switch, and that
DBD::CSV has a csv_csv meta-data parameter.  Do I use that to get the "binary"
parameter passed through?  Pray tell, how?
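
Just so it's clear what I'm imagining, here's a rough sketch; the f_dir path
is faked, and the idea that csv_csv will take a ready-made Text::CSV_XS object
is my guess from reading the docs:

    use DBI;
    use Text::CSV_XS;

    # parser that should leave control chars like 0x07 (bell) alone
    my $csv = Text::CSV_XS->new ({ binary => 1 });

    # hand the parser to DBD::CSV; is this what csv_csv is for?
    my $dbh = DBI->connect ("dbi:CSV:f_dir=/data/loads", undef, undef, {
        csv_csv    => $csv,
        RaiseError => 1,
    }) or die $DBI::errstr;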

Thank you,
amonotod

    `\|||/         amonotod@    | subject line: 
      (@@)         charter.net  | no perl, no read...
  ooO_(_)_Ooo________________________________
  _____|_____|_____|_____|_____|_____|_____|_____|
