Martin J. Evans wrote:
> Garrett, Philip (MAN-Corporate) wrote:
>> Martin Evans wrote:
>> 
>>>>> Hoping someone can help me here as I'm really stuck with this. I
>>>>> have a large amount of code which reads some XML and inserts data
>>>>> from the XML into a database. One particular insert fails:
>>>>> 
>>>>> 
>>>> [snip]
>>>> 
>>>>> After much head scratching I discovered the problem: P4 has the
>>>>> utf8 flag set on it, and downgrading it makes the code work, i.e.
>>>>> when utf8::is_utf8 returns 1 on the timestamp scalar the insert
>>>>> fails, and if I do a utf8::downgrade($scalar) on it before
>>>>> calling execute it works.
>>>>> 
>>>> I think you've triggered this bug:
>>>> http://www.mail-archive.com/dbi-users@perl.org/msg26927.html
>>>> 
>>> I've read that but I don't see the resolution.
>>> 
>> 
>> There wasn't an official one, afaik. :-/
>> 
> Just checking. I think I got the wrong end of the stick - I thought
> you meant that setting SQLCS_NCHAR fixes the problem, but it appears
> it is the other way around.

I (wrongly) assumed that your problem and mine were the same. Setting
SQLCS_NCHAR fixed my particular problem.
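
For reference, this is roughly how I set it - either handle-wide via
ora_ph_csform, or per placeholder via the bind attribute. A sketch from
memory only (the table and column names are made up, and it obviously
needs a reachable Oracle instance):

```perl
use strict;
use warnings;
use DBI;
use DBD::Oracle qw(SQLCS_NCHAR);

my $dbh = DBI->connect($ENV{DBI_DSN}, $ENV{DBI_USER}, $ENV{DBI_PASS},
    { AutoCommit => 1, RaiseError => 1, PrintError => 0 });

# Handle-wide: every placeholder bound from now on uses SQLCS_NCHAR.
$dbh->{ora_ph_csform} = SQLCS_NCHAR;

# Or per placeholder, via the ora_csform bind attribute:
my $val = 'some national-charset data';    # made-up value
my $sth = $dbh->prepare('insert into mytable (nval) values (?)');
$sth->bind_param(1, $val, { ora_csform => SQLCS_NCHAR });
$sth->execute;
```

As the rest of this thread shows, though, the same setting can make
things *worse* depending on your NLS environment, so test it both ways.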


>>>> If I remember correctly, the fix is to set ora_ph_csform:
>>>> 
>>>>   use DBD::Oracle qw(SQLCS_NCHAR);
>>>>   ...
>>>>   # default for all future placeholders
>>>>   $dbh->{'ora_ph_csform'} = SQLCS_NCHAR;
>>>> 
>>> I've tried that - no luck :-(
>>> 
>> 
>> Surprisingly, setting ora_ph_csform as above actually *caused* the
>> problem for me.  See below... 
>> 
>> 
> It makes my problem worse too, as I was not setting it in the first
> place.
>
>>>> The bug only manifests when you bind a utf8-on value to a statement
>>>> parameter that was originally bound to a non-utf8 value.
>>>> 
>>> The whole DBI interaction is:
>>> 
>>> $do = ['begin dbms_output.enable(:1); end;',undef,20000];
>>> do: 'alter session set nls_timestamp_format = 'yyyy-mm-dd hh24:mi:ss''
>>> do: 'alter session set nls_date_format = 'yyyy-mm-dd''
>>> prepare_cached: 'select * from document where id = ?'
>>> execute: "XXXXXXXXXXXX"
>>> finish
>>> 
>>> The following is generated with a brand new statement handle:
>>> 
>>> prepare_cached:
>>> 'insert into document
>>> (book,category,country,expiry,id,last_update,mnem,name,odec,
>>> route,source,tdate,ttimestamp,ttype,version)
>>> values(?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)'
>>> $execute = ['SD','HR','UK','2017-07-08 14:10:00','XXXXXXXXXXX',
>>> '1183290197812','ES','XXXXX','N','534453','XXXXXXXXXX',
>>> '2007-07-01','1183269593843','XXXXXX','XXXXX'];
>>> 
>>> I've even tried disconnecting just before the prepare_cached call
>>> and reconnecting or opening a brand new connection for the
>>> prepare_cached call - still no luck. So either, it is not the same
>>> bug or the resolution is not quite right.
>>> 
>> 
>> I think you're right. It's a different bug, but I'd wager it's
>> related. 
>> 
>> 
> See my other post - it looks remarkably similar to me now. For a start
> your example has a timestamp and a date as mine does.
> 
>> The attached script reproduces the problem reliably on my system.
>> There are three lines in the script that will trigger the error when
>> uncommented: lines 11, 12, and 19.
>> 
>> The problem manifests on (at least):
>>   DBI 1.50
>>   DBD::Oracle 1.18, 1.19
>>   Oracle client 9.2.0.4.0
>>   Oracle server 9.2.0.7.0, 10.2.0.3.0
>> 
>> What's *really* strange is that if I remove the "dt" column from the
>> table, and run the script using only the "ts" timestamp column,
>> there is no error. 
>> 
>  
> Same for me. I can insert a row containing only dates or only
> timestamps with no problem - but as soon as I mix them - bang! What
> led me astray at first is that if I have a table with a timestamp and
> a date, and the timestamp comes first in the insert list, Oracle
> reports the timestamp parameter as the one in error - so naturally
> you are drawn to that. If you then remove the date from the insert,
> the problem goes away, so you start thinking it is the date field.
>
>> 
>> #!/usr/bin/env perl
>> use strict;
>> use warnings;
>> 
>> use DBI;
>> use DBD::Oracle qw(SQLCS_NCHAR);
>> use Encode qw(decode);
>> use utf8;
>> 
>> # uncomment either one of these to trigger error
>> #$ENV{'NLS_LANG'}  = 'AMERICAN_AMERICA.AL32UTF8';
>> #$ENV{'NLS_NCHAR'} = 'AL32UTF8';
>> 
>> my $dbh = DBI->connect( undef, undef, undef, # $ENV{DBI_DSN}, etc.
>>     { AutoCommit => 1, RaiseError => 1, PrintError => 0 })     ||
>> die $DBI::errstr; 
>> 
>> # or, uncomment this to trigger error
>> #$dbh->{ora_ph_csform} = SQLCS_NCHAR;
>> 
>> eval { $dbh->do('drop table utf8test cascade constraints'); };
>> $dbh->do('create table utf8test (dt date, ts timestamp)');
>> 
>> my $fmt = 'YYYY-MM-DD HH24:MI:SS';
>> $dbh->do("alter session set nls_date_format = '$fmt'");
>> $dbh->do("alter session set nls_timestamp_format = '$fmt'");
>> 
>> my $dml = 'insert into utf8test (dt,ts) values (?,?)';
>> my $sth = $dbh->prepare($dml);
>> 
>> my $utf8_date = decode('utf8','2007-07-05 12:00:00');
>> $sth->execute($utf8_date, $utf8_date);
>> 
>> 
> This is remarkably similar to the final test case I came up with that
> does not work. I've run the above exactly as it stands and it fails
> with "ORA-01843: not a valid month", i.e. I ran it without
> uncommenting anything. HOWEVER, my NLS_LANG was already set to
> AMERICAN_AMERICA.AL32UTF8 before the script was run (as per the
> advice on making unicode/utf8 work with Oracle and DBD::Oracle).
> 
> Out of interest what did you end up doing in the end - make all your
> date/times timestamps?

Luckily for me, my problem was solved using the ora_ph_csform setting. I
suspect I haven't had your problem because the timestamps I'm using are
not from the outside world -- they're generated from DateTime objects
and won't have the utf8 flag set.
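
For anyone who finds this thread later: the flag in question is cheap
to inspect, and for pure-ASCII values like date strings it can be
dropped without changing the bytes - which is the workaround Martin
described. A minimal core-Perl sketch (no database needed):

```perl
use strict;
use warnings;
use Encode qw(decode);

my $plain   = '2007-07-05 12:00:00';    # literal: utf8 flag off
my $tainted = decode('utf8', $plain);   # decode() turns the flag on

printf "plain:           is_utf8=%d\n", utf8::is_utf8($plain)   ? 1 : 0;
printf "decoded:         is_utf8=%d\n", utf8::is_utf8($tainted) ? 1 : 0;

# The workaround from this thread: the string is pure ASCII, so the
# flag can be dropped without changing the bytes before execute().
utf8::downgrade($tainted);
printf "after downgrade: is_utf8=%d\n", utf8::is_utf8($tainted) ? 1 : 0;
```

Of course this only papers over the underlying DBD::Oracle bug; it is
safe here only because a date string can never contain non-ASCII data.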

- Philip
