Hi,
I am using Perl 5.6 with Oracle 8i on a Red Hat Linux 7.2
system. I am facing a problem with the speed of processing records.
I have around 30K record sets (1 record set means records
from 30 tables) now in the database
(this can go up to 80K).
My existing program is taking 16 hrs to read
On Friday, October 25, 2002 11:01 PM [GMT + 10:00 AEST],
Scott McGee <[EMAIL PROTECTED]> wrote:
> I am trying to get comfortable with DBI, and due to stupid
> circumstances beyond my control at the moment, have to try to make do
> without an actual database server. This means using something like
Following on with this, I set RowCacheSize to 1, but to no effect.
Regards,
Mike
On Sat, 2002-10-26 at 07:29, Mike Nielsen wrote:
Dear dbi-users,
I'm pretty sure this is a bug -- the question is whether it is in my
code or somewhere in DBI or DBD::Pg.
I have the followin
Roger,
Which version of DBD::ODBC are you using?
Regards,
Jeff
> We are trying to use Perl/DBI running on W2K to access UniVerse
> data stored on an AIX machine. We are using the UV ODBC drivers
> and have created a DSN to access the UniVerse data. We have no
> problem connecting to the databa
Looks like you (probably) have DBD::Interbase version 0.30 installed. Since version
0.40 is on CPAN, you should upgrade (download and install) and that should fix the
problem.
If that's not an option for whatever reason, and you're truly desperate you could
instruct Perl where it can find your
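Pointing Perl at a module outside its default search path usually means adding the directory to @INC. A minimal sketch, assuming the old DBD::InterBase install survived the upgrade somewhere on disk (the path below is hypothetical; substitute wherever your previous site_perl tree actually lives):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical path to the old Perl library tree that still holds
# DBD/InterBase.pm -- adjust to your system.
use lib '/usr/lib/perl5/site_perl/5.6.1/i386-linux';

use DBI;  # DBI will now also search the added directory for DBD::InterBase
```

This is a stopgap; a driver built against one Perl binary may not load cleanly under another, so the upgrade route is still the safer fix.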
Dear dbi-users,
I'm pretty sure this is a bug -- the question is whether it is in my
code or somewhere in DBI or DBD::Pg.
I have the following code (platform details in attached file, but,
roughly, perl 5.6.1 on a (reasonably aggressively updated) mandrake 8.1
with the kernel at 2.4.8-26 on a 1.
We are trying to use Perl/DBI running on W2K to access UniVerse data stored on an AIX
machine. We are using the UV ODBC drivers and have created a DSN to access the
UniVerse data. We have no problem connecting to the database or running queries via
perl/dbi, except that the fetchrow_array() stop
I upgraded Red Hat from 7.3 to 8.0,
and now my application doesn't work!
It probably can't find InterBase.pm.
Please help me! What can I do?
Here is some information about the problem.
SHELL
===
../loginuser.pl
Content-type: text/html
Ago
I'm getting this error:
DBI fetchall_hashref: invalid number of parameters: handle + 0
Usage: $h->fetchall_hashref() at test_xsl.cgi line 59.
when I try to use fetchall_hashref.
code sample:
$dbh code snipped ...
my $sql=qq{select col1,col2 from some_table};
my $sth=$dbh->prepare($sql)
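In current DBI, fetchall_hashref() takes a mandatory key-field argument: the column (by name or 1-based index) whose values become the keys of the returned hash. Calling it with no argument triggers a parameter-count error like the one above. A minimal sketch, with a placeholder DSN and credentials:

```perl
use strict;
use warnings;
use DBI;

# Placeholder connection details -- substitute your own DSN.
my $dbh = DBI->connect('dbi:Oracle:orcl', 'user', 'pass',
                       { RaiseError => 1 });

my $sth = $dbh->prepare('select col1, col2 from some_table');
$sth->execute;

# Keyed on col1: { $col1_value => { col1 => ..., col2 => ... }, ... }
my $rows = $sth->fetchall_hashref('col1');

# Equivalent one-step form on the database handle:
# my $rows = $dbh->selectall_hashref(
#     'select col1, col2 from some_table', 'col1');
```

If your installed DBI predates the key-field form, upgrading DBI is the first thing to try.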
Scott McGee wrote:
SQL::Parser v1.004
SQL::Statement v1.004
> ...
my $query = "SELECT megaliths.id, megaliths.name, site_types.site_type
Yes, there is a bug in SQL::Statement related to table names combined with a column name (e.g. megaliths.id).
I will be releasing a fix later today
<[EMAIL PROTECTED]> wrote in message
news:1842904766.20021025094128@;comcast.net...
> It appears from the error message that it cannot locate the CSV file
> at all. Are you sure:
>
> $dbd = "DBI:CSV:f_dir=magalith_db";
>
> Is correct? You've spelled 'megalith' as 'magalith'. Just have to
> check
OS: Linux 2.4.18, Suse 8.0
Perl: 5.6.1
DBI: 1.20
DBD: 1.12
RDBMS: Oracle Standard Edition 9.2
Dear All,
please be patient if this is an often asked question, but I tried the
RTFM, different newsgroups and google for many hours with no result that
fits my problem.
I'm trying to read a clob return
On Friday, October 25, 2002, at 08:14 AM, William McKee wrote:
I'm not sure which it would be either. Perhaps someone on the list who
is familiar with DBI or the Pg driver could elucidate. However, I suspect
the solution would have to be similar to what you've done with
capturing STDERR since
On 25 Oct 2002 at 9:58, Anders Bogdan wrote:
> If it errored (returned
> undef), I'd be able to pick up the error message from $dbh->errstr but
> because it was successful, there is no way to get the result of the
> command. I'm not sure if this is a DBI or DBD-Pg or a Postgresql issue.
I'm not su
William, thanks for the reply.
I'm not using command line vacuumdb.
My code deletes rows older than a certain date and then to recover the
space does:
my $retVal = $dbh->do('vacuum verbose analyse my_table').
The do() method behaves as documented, returning a -1. If it errored
(returned undef),
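The three possible return values of do() are worth distinguishing explicitly: a row count on success, -1 when the count is unknown (as with VACUUM), and undef on error, so defined() is the reliable error test. A sketch with a placeholder DSN:

```perl
use strict;
use warnings;
use DBI;

# Placeholder connection details -- substitute your own database.
my $dbh = DBI->connect('dbi:Pg:dbname=mydb', 'user', 'pass',
                       { RaiseError => 0, PrintError => 0 });

my $retVal = $dbh->do('vacuum verbose analyse my_table');

if (!defined $retVal) {
    # undef is the only failure case
    die "vacuum failed: " . $dbh->errstr;
}
elsif ($retVal == -1) {
    # -1 means "rows affected unknown", which is still success
    print "vacuum completed (row count not reported)\n";
}
```

The VERBOSE output itself goes to the server's notice channel rather than the return value, which is why capturing it needs a separate mechanism.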
On 24 Oct 2002 at 13:41, Anders Bogdan wrote:
> Kind of a newbie-ish question I guess. I could not find a good answer in
> the faq or the Perl DBI book.
Hi Anders,
Since no one else has taken a stab, I'll jump in and see if I can offer an
alternative. However, my first thought would have been to
It appears from the error message that it cannot locate the CSV file
at all. Are you sure:
$dbd = "DBI:CSV:f_dir=magalith_db";
Is correct? You've spelled 'megalith' as 'magalith'. Just have to
check. I assume you've got whatever dir on the same level as your
script and that you've got a file
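For reference, DBD::CSV treats a directory of flat files as a database, so no server is needed; f_dir names that directory and table names map to files inside it. A minimal sketch using the directory name from the thread (with the corrected spelling), assuming a megaliths.csv file exists there:

```perl
use strict;
use warnings;
use DBI;

# f_dir must point at an existing directory, relative to where the
# script runs (or give an absolute path).
my $dbh = DBI->connect('DBI:CSV:f_dir=megalith_db', undef, undef,
                       { RaiseError => 1 });

# Optionally map the table name to an explicit file, so there is no
# guessing about extensions.
$dbh->{csv_tables}{megaliths} = { file => 'megaliths.csv' };

my $rows = $dbh->selectall_arrayref('SELECT name FROM megaliths');
print "$_->[0]\n" for @$rows;
```

A "cannot open file" style error from this setup almost always means the f_dir path or a filename is misspelled, exactly as suspected above.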
I am trying to get comfortable with DBI, and due to stupid circumstances
beyond my control at the moment, have to try to make do without an actual
database server. This means using something like DBD::CSV. I am just trying
to go through the O'Reilly Perl DBI book, so don't need anything too complex
This may not help, but I implemented an attribute like this for DBD::ODBC
where someone wanted to create a trigger and the trigger had :new.foo type
syntax and DBD::ODBC was doing the exact same thing...If it's plausible, you
*might* be able to switch over to DBD::ODBC in the short run for the DDL
Paul,
I think this is in the driver. The problem is, in the case where you are
not binding the parameters, I believe the Server itself is handling the data
and converting it. In the case where you are binding, I believe the driver
is attempting to parse along the ODBC rules (not the SQLServer ru
On Wed, Oct 23, 2002 at 09:41:19AM -0700, Todd Enersen wrote:
>
> What am I missing? I believe Makefile.PL links to my Oracle 9i libraries
> correctly when it's linking:
> LD_RUN_PATH="/disks/sc-soldev2-007/oracle/product/9i/lib32:/disks/sc-soldev2
> -007/oracle/product/9i/rdbms/lib32" /usr/local
On Fri, Oct 25, 2002 at 06:26:29AM +1000, Steve Baldwin wrote:
> Thanks Chris,
>
> I have tried this on DBI 1.30 and DBD::Oracle 1.12, but see exactly the same
> behaviour. I suspect it is a DBD::Oracle thing coz as far as I know, named
> bind variables (as opposed to ?) is an Oracle thing.
>
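For context, DBD::Oracle accepts Oracle's native :name placeholders in addition to DBI's portable ? style, which is why named binds work there but not necessarily under other drivers. A sketch with placeholder connection details:

```perl
use strict;
use warnings;
use DBI;

# Placeholder DSN and credentials -- substitute your own.
my $dbh = DBI->connect('dbi:Oracle:orcl', 'user', 'pass',
                       { RaiseError => 1 });

# ":site_name" is an Oracle-style named placeholder; the portable
# DBI equivalent would be "... WHERE name = ?" with a positional bind.
my $sth = $dbh->prepare(
    'SELECT id FROM megaliths WHERE name = :site_name');
$sth->bind_param(':site_name', 'Stonehenge');
$sth->execute;
```

Code meant to run against multiple drivers should stick to ? placeholders.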