Satish,
This error is typical if you are running PPM inside a company
Intranet, i.e. behind
a firewall. There are two problems to fix-
The first is fairly easy, you need to tell PPM to use your corporate
proxy server.
Unfortunately I am not connected to my intranet setup at the moment and
I [...] would be a big gain for little pain.
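For reference, PPM normally picks its proxy up from the HTTP_proxy
environment variable (plus HTTP_proxy_user/HTTP_proxy_pass if the proxy
wants a login). A minimal sketch - every value below is a placeholder
for your own site's details:

```perl
use strict;
use warnings;

# Placeholder proxy details -- substitute your corporate proxy here.
$ENV{HTTP_proxy}      = 'http://proxy.example.com:8080';
$ENV{HTTP_proxy_user} = 'username';   # only if the proxy requires a login
$ENV{HTTP_proxy_pass} = 'password';

# Then run PPM from this same environment, e.g.:
# system('ppm', 'install', 'Some-Module');
print "proxy set to $ENV{HTTP_proxy}\n";
```

You can equally just `set HTTP_proxy=...` in the shell before starting
PPM; the effect is the same.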
Regards: Colin
colin_e wrote:
I have recently switched to using ActiveState perl on my production
Solaris box as well as on my
Windows machine for several reasons, including the fact that it's
considerably easier to get PPM
working under ActiveState.
Hi guys. Presumably I am doing something silly here, but I can't see it.
I am using
the formline function to write formatted text into a string, in the
manner recommended
in the "Camel Book" (Programming Perl, O'Reilly). Basically what I want
to do is build a set of numbered footnotes in a string
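One way to do that with formline - field widths and the sample notes
below are purely illustrative; the key point is that formline appends
to the accumulator variable $^A, so you reset $^A, format, and copy:

```perl
use strict;
use warnings;

my @notes     = ('See the Camel Book.', 'Field widths are illustrative.');
my $footnotes = '';
my $num       = 0;

for my $note (@notes) {
    $num++;
    $^A = '';    # formline APPENDS to the accumulator $^A, so reset it
    formline("@>. @<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<\n", $num, $note);
    $footnotes .= $^A;
}
$^A = '';        # leave the accumulator clean for anyone else

print $footnotes;
```

The "@>" picture right-justifies the footnote number and "@<<<..."
left-justifies the text; pad the pictures out to taste.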
Jason,
You are indeed slurping the whole file into memory with
the @logFile = construct. If you know your logs are big, and
you don't need to do arbitrary seeking around inside the file, don't do
it this way.
Line-based processing, as in...
open(LOGFILE, "<$file") or die "Can't open $file: $!";
while (<LOGFILE>) {
    # handle one line at a time
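To make the slurp-versus-stream point concrete, here is a
self-contained sketch (it writes its own tiny sample log, so the file
name and contents are made up). Only the current line is ever held in
memory, unlike the @logFile = <...> form:

```perl
use strict;
use warnings;

# Create a small sample log so the sketch is self-contained.
my $logfile = 'sample.log';
open(my $out, '>', $logfile) or die "Can't write $logfile: $!";
print $out "ok line\n", "ERROR: disk full\n", "ok line\n";
close($out);

# Streaming read: one line in memory at a time, not the whole file.
open(my $in, '<', $logfile) or die "Can't open $logfile: $!";
my $errors = 0;
while (my $line = <$in>) {
    $errors++ if $line =~ /^ERROR/;   # example per-line work
}
close($in);
unlink $logfile;

print "$errors error line(s)\n";
```

For a multi-gigabyte log the difference is the whole ballgame: memory
use stays flat no matter how big the file gets.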
$Bill Luebkert wrote:
That's because you didn't close $command before trying to delete
the temp files. DIR still has them open for write and unlike
UNIX, Doze can't delete files that are in use.
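A minimal sketch of the fix (the temp-file name here is made up):
close the handle before you unlink, and check both calls.

```perl
use strict;
use warnings;

my $tmp = 'Test_Schedule_OUT.txt';   # hypothetical temp-file name
open(my $fh, '>', $tmp) or die "Can't create $tmp: $!";
print $fh "scratch data\n";

# On Windows an open file generally can't be deleted, so close first.
close($fh)   or die "Close failed: $!";
unlink($tmp) or die "Couldn't delete $tmp: $!";
```

Checking the return value of unlink is what turns "mystery leftover
temp files" into an error message you can actually act on.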
Thx Bill. (also thanks to Howard Tanner for a similar suggestion)
I understand what you're
"$Bill Luebkert" wrote-
I'm not convinced. You're probably doing something else wrong.
Petr Vileta wrote-
Try: unlink("\"C:/DOCUME~1/colin/LOCALS~1/Temp/Test_Schedule_1.000_OUT_3780.txt\"");
You are right to be suspicious, but it's taken me several hours of
head-scratching to nail this down. It's
Perl: v5.8.3 MSWin32-x86-multi-thread
OS: Windows XP SP1
Solaris is my production platform, but I mostly develop on my home PC
for convenience.
Perl's cross-platform compatibility is generally very good, but I've hit
an odd path problem-
Although the Windows command shells use "\" as a path separator
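For what it's worth, Perl's own file operators accept "/" on Windows;
it's only native shells and tools that insist on "\". A small sketch
(the path and file names are hypothetical) using File::Spec to build
paths in whatever form the current platform wants:

```perl
use strict;
use warnings;
use File::Spec;

# Perl itself is happy with forward slashes, even on Win32:
my $path = 'C:/Temp/report.txt';   # hypothetical path

# File::Spec builds the platform-native form portably --
# logs/today.txt on UNIX, logs\today.txt on Windows:
my $portable = File::Spec->catfile('logs', 'today.txt');
print "$portable\n";
```

Converting to the native form only matters at the boundary, i.e. when
the path is handed to cmd.exe or some other external program.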
Capturing the STDOUT and STDERR streams from a forked child process is
well-covered in the
Perl docs. However, if possible, I want to capture my own process's STDOUT
and STDERR, and this is
not so obvious.
Scenario:
I have a number of separate admin scripts. These may generate small
amounts of
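One common approach is the handle-duplication trick from the Perl docs:
dup your own STDOUT/STDERR aside, reopen them onto a file, then restore
them afterwards. A sketch - the log file name is illustrative:

```perl
use strict;
use warnings;

my $log = 'admin_run.log';   # hypothetical log file

# Save the original streams so they can be restored later.
open(my $saved_out, '>&', \*STDOUT) or die "Can't dup STDOUT: $!";
open(my $saved_err, '>&', \*STDERR) or die "Can't dup STDERR: $!";

# Point both streams at the log file.
open(STDOUT, '>',  $log)     or die "Can't redirect STDOUT: $!";
open(STDERR, '>&', \*STDOUT) or die "Can't redirect STDERR: $!";

print "normal output\n";     # goes to the log
warn  "warning output\n";    # also goes to the log

# Restore the original streams (reopening flushes and closes the log).
open(STDOUT, '>&', $saved_out) or die "Can't restore STDOUT: $!";
open(STDERR, '>&', $saved_err) or die "Can't restore STDERR: $!";
```

Note that STDOUT is buffered and STDERR isn't, so lines from the two
streams can interleave in the file in a surprising order.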
The DBI docs warn that in general queries that return results (i.e.
SELECTs) cannot
reliably report how many rows are in the result without fetching all
the rows via the fetchrow_* methods.
There are two basic ways to deal with this:
1) If you are confident your result set isn't going to be stupidly
large,
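The first option can be sketched like this, using an in-memory SQLite
database so the example is self-contained (requires DBD::SQLite; the
table and column names are made up): fetch everything, then count.

```perl
use strict;
use warnings;
use DBI;

# In-memory SQLite DB; table/column names are illustrative.
my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:', '', '',
                       { RaiseError => 1 });
$dbh->do('CREATE TABLE items (name TEXT)');
$dbh->do('INSERT INTO items VALUES (?)', undef, $_) for qw(a b c);

# Fetch all rows, then count them -- the only reliable row count
# for a SELECT, per the DBI docs.
my $rows  = $dbh->selectall_arrayref('SELECT name FROM items');
my $count = scalar @$rows;

print "$count rows\n";
$dbh->disconnect;
```

This obviously only makes sense when the result set fits comfortably
in memory; for anything bigger you're back to counting as you fetch.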