Re: how do I "file lock" for a form?

2003-08-22 Thread Rich Parker
Hi. When a user hits "submit" many times after completing a form, I get multiple (duplicate) entries in my spreadsheet. I've read that I can add a hidden field with a unique identifier to "lock the file" so this won't happen. I don't know how to do this - can anyone explain, or direct me to a good info source? (I am a newbie.) Thanks, Gregg

I've had this issue also; no matter how often YOU put up the "ONLY click 
once" warning, somebody will do it twice, or more. I found a way, using DHTML, 
to make it so that the SUBMIT button only works ONCE. I found it on the 
Dynamic Drive web site here:
http://www.dynamicdrive.com/
They have some excellent ways of using JS to stop the problems you are 
having. The only problem that you can run into is browser levels: if 
the user is on an older one, the code doesn't work. So if you have some 
control over this, like it is a company Intranet, as here, then you can 
use their ideas; try to use their "Cross browser techniques".
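
For what it's worth, the "hidden field with a unique identifier" idea from 
the original question can also be handled server-side, so it still works when 
JS is unavailable. What follows is only a minimal sketch, not a drop-in 
solution: it assumes CGI.pm, and the directory and field names are invented 
for illustration.

#!/usr/bin/perl
use strict;
use CGI;
use Fcntl qw(O_WRONLY O_CREAT O_EXCL);

my $q     = CGI->new;
my $token = $q->param('form_token');   # hidden field stamped into the form
my $dir   = '/tmp/form_tokens';        # hypothetical scratch directory

# O_EXCL makes "mark the token as used" atomic, so the second click loses:
if (defined $token && $token =~ /^[\w.]+$/
    && sysopen(my $fh, "$dir/$token", O_WRONLY | O_CREAT | O_EXCL)) {
    close $fh;
    # ... first submission: append the row to the spreadsheet here ...
    print $q->header, "Got it, thanks.\n";
}
else {
    print $q->header, "Already received - duplicate click ignored.\n";
}

When the form itself is generated, stamp in the token with something like
print $q->hidden(-name => 'form_token', -value => time() . '.' . $$);
so each rendering of the form gets an identifier that can be "spent" only once.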

Happy coding.
--
Rich Parker


Re: Using hex.

2003-08-21 Thread Rich Parker
Thanks guys,
This is what I was looking for, more of those undocumented ways to do 
things. But given the way programmers "Do things", do you find there is 
generally a "Standard" when it comes to dealing with hex 
variables/characters?

Thanks.

zsdc wrote:

Rich Parker wrote:

$loc = index($rec, $HexValue);  OR
@sub_field1 = split(/$HexValue/, $rec);
Where the $HexValue is either a variable that contains my hex'01' or 
the absolute value itself.


The chr function returns the character with a given ASCII (or Unicode) 
value and you can use \x01 inside of double-quoted strings, so e.g. to 
have a space, you could write:

$char = chr 32;# decimal
$char = chr 0x20;  # hexadecimal
$char = "\x20";# hexadecimal
$char = "\040";# octal
$HexValue = v32;   # decimal
or
$string = "abc\x{20}def 123\x{20}456";
etc.
Is that what you need?
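
Applied to the index/split lines quoted above, any of those forms does the 
job; a small sketch (data invented):

my $SUB_SEP = "\x01";                    # same character as chr 1
my $rec     = join $SUB_SEP, 'red', 'green', 'blue';

my $loc        = index $rec, $SUB_SEP;   # 3: offset of the first hex'01'
my @sub_field1 = split /\x01/, $rec;     # ('red', 'green', 'blue')
@sub_field1    = split /\Q$SUB_SEP\E/, $rec;   # same result via the variable

The \Q...\E quoting is only needed if the separator might contain regex 
metacharacters; for chr 1 a plain /$SUB_SEP/ works as well.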

Take a look at the perldata manpage:
http://www.perldoc.com/perl5.6/pod/perldata.html#Scalar-value-constructors
and "Quote and Quote-like Operators" in the perlop manpage:
http://www.perldoc.com/perl5.6/pod/perlop.html#Quote-and-Quote-like-Operators

-zsdc.


--
Rich Parker
http://www.fssi-ca.com
mailto:[EMAIL PROTECTED]


Re: File sizes.

2003-08-20 Thread Rich Parker

On Tue, 19 Aug 2003 14:20:47 -0700, Rich Parker <[EMAIL PROTECTED]> wrote:

Hi,
I have been watching the thread about File::Copy. I ran into an 
issue in the Linux environment that raises a serious question: MAX file 
size. Keep in mind the server is running RH 7.0; we have 7.2 Enterprise 
Server also, and we pay for support. But even the RH support says they 
can't handle files in excess of 2GB (approx). Whether I was using TAR, GZIP, 
or most any other tool, I found that the targeted file is only 1.8GB 
instead of being a much larger file, in our case 16GB. This was on a 
"/mnt" device, not a local disk. So the COPY (TAR in this case) was from 
one "/mnt/" device to another, and it did not matter if I used TAR, COPY, 
MOVE, or a Perl program - same problem.

Everyone I talked to about this on the various "Groups" only said 
"Rebuild the kernel using 64 bit support", but this is on an Intel box 
(32 bit?). Have any of YOU seen this problem? I can't be the only person 
dealing with large files. Ideas?? How is this issue on later releases??



I am no kernel hacker, so take what I say with a grain of salt. The large file limit has to do with addressable offsets: to support files over 2 gigs you need more "bits" to produce longer offsets, which I believe is why they suggested you add 64-bit support. It's been a while since I was doing kernel builds, but I thought there was a specific switch for "large file size"; I thought that was specifically to support partitions larger than 2GB, not files themselves, but maybe they are one and the same.

Now, you mention that the file is 1.8GB: is that machine readable or human readable, i.e. is that 1 KB = 1000 bytes or 1024 bytes? If the 1.8 is the human-readable (1024-based) figure, the true size is roughly 1.9 billion bytes, which puts the file right up against the 2GB (2^31 byte) boundary.

I am not sure about copy; theoretically it should work if the file can be addressed completely. Move won't work across file system boundaries anyway, nor will a 'rename' in Perl. Again, because Perl is talking to the underlying kernel, theoretically you need large file support in the kernel first, but then you *ALSO* need it in the 'perl' (not Perl) executable. For instance, perl -V will have something near the bottom like:

Compile-time options: ... USE_LARGE_FILES ...

Though I am also not a Perl internals hacker, so I don't know exactly what all this adds, I suspect it is needed in your case if you do use a Perl script.
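
If you'd rather ask Perl than eyeball perl -V, the same information is 
exposed by the stock Config module; a short sketch:

use Config;
# an LFS-enabled perl typically reports:
print "uselargefiles = $Config{uselargefiles}\n";   # 'define'
print "lseeksize     = $Config{lseeksize}\n";       # 8 = 64-bit file offsets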

To my knowledge this has been fixed in 2.4 or newer kernels (are you running 2.2?), or it was fixed by default in the jump from RH 7.x to RH 8.0.

Maybe one of the real gurus can provide better explanation/help...

In any case you may get better help asking on a Linux kernel list...

http://danconia.org


You have a very good point. I've seen that "LARGE_FILES" thing in the 
setup; however, the people at RedHat said not to do that, but rather to 
wait for the next release of the 2.4 kernel. At that time (about 6 
months ago) 2.4 was real "Buggy" according to them. Yet the current 
"Advertised" release of RH is 9.0!! Which makes me wonder about it: the 
stuff you can pay "Support" for is way back on the release scale.

Here at work we also have a S/390 running VM, and I've been trying to get 
the "Powers that be" to allow me to use Linux and all of the things that 
go with that, gee, like PERL, but it has been a real uphill battle. If 
any of you can give me a GREAT reason to help me convince them, then I'm 
"All ears". I can see the "Bennies" of having a whole bunch of servers 
on ONE box, but it's very difficult to get them to the next step: $30K 
for TCP/IP for VM, which we would need. And then that 2GB limit hits me 
square in the face again.

To answer your question about the 1.8: YES, when I use ANY piece of 
software, or do an LS, for example, it only shows 1.8GB, while on the 
WinNT machine where the file sits, it shows 16GB. It didn't matter which 
piece of software or what "command" I was using. I don't think I would 
see this if I were using Perl in a Win32 arena, but with all of the 
troubles I had pushing huge amounts of SQL data through the cgi 
interface, I had to abandon Win32 for the more stable and less "Buggy" 
Linux, and then I ran into the 2GB limit. Looks like WE have to wait 
until the Enterprise edition gets the newer kernel, agreed? But I HATE 
waiting... Call me impatient...

Thanks...



--
Rich Parker


Re: File sizes.

2003-08-20 Thread Rich Parker
Hi,
I have been watching the thread about File::Copy. I ran into an
issue in the Linux environment that raises a serious question: MAX file
size. Keep in mind the server is running RH 7.0; we have 7.2 Enterprise
Server also, and we pay for support. But even the RH support says they
can't handle files in excess of 2GB (approx).


i believe RH 7.1 beta r1 (code name Fisher), which uses kernel 2.4.0, is the 
first RH that supports the LFS (Large File Support) extension. your server 
running 7.0 won't be able to address a > 2GB file. If you have 7.2 
Enterprise, why don't you use that instead? If you pay for support, isn't 
RH supposed to provide help / instruction on how to get your 7.0 running 
with LFS support?

if you simply want to know whether Perl is able to deal with a > 2GB file, you can:

[panda]$ perl -V | grep 'uselargefiles'

and you should see something like:

uselargefiles=define

to see if perl (the binary) is compiled to use the LFS API, use:

[panda]$ perl -V | grep 'OFFSET_BIT'

and if you see something like:

D_GNU_SOURCE -fno-strict-aliasing -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64

you are in good shape. the easiest solution to get LFS support is to upgrade 
your 7.0 imo.
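
and if you want proof rather than compile flags, one quick smoke test (a 
sketch; the path is made up, and the file it creates is sparse on most 
filesystems) is to seek past the 2GB mark and see whether the write sticks:

my $file = '/tmp/lfs_probe';               # made-up path; pick any fs to test
open my $fh, '>', $file or die "open: $!";
seek $fh, 3 * 1024 * 1024 * 1024, 0        # 3GB offset, whence 0 = SEEK_SET
    or die "seek past 2GB failed (no LFS?): $!";
print $fh "x";                             # one byte at the 3GB mark
close $fh or die "close: $!";
print "size: ", -s $file, "\n";            # expect 3221225473 with LFS
unlink $file;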

david

David,
Thanks for the observations. But the point that RH made to me (again, 
about 6 months ago) was that this issue was in 7.0-7.2, and YES, they did 
not recommend rebuilding the kernel, because at THAT time 2.4 was 
"Buggy" in their opinion. I am not so worried about Perl being able to 
READ/WRITE or whatever for large files; the O/S has to be able to do 
this first, correct? That's my main point, and that's the obstacle I've 
run into on this. It seems to me it is time to revisit this with the 
RH folks to see what THEY say about it, then go through the pain of 
upgrading a server with a ton of perl code on it; of course, everything 
must be TESTED to make 100% sure I haven't dropped anything through the 
cracks.

Thankx...





Using hex.

2003-08-20 Thread Rich Parker
Howdy,
I use many different languages here at work, from Assembly and Rexx to 
other mainframe languages. One thing I find difficult to do with Perl 
is the handling of hex characters. This may seem like a very generalized 
question; forgive me for trying to find a "Better way to do it". Picture 
having a flat file, or an SQL table (doesn't matter that much), where I 
want to delimit a field within a field, like having options. I can't use 
the TAB (\t) character, because when I use the "foreach" over the split 
fields, the sub-fields also get "Split", therefore losing what I am 
attempting to do. So I wanted to use a hex character, let's say something 
simple, hex'01' for example. What do you guys use for saying something like:

$loc = index($rec, $HexValue);  OR
@sub_field1 = split(/$HexValue/, $rec);
Where the $HexValue is either a variable that contains my hex'01' or the 
absolute value itself.

I've used 'sprintf' with limited success, but in other languages this is 
very simple. I'm looking for some different ways to do this; after all, 
that's what Perl is all about, right? TIMTOWTDI...
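
To make the layout concrete, here is a hedged sketch of the two-level 
record described above, with TAB between top-level fields and hex'01' 
between the sub-field options (the data is invented):

my $rec = "Parker\tRich\t" . join("\x01", 'opt1', 'opt2', 'opt3');

foreach my $field (split /\t/, $rec) {
    if (index($field, "\x01") >= 0) {
        my @options = split /\x01/, $field;   # sub-fields survive the outer split
        print "options: @options\n";
    }
    else {
        print "field: $field\n";
    }
}

Because hex'01' never appears between top-level fields, the outer split 
on \t leaves the packed options intact until you choose to split them.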

What are some of the things you guys use??

Thanks, ahead of time.



Re: Executable perl program help!!

2003-08-20 Thread Rich Parker


[EMAIL PROTECTED] wrote:

Also try

http://www.indigostar.com/



Laurent coudeur

Quick question about this; it sounds great. But if I were running my Perl 
from an HTTP call (normal browser usage), what would the "exec cgi" call 
"Look like", syntax-wise??

Thanks.



Bob Showalter <[EMAIL PROTECTED]>
20/08/2003 16:20
To: "'Rich Parker'" <[EMAIL PROTECTED]>, [EMAIL PROTECTED]
cc:
Subject: RE: Executable perl program help!!

Rich Parker wrote:

I was at Active State the other day, they have one that can be
purchased. I have seen a few others when I did a similar search as
mentioned. I haven't seen one for free or one that has a demo for it,
I'd love to try one, if anyone sees one, let everyone know about it.


You can download a demo version of ActiveState's Perl Dev Kit:

<http://www.activestate.com/Products/Download/Register.plex?id=PerlDevKit&a=e>




Re: Executable perl program help!!

2003-08-20 Thread Rich Parker
I was at ActiveState the other day; they have one that can be 
purchased. I have seen a few others when I did a similar search, as 
mentioned. I haven't seen one for free, or one that has a demo for it; 
I'd love to try one. If anyone sees one, let everyone know about it.

Thanks.

Ramprasad A Padmanabhan wrote:

[EMAIL PROTECTED] wrote:

How can i generate such a file?


Million dollar question, no perfect answer.
Depends on what OS you are using.
Do a Google search on perl2exe.

For starters, try O.pm to convert your Perl code to C and then 
compile the C code.

On Linux you can compile using:
export LDOPTS=`perl -MExtUtils::Embed -e ldopts`   # linker flags for embedding perl
export CCOPTS=`perl -MExtUtils::Embed -e ccopts`   # matching compiler flags
perl -MO=C script.pl > script.c                    # the B::C backend emits C source
gcc $CCOPTS script.c -o script $LDOPTS
This works only on some scripts; O.pm is still evolving.

Ram


--
Rich Parker
http://www.fssi-ca.com
mailto:[EMAIL PROTECTED]


File sizes.

2003-08-19 Thread Rich Parker
Hi,
I have been watching the thread about File::Copy. I ran into an 
issue in the Linux environment that raises a serious question: MAX file 
size. Keep in mind the server is running RH 7.0; we have 7.2 Enterprise 
Server also, and we pay for support. But even the RH support says they 
can't handle files in excess of 2GB (approx). Whether I was using TAR, GZIP, 
or most any other tool, I found that the targeted file is only 1.8GB 
instead of being a much larger file, in our case 16GB. This was on a 
"/mnt" device, not a local disk. So the COPY (TAR in this case) was from 
one "/mnt/" device to another, and it did not matter if I used TAR, COPY, 
MOVE, or a Perl program - same problem.

Everyone I talked to about this on the various "Groups" only said 
"Rebuild the kernel using 64 bit support", but this is on an Intel box 
(32 bit?). Have any of YOU seen this problem? I can't be the only person 
dealing with large files. Ideas?? How is this issue on later releases??

Thanks.
--
Rich Parker




Re: New to list, question.

2003-08-18 Thread Rich Parker
Very interesting reading, thanks.

Actually, it "Could be" great reading to "Sleep by". But I have a 
concern about all of this, be it an MS "Controlled" .NET or even an 
"OPEN source" Mono: who is going to "control" where all of this resides? 
Let's take the MS idea of .NET for a moment. Remember WHEN the Justice 
dept tried to "Go after" MS because they had too much control of things? 
Regardless of my personal opinions on this (Bill who?), doesn't this 
whole idea give Mr. Gates more of an opportunity, so that IF anyone tries 
"That" again, HE could "Pull the plug" and everyone who is using 
.NET now can't make their programs run? Isn't that a REAL possibility?

Or let's look at it from the OPEN Source side: if ALL of this "common 
code" (for lack of a better word right now) was at one or a "Handful" of 
sites, and let's just say there was a "Power outage" (remember the great 
power outage of 2003?), then what? The rest of us are still working, but 
WE can't access the needed files to make our Internet or Intranet sites 
work, so now we start having a cascade failure of the NET (no DOT here, 
everyone).

From a company perspective, the concept of .NET is wonderful, 
as long as "Our source code" is just that, "OURS". But for the poor guy 
who suddenly NOW has to pay MS (back to .NET again) a monthly fee just 
to OPEN a Word doc, I don't see the public going for that. Do I have 
this wrong? Don't tell me Mr. Gates is going to keep the basic modules 
for MS-Word free for everyone?? Yeah, right. I'm just concerned, that's 
all. I love the idea of "Every developer" getting his "Fair share" for 
the bits & pieces he creates, and YES, we are NOT reusing our code 
enough, but is THIS the correct solution??

Anyone got any more links for me? I noticed that when I did that search 
on "de icaza .net" (without the quotes, duh?) the TOP article was on 
O'Reilly. I like their books, don't get me wrong, but will they publish 
someone else's "Point of view"? Even though de Icaza seems like a "Sharp 
cookie", admittedly this is the first time I've heard of him, and I'd 
like to hear what others have to say.

I've been developing Perl based web sites for a long time; what I've 
always liked about it was/is the fact that it actually works and it's 
included with O/S's like Linux (OK, use the word - FREE), and I just 
feel that if Bill gets His way, we'll all lose out on the word FREE. Now, 
if this idea was like a "Newsgroup", where the various news servers get 
reloaded from time to time and the code isn't just in ONE spot, I might 
be willing to really support it, but the more I learn about this, the 
more questions I have. Somebody HELP me understand why this is the NEXT 
thing in software. I just don't get it.

Thanks.

McMahon, Christopher x66156 wrote:

	I'll take this a step further.  Search Google for "de icaza .net".
Miguel de Icaza produced the GNOME desktop and founded Ximian/Mono.  He's
quoted all over the place discussing what .NET has to offer beyond the MS
arena, for Open Source and wide interoperability.   I think that de Icaza
explains .NET better than Microsoft has... at least from a developer's point
of view.
	ActiveState has a Perl environment "Visual Perl" that seems to be
capable of interacting in a .NET sort of way with other .NET languages,
which is pretty cool. 
	.NET is worth investigating if for no other reason than that it
seems to be the MS answer to Sun's Java *and* IBM's WebSphere.   And it has
a lot of room to grow.  
	I'm interested because I just got a job with an all-Windows shop
migrating apps from C++ to .NET.  Except for Windows workstations and a
little bit of MS ODBC, all my professional experience has been on Tandem,
Solaris, and AIX.  I hope .NET is as cool as de Icaza says it is, because
Windows kinda creeps me out.  It's just very weird having all of that
abstraction/secret code between me and what the OS is doing.  
	-Chris 

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
Sent: Monday, August 18, 2003 1:45 PM
To: Rich Parker; [EMAIL PROTECTED]
Subject: RE: New to list, question.



On Mon, 18 Aug 2003 12:40:02 -0700, Rich Parker <[EMAIL PROTECTED]>
wrote:

Hi,
I've been involved with Perl for many years (too many to count right 
now). One question has been bugging me lately, and maybe it's because I 
don't know enough about it, or maybe it's because I've been "Not the 
biggest fan" of Mr. Bill and his org... Either way, I'm attempting to 
learn about .NET (Don't panic here, I'm just looking for some information).

New to list, question.

2003-08-18 Thread Rich Parker
Hi,
I've been involved with Perl for many years (too many to count right 
now). One question has been bugging me lately, and maybe it's because I 
don't know enough about it, or maybe it's because I've been "Not the 
biggest fan" of Mr. Bill and his org... Either way, I'm attempting to 
learn about .NET (Don't panic here, I'm just looking for some information).

Mainly, how does, or can, the .NET framework be of benefit to the Perl 
developer at large? Am I going to get any "Bang for my buck" using Perl 
in the .NET environment? But WAIT, I'm working on Linux systems; how 
does that affect what I am asking?? Could someone provide me a link 
to some articles about .NET from many different perspectives, preferably 
NOT by MS (if you catch my drift here)? I want to see if this framework 
is going to be something of use to me.

Thanks for ANY help you can provide.
--
Rich Parker

