Re: GnuPG::Interface module on OS X

2006-09-24 Thread Wiggins d'Anconia
Dennis Putnam wrote:
 Although I don't think this is an OS X specific issue, I can't find any
 place to seek help (there seems to be a GnuPG list but it is defunct or
 inactive). If someone knows of a better resource please let me know.
 
 I have installed GnuPG on a Tiger (10.4.7) server and it seems to be
 working fine. I then installed GnuPG::Interface in perl and wrote a
 script that tries to decrypt a file. Everything seems to be working
 fine and the file gets decrypted. My problem occurs when I try to run
 the script in background (cron or nohup). I get an error pointing to
 the line that calls the 'decrypt' method. It says "fh" is not defined.
 I don't have a variable by that name, so I don't have a clue what it is
 referring to other than it must be in the decrypt method somewhere. I
 tried setting $gnupg->options->batch(1); but that did not help. Can
 someone help me figure out what is wrong? Thanks.
 

Can you show some code?  Note that cron generally runs in a different
environment and may not be detecting the proper home directory which
would likely cause gpg to have issues.
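For what it's worth, here is a minimal sketch (the account name and paths
are hypothetical, adjust to your setup) of forcing the environment and
keyring location before calling decrypt, which is the usual fix for
cron/nohup runs:

  #!/usr/bin/perl
  use strict;
  use warnings;
  use GnuPG::Interface;

  # Under cron, $HOME may be wrong or unset, so point gpg at the
  # keyring explicitly rather than relying on the inherited environment.
  $ENV{HOME} = '/Users/youruser';    # hypothetical account

  my $gnupg = GnuPG::Interface->new();
  $gnupg->options->hash_init(
      homedir => "$ENV{HOME}/.gnupg",   # explicit keyring location
      batch   => 1,                     # no interactive prompts
  );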

http://danconia.org


Re: question on Perl's ssh

2005-06-24 Thread Wiggins d'Anconia
Ted Zeng wrote:
 Hi,
 
 Why does the system behave differently when I ssh to a machine
 from Terminal than when I ssh to the same machine by Perl's SSH module?

 Here is the problem:
 I added a tool to /usr/local/bin. I updated the profile file.
 Now if I ssh to the machine, I could use "which tool" to find it.
 
 But when I try to do the same in Perl, the reply is
 "no tool in /usr/bin /bin /usr/sbin /sbin"
 
 ted zeng
 Adobe Systems
 
 

The difficult part is that the answer is really "just because" :-). When
you use the 'ssh' from the terminal you are using the local ssh client.
That client establishes a connection to a remote ssh server and tells
that server that it wants to run a command, that command is to init a
shell and then interact.  So when you send your 'which' command you are
interacting with the remote shell over the ssh tunnel. But when you
use Net::SSH::Perl (which I am assuming is the module you are using) you
are establishing a connection to a remote SSH session, but the
command(s) you send are being exec'd directly (presumably by /bin/sh)
which may or may not have loaded the same environment as the normal user
shell (for instance, skipping any .bashrc/.bash_profile, etc.). I
believe (though haven't tested) that the same would occur if you
provided a command to the local ssh client instead of requesting an
interactive shell.
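A minimal sketch of the workaround, assuming Net::SSH::Perl and a
hypothetical host/user, is to source the profile yourself (or call the
tool by its full path) since the exec'd command won't have read it:

  use Net::SSH::Perl;

  my $ssh = Net::SSH::Perl->new('remote.example.com');
  $ssh->login('user', 'password');

  # The remote command runs non-interactively, so load the profile first,
  # or skip 'which' entirely and use the tool's full path.
  my ($out, $err, $exit) = $ssh->cmd('. ~/.profile; which tool');
  print $out if defined $out;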

Net::SSH provides a wrapper around the local 'ssh' command but I have
not used it. I tested it once quite a while ago and preferred
Net::SSH::Perl *for my purposes*.

HTH,

http://danconia.org


Re: CamelBones on Intel? Maybe not.

2005-06-07 Thread Wiggins d'Anconia
Ian Ragsdale wrote:
 On Jun 7, 2005, at 11:51 AM, Joseph Alotta wrote:
 
 I used to be a NeXT developer.  This announcement is very reminiscent
 of the NeXT announcement to stop making those little black boxes and
 bring the NeXT OS to Intel chips.  We had just bought a ton of hardware
 and they demoed this clunky 386 PC.  First of all, it looked nasty.  We
 were used to that elegant design. Secondly, it kept crashing.  It
 destroyed the culture.  It was like putting Haydn into the juke box
 at a disco.  Everyone went home. The vice president of our division,
 who bet his career on NeXT, resigned and NeXT languished for years.

 It is the same scenario playing out again.  Will Steve Jobs never learn?
 
 
 Did NeXT produce their own boxes, or did they allow installs on any PC
 with supported hardware?  I believe that is a key difference.  Apple
 boxes will be exactly the same as they would have been, except they
 will have a different CPU.  You still won't be able to install OS X on
 a commodity PC without jumping through a lot of hoops.
 

Why wouldn't you? Memory, drives, video, etc. are all the same right
now. The motherboard has pretty standard features, other than being set up
for a Power processor. Apple has been going cheap for a while, SCSI ->
IDE ring any bells? It would be a real shame if they didn't allow you to
install OS X on any commodity PC, once again back to that whole volume
issue. Without a different chip, Macs really are just a pretty looking
box with a nice software package preinstalled. Darwin runs on Intel
already (mostly) which is the real key; if Apple goes through with this
and won't let you install on a commodity PC then they really missed the
boat, in fact I would say they couldn't even find the dock.

 I think the only way to look at it is that if IBM couldn't or 
 wouldn't deliver the processors Apple needed at a reasonable price, 
 what else could Apple do?
 

Will definitely agree with you there. Though you have to love the media
spin making it seem like this is Apple's choice to drop IBM, uh huh.

 Ian
 

I like Macs as much as the next person, but if they are going to go the
Intel route, they might as well go the whole way. In fact, being able to
install on a normal Dell would be one way for them to win back some
huge user spaces; lots of companies would love to get out from the M$
licensing structure, but just aren't willing to fork out that much cash
for all new hardware when they shouldn't need to, aka just to run
another Intel-based OS, and admittedly Linux is much harder to learn (or
at least seems it). Not to mention that theoretically (ask your lawyer,
anyone know for sure?) they should be able to transfer over their
Adobe/Office licenses, which would run natively.

http://danconia.org


Re: CamelBones on Intel? Maybe not.

2005-06-07 Thread Wiggins d'Anconia
Brian McKee wrote:
 
 On 7-Jun-05, at 1:57 PM, Wiggins d'Anconia wrote:
 

 Why wouldn't you? Memory, drives, video, etc. are all the same right
 now. The motherboard has pretty standard features, other than being set up
 for a Power processor. Apple has been going cheap for a while, SCSI ->
 IDE ring any bells? It would be a real shame if they didn't allow you to
 install OS X on any commodity PC, once again back to that whole volume
 issue. Without a different chip, Macs really are just a pretty looking
 box with a nice software package preinstalled. Darwin runs on Intel
 already (mostly) which is the real key; if Apple goes through with this
 and won't let you install on a commodity PC then they really missed the
 boat, in fact I would say they couldn't even find the dock.
 
 
 Quoting cnet:
 http://news.com.com/Apple+throws+the+switch%2C+aligns+with+Intel+-+page+2/2100-7341_3-5733756-2.html?tag=st.next
 
 After Jobs' presentation, Apple Senior Vice President Phil Schiller
 addressed the issue of running Windows on Macs,
 saying there are no plans to sell or support Windows on an
 Intel-based Mac.
 "That doesn't preclude someone from running it on a Mac. They
 probably will," he said. "We won't do anything to preclude that."
 However, Schiller said the company does not plan to let people run
 Mac OS X on other computer makers' hardware.
 "We will not allow running Mac OS X on anything other than an Apple
 Mac," he said.
 
 
 Shades of Sony...
 
 

Bon Voyage! ;-) (Thanks for the quote though.) We will see...
iTunes/iPod for windows anyone? How long ago was it that they said they
weren't moving to Intel? The market has a funny way of dictating what a
company will and won't do, no matter how pouty the President.

Make me a believer...

http://danconia.org



Re: CamelBones on Intel? Maybe not.

2005-06-06 Thread Wiggins d'Anconia
Ian Ragsdale wrote:
 On Jun 6, 2005, at 5:18 PM, Joel Rees wrote:
 
 Jobs is insane.

 
 I'm not so sure about that.  IBM seems unwilling or unable to produce 
 mobile G5s, which is a market that Apple considers very important.  
 They also are 2 years behind schedule on 3.0Ghz G5s, and appear to be 
 focusing on video game processors instead of desktop and mobile 
 processors.
 
 Apple might be OK in a speed comparison right now (on desktops, they 
 are clearly losing in laptop comparisons), but how about in two  years? 
 Perhaps IBM has told Apple that they won't attempt a laptop  chip, since
 the volume is way higher for video game consoles?  What  should Apple do?


They should have released Mac OS X for Intel as soon as they had it
ready. Why wait? It seems Apple is too caught up in their own keynotes
to understand volume sales. One thing M$ was definitely *always* better
at. IBM will probably laugh this one to the bank, not exactly going to
put a dent in that $99 billion in revenue...

 Personally, it looks like it will be a bit painful for a few years,  but
 a far better move in the long run.
 

Unless they become just another cheap clone maker with a pretty software
interface. (Did I hear someone say Sun?)

 Ian
 

http://danconia.org


Re: Installing WebService::GoogleHack

2005-05-17 Thread Wiggins d'Anconia
Lola Lee wrote:
 Morbus Iff wrote:
 
 
  /Library/WebServer/Documents/GoogleSearch.wdsl

 
 When I ran this again, it died with this message:
 
 Illegal WSDL File Location - /Library/WebServer/Documents/GoogleSearch.wdsl


The test is trying to open the file just to test for existence,
readability, etc. (not sure why Perl's file test ops couldn't be used
rather than opening and then closing, yikes), but you might want to hack
the file t/1.t and add $! to the error message to see why it is failing.
It could be any number of reasons; I assume the file is readable by your
user, etc., but $! will tell us why it is failing.
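Hypothetically, the kind of change I mean would look like this (the
variable name is made up; match whatever the test actually uses):

  open(WSDL, $wsdl_path)
      or die "Illegal WSDL File Location - $wsdl_path: $!";
  close(WSDL);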

http://danconia.org

 
 
 Next time, I left off GoogleSearch.wdsl and it died again, I got this:
 
 Can't locate object method "new" via package "WebService::GoogleHack" at
 t/1.t line 85, <STDIN> line 5.
 # Looks like you planned 2 tests but only ran 1.
 # Looks like your test died just after 1.
 t/1............dubious
 Test returned status 255 (wstat 65280, 0xff00)
 DIED. FAILED tests 1-2
 Failed 2/2 tests, 0.00% okay
 Failed Test  Stat Wstat Total Fail  Failed  List of Failed
 -----------------------------------------------------------------------
 t/1.t         255 65280     2    3 150.00%  1-2
 Failed 1/1 test scripts, 0.00% okay. 2/2 subtests failed, 0.00% okay.
 make[1]: *** [test_dynamic] Error 2
 make: *** [test] Error 2
   /usr/bin/make test -- NOT OK
 Running make install
   make test had returned bad status, won't install without force
 
 
 I do have the file in the Documents folder.
 


Re: sendmail question

2005-03-14 Thread Wiggins d'Anconia
Matt Doughty wrote:
On Wed, Mar 09, 2005 at 09:42:00AM -0800, Ted Zeng wrote:
Hi,
When I used perl on Windows, I used the Mail::Sender module to send emails,
with attachments.
Now I realized that Mac OS X doesn't have this installed (I installed 
it on Windows myself)
and it has sendmail as a UNIX tool, which can be an option.

My question is:
Do you use sendmail to send emails in your perl tools?
Or use a Perl email module to send emails? Which way you prefer?

So you have heard one position on this subject. I'll give you the other.  
Using the command line sendmail client gives you queueing if the SMTP 
server you are talking to is down, or temporarily unreachable. I'm not 
certain if there is a module out there that will use the sendmail command 
line client directly, but this is definitely the way to go if you don't 
want to lose mail, and you don't want to worry about queueing yourself.

--Matt

Ok, so backing up a step. The key here is that there are two steps to 
the process.

1. Build the message (probably in memory)
2. Send the message
For #1 you absolutely want to use a module. Period. There are far, far, 
far too many intricacies of the message format protocol to try to do 
this by hand. And you *don't* want to rely on the log messages of any 
smtp client/server to find problems in your message building. And the 
minute you get into including attachments, you have really screwed 
yourself. Take a look at the documentation and structure of Mail::Box if 
you think e-mail done right is easy. The module may be overkill for a lot 
of applications, but it is about as thorough as you can get. There are 
many modules that will build a correct message.

For #2 it matters less. Because #1 and #2 are usually intertwined, most 
modules that provide #1 will provide #2 too. And because talking to 
sendmail at the command line can get very hairy very quickly you are 
still better off letting a module do it. The interface has been designed 
for ease of use, the module has (hopefully) been tested, possibly on 
many platforms, and most provide an option to set which local mail 
client/server you wish to talk to. So most can handle #2 using postfix, 
sendmail, etc. Net::SMTP is probably one of the ones that can't, but 
then you wouldn't want to build a message with it anyways.
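As a hedged example (addresses and file are hypothetical), MIME::Lite is
one of the many modules that handles both steps, including attachments,
and can hand the result to the local sendmail so you keep its queueing:

  use MIME::Lite;

  my $msg = MIME::Lite->new(
      From    => 'me@example.com',
      To      => 'you@example.com',
      Subject => 'Report attached',
      Type    => 'multipart/mixed',
  );
  $msg->attach(Type => 'TEXT', Data => "See the attached report.\n");
  $msg->attach(
      Type     => 'application/zip',
      Path     => '/tmp/report.zip',
      Filename => 'report.zip',
  );
  $msg->send('sendmail');   # step #2 via the command line client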

I spent 2 years working on an application that 90% of the time was 
dealing with mail, inbound or out; you need to be an absolute expert in 
mail handling (which I am not by a long stretch!) to do so directly.

http://danconia.org


Re: First CGI Setup

2005-03-11 Thread Wiggins d'Anconia
Chris Devers wrote:
On Sat, 12 Mar 2005, Joel Rees wrote:

(One of these days I'm going to get version control running to my 
liking, and I'll keep everything under /etc in version control. For 
now, I just make a copy to work on and rename the old one *_nnn.bak or 
something, keeping track of the editing sequence in the _nnn portion.)

Try this:
$ cat ~/bin/stamp
#!/bin/bash
#
# stamp is a utility which makes a backup of a conf file
[ $# -ne 1 ] && echo "usage: `basename $0` filename" && exit 100
old=$1
new=$1.`date +%Y%m%d`.$$
[ ! -f "$old" ] && echo "$old does not exist" && exit 100
cp "$old" "$new"
status=$?
[ -x "$new" ] && chmod -x "$new"
exit $status
$
It's crude, but it works well enough.
$ cd /etc/httpd
$ sudo stamp httpd.conf
  # I get a file like httpd.conf.20050311.15629.
$ vim httpd.conf && apachectl configtest
  # I make a royal mess of things. Damn.
$ cp httpd.conf.20050311.15629 httpd.conf
$ apachectl configtest
  # All is right with the world again.
Something like CVS / SVN / BitKeeper would be better, but not easier.
 


Nice. You could also just opt for RCS until you can get a more managed 
solution up; it works well enough for this type of thing and requires 
only (or not even) that an RCS directory be created, and that you follow 
the standard procedures:

 co -l filename
... make edits
 ci -u -M filename
Pretty simple stuff. It appears rcs comes with Mac OS X, or at least 
when the dev tools are installed.

 man rcs
For more info.  The others are excellent, but they can be overkill for 
simple, non-distributed, configuration files.

http://danconia.org


Re: sendmail question

2005-03-09 Thread Wiggins d'Anconia
Ted Zeng wrote:
Hi,
When I used perl on Windows, I used the Mail::Sender module to send emails,
with attachments.
Now I realized that Mac OS X doesn't have this installed (I installed it 
on Windows myself)
and it has sendmail as a UNIX tool, which can be an option.

My question is:
Do you use sendmail to send emails in your perl tools?
Or use a Perl email module to send emails? Which way you prefer?
ted zeng
Adobe Systems

Use a module. Several of them can send via sendmail, but regardless let 
them do the message construction, it will save you many headaches in the 
end.  There are a number of them on CPAN, each with its own features and 
peculiarities.

http://danconia.org


Re: What Perl editor do you recommend?

2005-03-02 Thread Wiggins d'Anconia
John Delacour wrote:
At 9:45 pm +0000 2/3/05, Phil Dobbin wrote:
I'm thinking that if he's not comfortable with pico maybe emacs is not 
the best idea...

I'd love to hear a convincing explanation from someone why anyone would 
use such tools in preference to TextWrangler, BBEdit or Affrus. I can 
imagine they'd make it a chore to write code in us-ascii and either a 
nightmare or an impossibility to deal with non-ascii, but maybe that's 
because I'm just an unreformed Mac user :-)

JD

They aren't free (well, BBEdit and Affrus aren't), they aren't cross-platform 
(why learn a different editor for each platform?), and they require lots 
of clicky.

I have never logged into a system where I couldn't use vi. (well maybe a 
windows box, but it didn't take long to install gvim or cygwin.)

http://danconia.org


Re: TextWrangler

2005-01-22 Thread Wiggins d'Anconia
Joel Rees wrote:
While we're playing around with Editor Wars...
there's no need for that sort of language...
Boy, there's nothing like a good old-fashioned editor war!
But this one doesn't seem to have much punch to it. More like a dust 
devil than a cyclone.

Vim.
http://danconia.org


Re: Net::SSH::Perl

2004-09-15 Thread Wiggins d Anconia
 I can't get Math::Pari (a dependency for Net::SSH::Perl) to build 
 on OS X 10.3.5.  Suggestions?
 
 Thanks!
 
 

I had trouble getting it to build on Solaris 8. I was successful in
getting a downgraded version installed and it has worked (almost
flawlessly).

Math-Pari-2.010305  is the version I used.

The one flaw that I know of happens when I use Net::SFTP (wraps
Net::SSH::Perl) and try to leave the connection up constantly, making
lots of directory listings every minute or so (aka 50+ directories
listed every minute), after about 2.5 hours Math::Pari throws a memory
error and crashes the whole program.

I haven't tried upgrading recently (install was done in 05/2003).  My
version of Pari is 2.1.5, and in trying to track it down I noticed this
page, which you might try:

http://www.boksa.de/tutorials/pari_macosx.mpp

HTH,

http://danconia.org


Re: n00b needs help with file arguments

2004-06-02 Thread Wiggins d Anconia
 I think you are missing the point.  If a beginner knew the *right*
 place to find information, he would not be a beginner.
 

It takes more than knowing where to look to not be a beginner. I
generally know where to look and compared to many on this list and
others I am definitely very much a beginner.  

 Meditate on this and think about the time when you were a beginner
 and how someone helped you.
 

Right, I have, and I did the same as they did for me: pointed to the correct
location. That is why I said that the beginners list was a better place
and stated why; I didn't just say "get the hell out of here, you newbie."

 Meditate on how the world would be if people in other professions did
 the same thing you do.  For example, you go into a doctor's office and
 he gives you a lecture on how you need to find the appropriate medical
 resource and tells you to go to the library and look it up.
 

Actually when I go into a doctor's office I very often do get referred
to a different location, specifically to the correct specific resource
for what I am after; just because it doesn't take the form of a library
doesn't mean it isn't the same thing.  I seriously doubt you ask a
cardiologist about back pain?

http://danconia.org

 Joe.
 
 
 On Jun 1, 2004, at 12:02 PM, Wiggins d Anconia wrote:
 
  I heartily agree.  People should be helpful.  In the old days of the
  internet
  this was more common.  Being kind is what separates a prince from a
  rogue.
 
  Besides the documentation is obscure and lacking and not easy to know
  where it is found.
 
 
  True, however this is a list for Mac OS X related questions, not newbie
  Perl questions. The [EMAIL PROTECTED] is a much better choice, and 
  much
  friendlier when it comes to such questions, as that is its focus.  RTFM
  is a good answer on a topic specific list where it is assumed one
  already knows how to access the M. In the old days of the internet it
  was also more common for one to put in the required amount of effort to
  find the most proper place to ask their question, which is less and 
  less
  common.
 
  "A list for beginning Perl programmers to ask questions in a friendly
  atmosphere." - beginners
 
  "Discussion of Perl on Macintosh OS X" - macosx
 
  http://lists.perl.org
 
  Time to throw this out again:
  http://www.catb.org/~esr/faqs/smart-questions.html
 
  http://danconia.org
 
 
 




Re: n00b needs help with file arguments

2004-06-01 Thread Wiggins d Anconia
  ** This does not apply to you Sherm **
 
  I asked a question about quoting an argument prior to removing a 
  directory
  to learn something about how the language handles things. I don't know 
  if
  the rmdir method is recursive and/or forced in Perl.. Telling me to 
  try and
  see is a BS answer to give. Of course I could create a directory and 
  try,
  but what is the need for a community of experts if I can't ask a 
  simple
  question?
 
  Also... Read perldoc this and that... Most people new to a language 
  will
  have no clue what you are talking about. Keep this in mind for the 
  future.
  Do you tell a person who can't drive standard ... "Hey, it's a car. Just
  drive it!"? Of course not. If you don't have the time to provide a 
  proper
  answer, or some code to help explain things, please don't waste your 
  time
  replying.
 
 
 I heartily agree.  People should be helpful.  In the old days of the 
 internet
 this was more common.  Being kind is what separates a prince from a 
 rogue.
 
 Besides the documentation is obscure and lacking and not easy to know
 where it is found.
 

True, however this is a list for Mac OS X related questions, not newbie
Perl questions. The [EMAIL PROTECTED] is a much better choice, and much
friendlier when it comes to such questions, as that is its focus.  RTFM
is a good answer on a topic specific list where it is assumed one
already knows how to access the M. In the old days of the internet it
was also more common for one to put in the required amount of effort to
find the most proper place to ask their question, which is less and less
common.

"A list for beginning Perl programmers to ask questions in a friendly
atmosphere." - beginners

"Discussion of Perl on Macintosh OS X" - macosx

http://lists.perl.org

Time to throw this out again:
http://www.catb.org/~esr/faqs/smart-questions.html

http://danconia.org



Re: Copying files

2004-04-28 Thread Wiggins d Anconia
 
 On Apr 28, 2004, at 11:48 AM, Mark Wheeler wrote:
 
  Hi Ken,
 
  I switched that because it was suggested that writing files over 
  system boundaries might be a problem. What is the difference between 
  the two? It sounds like, from your comment, that they do very 
  different things.
 
 Yeah - rename() moves the file, and copy() makes a copy of it.
 

If we are going to get technical we might as well go further: rename()
changes link information in the inode, which is why it doesn't work
across filesystem boundaries; copy() copies a file; and then there is
move(), which moves a file, doing a rename if possible and, failing that,
a copy and unlink, etc.

Which is why I suggested File::Copy: it provides copy() and move(), and
assuming the docs got read, Mark could decide between his subject
line "Copying files" and his code "rename ...".

http://danconia.org



Re: Copying files

2004-04-27 Thread Wiggins d Anconia
 Hi,
 
 Thanks all for the help on the mail question a few days back. That's 
 fixed. Now I've run into another problem. I'm trying to copy a file on 
 a local network (off a PC) to my Mac. But when the script is called 
 from within cron, it seems that the script doesn't run. The cron looks 
 like this:
 
 * * * * * /Users//Library/Scripts/backup.pl
 
 The script is as follows:
 
 #!/usr/bin/perl -w
 
 use strict;
 
 my @files = ("db1.txt", "db2.txt", "db3.txt", "db4.txt");
 
 foreach (@files) {
 rename "/path/to/pc/file/$_", "/Users//Documents/$_" . ".bak";

You should always check that 'rename' succeeded (as apparently it isn't).

rename $oldfile, $newfile or warn "Can't rename file: $!";

$! will then provide you more information as to why it's not working.

 }
 
 exit;
 
 And finally, should I even use perl to do this? I'm comfortable with 
 the little perl I know, but should I use some sort of bash file -- I've 
 never messed with bash before, but maybe now is a good time to learn. I 
 don't even know if I am referencing bash correctly here.
 

Using Perl is fine, but you want something other than 'rename'.  I
suspect this is failing because you are attempting to move a file across
a filesystem boundary, from perldoc -f rename:

"For example, it will usually not work across file system boundaries,
even though the system mv command sometimes compensates for this."

I would suggest File::Copy instead, though make sure to check the docs
and test; I don't have my laptop to check for issues on OS X:

perldoc File::Copy
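Untested on OS X (per the caveat above), but the loop from your script
would look something like this sketch (paths kept from your post):

  use File::Copy qw(move);

  foreach (@files) {
      # move() does a rename when it can, and falls back to a
      # copy-and-unlink across filesystem boundaries.
      move("/path/to/pc/file/$_", "/Users//Documents/$_.bak")
          or warn "Can't move $_: $!";
  }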

HTH,

http://danconia.org


Re: Simple perl script send email

2004-04-25 Thread Wiggins d'Anconia
Mark Wheeler wrote:

Hi,

I just installed 10.3 and am trying to get a cron job to fire off a perl 
script which will send an email saying the cron job was completed.

crontab listing

* * * * * /Users/blah/Library/Scripts/test.pl

Here is the script:

test.pl

#!/usr/bin/perl -w
use strict;

my $from = '[EMAIL PROTECTED]';
my $to = '[EMAIL PROTECTED]';
my $body = "Success.";
open (SENDMAIL, "| mail -t");
Check that open succeeded, and use a full path to 'mail', especially 
since under cron your PATH may be different/restricted.

open (SENDMAIL, "| /usr/bin/mail -t") or die "Can't pipe to sendmail: $!";

Having said that, I would suggest not using mail directly at all, 
instead install a mail handling Perl mod from CPAN, there are lots of them.

print SENDMAIL "Subject: Backup Email Test\n";
print SENDMAIL "From: $from\n";
print SENDMAIL "To: $to\n\n";
print SENDMAIL $body;
close (SENDMAIL);
exit;
--
I have enabled Postfix to be running and have sent and received an email 
from the command line. I've also executed the script run from the 
command line. But the script doesn't seem to be sending an email. Do I 
need to get perl set to run in a settings file? I thought that I only 
needed to mess with settings files if I was going to use the web server. 
A little help would be appreciated.

HTH,

http://danconia.org


Re: Simple perl script send email

2004-04-25 Thread Wiggins d'Anconia
Mark Wheeler wrote:

Thanks. I'll give it a try. That makes sense. When you are talking about 
a mail handling Perl mod, you are talking about Net::SMTP or something 
like that, right? Also, why would you not want to use mail directly?

Mail is an incredibly complex thing; combine that with trying to handle 
IPC issues when shelling out, and you are reinventing a wheel that 
should definitely not be re-invented.  Net::SMTP is an example, though 
probably a more difficult one; there are lots:

http://search.cpan.org/modlist/Mail_and_Usenet_News/Mail

Mail::Mailer
Mail::Sender
MIME::Lite

are some good choices. I use Mail::Box, but generally it is way overkill; 
since I know I will have it installed, I usually default to it.

Generally using a module will be less error prone, easier to maintain, 
and more portable.
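For instance, a minimal Mail::Mailer sketch (addresses hypothetical) that
replaces the hand-rolled pipe from the original script:

  use Mail::Mailer;

  my $mailer = Mail::Mailer->new('sendmail');
  $mailer->open({
      From    => 'cron@example.com',
      To      => 'you@example.com',
      Subject => 'Backup Email Test',
  });
  print $mailer "Success.\n";   # the object behaves like a filehandle
  $mailer->close or warn "couldn't send: $!";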

http://danconia.org




Re: Archive::Zip

2004-03-18 Thread Wiggins d Anconia


 Hi all -
 
 I just put together an application using Archive::Zip that intercepts 
 emails sent to an address on my MacOS X 10.2.8 system. It works just 
 fine when the files are sent from another Mac using the built-in 
 archive command. However, my client is using WinZip from his system and 
 it's going through an MS Exchange server en route to my Mac server and 
 an error occurs when the email is processed. Is there something 
 particular to WinZip vs. Mac's archive command that is causing the 
 error below? Is there a better way of using Archive::Zip than the way I 
 have it in the code below? I'm digging through the Archive::Zip docs, 
 but nothing is jumping out at me.

The error you are getting isn't related to Archive::Zip; it appears to be
related to the message parsing, MIME::Entity/MIME-Tools maybe?  You
haven't shown us code for Archive::Zip other than the use, at least that
I can tell.

 
 # begin returned message from email
 - The following addresses had permanent fatal errors -
 |/Users/test1/bin/mailExpenses.pl
  (reason: 255)
  (expanded from: [EMAIL PROTECTED])
 
 - Transcript of session follows -
 Can't call method "path" on an undefined value at
 /usr/adm/sm.bin/mailExpenses.pl line 94.
 554 5.3.0 unknown mailer error 255
 # end returned message from email
 

This indicates that 'bodyhandle' is returning undef below.  

 
 # begin section of code where error occurs
 use Archive::Zip qw(:ERROR_CODES :CONSTANTS);
 # loop through the file attachments
 for ($i = 0; $i < $num_parts; $i++) {
   $part = $entity->parts($i);
   $content_type = $part->mime_type;
   $body = $part->as_string;
   $bh   = $part->bodyhandle;
   $filename = $bh->path; # line 94
 # end section of code where error occurs
 

I would suggest checking out the MIME headers of the messages from the
different sending servers; Exchange could definitely be using different
headers.  Can you show us more message parsing code? And is the
attachment getting nested?  A cursory glance at the docs suggests this
might be a 'message/' versus 'multipart/' issue in how Exchange builds
its headers. I am also not sure you can call 'path' on what
'bodyhandle' returns; from the docs it indicates that it may contain the
data directly rather than a file/path.
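A defensive sketch along those lines (same variable names as your snippet;
the recursion branch is only needed if Exchange nests the attachment):

  for my $part ($entity->parts) {
      my $bh = $part->bodyhandle;
      if (defined $bh && defined $bh->path) {
          my $filename = $bh->path;
          # ... hand $filename to Archive::Zip here
      }
      elsif ($part->parts) {
          # a nested multipart/ or message/ part; descend into
          # $part->parts the same way instead of assuming a flat layout
      }
  }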

HTH,

http://danconia.org



Re: Web servers with cable DSL

2004-03-17 Thread Wiggins d'Anconia
Bill Stephenson wrote:
Well,

I think that Kevin (morbus) really did a good job of pointing out why I 
can't entirely do this yet. Some of the sites I host are critical to the 
businesses that use them and Verio has always provided a great service. 
Because they host on FreeBSD, developing on the Mac and porting to Verio 
is almost seamless even though Verio has never done anything special to 
accommodate this.

However, the fact that so many on this list are hosting sites with cable 
DSL indicates that I can possibly move some of the sites I host to a 
home office based server and still save a little money. I'll spend some 
time reviewing the sites and costs and see how the numbers crunch.

What about using http://directv.direcway.com/ to host servers? Anyone 
doing that?

Just my $.02: I host development at home over DSL and it is sufficient 
for development purposes. But I have found commercial hosting cheap 
enough, for the features I must have, to warrant it, and I don't have to 
worry about power failures (long ones), backups (as many), support, etc.

Direcway is rumored to have incredible upward latency; I would think 
hosting would be the last thing (next to hosting games) that you would 
want to do over their service.  There was a fair amount of discussion of 
it in a /. story sometime in the last couple months.

Using dyndns, hosting on linux over ameritech DSL.

http://danconia.org


Re: ANNOUNCE: Affrus 1.0 - a Perl Debugger

2004-03-11 Thread Wiggins d Anconia
>>>>> "Chris" == Chris Nandor [EMAIL PROTECTED] writes:
 
 Chris I was the one who started this mailing list.  There is no admin
 Chris of this specific list, last I checked.
 
 Chris I fully support this posting to this list.
 

 
 Without such a stand, we enter a slippery slope.  It's the same reason
 I permit *no* commercial postings to comp.lang.perl.announce.
 

This makes for an interesting problem: he can't post an announcement to a
general discussion group *AND* he can't post an announcement to
'announce' just because it is commercial... 

I think it difficult to call single announcements a slippery slope.  So
is there a perl-commercial-product list, or a
perl-commercial-product-announce list?  What is the traffic on the
'announce' list as it stands now?  A list has a bigger chance of dying
if there is no traffic than one that gets a few on-topic commercial
postings.

Though I will agree I didn't like the "shameless plug" tone; the "hey
guys, check this out, I think it is cool" tone usually goes over better
in the geek crowd...

http://danconia.org 





Re: tricky parsing question

2004-01-23 Thread Wiggins d Anconia


 On Thu, 22 Jan 2004, wren argetlahm wrote:
 

snip

 
 Maybe Parse::RecDescent? Maybe I'm over-thinking this...
 
 

This is what I thought of immediately; an old but excellent article that
may be a good place to start:
http://search.cpan.org/src/DCONWAY/Parse-RecDescent-1.94/tutorial/tutorial.html

http://danconia.org


Re: xterm color support

2004-01-23 Thread Wiggins d Anconia
 Oddly enough I was just looking into ANSI escapes for the mud I was 
 banging on about in another thread and came to the conclusion Why use 
 a module when you can just pepper your text with ANSI escape codes? :
 

Readability?  Why use a module for anything?

 
 Note the implementation of ANSI varies wildly, so expect colour and 
 (possibly) formatting but not consistency. Especially from Microsoft 
 :).
 

Another good argument for using the module...

http://danconia.org


Re: xterm color support

2004-01-23 Thread Wiggins d Anconia


 
 On Saturday, January 24, 2004, at 12:18  am, Wiggins d Anconia wrote:
  Readability?  Why use a module for anything?
 It's not exactly rocket science to understand the escapes and 
 seriously, using a print for each and every escape code format ? 
 Expensive much? And what about situations where you can't use print 
 like a multiplexing socket?
 

Not rocket science, and I didn't mean the readability of the Perl, I
meant the readability of your strings.  I would rather see: red 'This is
red text' than '\e[31mThis is red text\e[0m'.  Not to mention I am not
into carrying around lists of what exactly 31 maps to, I have more
important uses for brain cells, like killing them with beer :-)...
Expensive is hardly a discussion to get into on a Perl list, Assembly
much?  Interesting about the socket, but to me color formatting using
ANSI escape sequences over a socket is probably a little overdone, talk
about expensive.  Of course I am color deficient and work with a
colleague who is actually color blind, so other than syntax highlighting
in code I really don't care much for colored text, and even in the
syntax coloring area I couldn't tell you what some of the colors are and
don't pay attention to what they mean; it is the 'difference' in color
that matters to me.
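To be concrete, one module that does this is Term::ANSIColor (a sketch,
not necessarily the module the thread had in mind); it reads like prose
next to the raw escapes:

  use Term::ANSIColor qw(colored);

  print colored("This is red text\n", 'red');   # readable
  print "\e[31mThis is red text\e[0m\n";        # what was 31 again?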

 Modules are good, but then sometimes the price they ask is too high for 
 what they actually do. IMHO of course.
 
  Note the implementation of ANSI varies wildly, so expect colour and
  (possibly) formatting but not consistency. Especially from Microsoft
  :).
  Another good argument for using the module...
 It's not magic, it has no silver bullet to ensure every terminal will 
 show the same colours/formats, simply because the module uses the same 
 escape codes I listed in my earlier posting, being as they are ANSI 
 standards :).
 

The point is that when the silver bullet is invented/discovered your
code won't be using it, but mine will, with one quick upgrade to one
piece of code.

Modules can be overkill, granted; to each their opinion, and every
project is different.

http://danconia.org


Re: advanced stdout/stderr capturing?

2003-02-14 Thread Wiggins d'Anconia
Wiggins d'Anconia wrote:



I can't remember completely whether you can use it outside of the rest 
of the POE environment or not.

Nope, can't: "Unlike Components, Wheels do not stand alone. Each wheel 
must be created by a session, and each belongs to their parent session 
until it's destroyed."

But still have a look...

http://danconia.org



Re: konfabulator -- something to ponder

2003-02-12 Thread Wiggins d'Anconia


Puneet Kishor wrote:

 Some time in early 2000 Arlo Rose came up with an idea for a cool
little application. It would use XML to structure images, and a
scriptable language, like Perl, in such a way that someone who knew
the basics of Perl could put together cool little mini-applications.
The goal was that these mini-applications would just sit around on
your desktop looking pretty, while providing useful feedback.

All he ever really wanted was to have a cool looking battery monitor
and something that told him the weather, but he knew the
possibilities for something like this could potentially be limitless.

Fast forward a couple of years when Arlo began working with Perry
Clarke at Sun Microsystems. Over lunch one afternoon Arlo gave Perry
the basics of this dream app. Perry suggested that JavaScript would
be far easier for people to digest. He was right. It's the basis for
Flash's ActionScript, and Adobe's scripting engine for Photoshop. Of
all the choices, JavaScript made the most sense. Shortly after that
lunch, the two began to spend their nights and weekends making this
thing a reality.

A half year later Konfabulator is ready for download, and now it's
up to you to see if they were right about how cool this thing can be!




http://gkrellm.net anyone ;-) (granted it isn't JavaScript) ...sorry, 
that must be my Linux background showing through again...back in the 
cage, you nasty little penguin.

http://danconia.org



Re: Search for a string

2003-02-04 Thread wiggins

On Tue, 4 Feb 2003 00:14:30 -0900, Dennis Stout [EMAIL PROTECTED] wrote:

 Sheesh.
 
 Will ya'll just help a man with a perl problem instead of battering him with
 other ways to do it?


At least one of these lists is a beginners list, the other is related to Mac OS X, 
which the question really wasn't (other than I figure he is running Apache on OS X). 
This is not a "contract out free coding" list; my original post suggested a better 
overall system, that is to split the log with Apache (as it is easier to put them back 
together again than to take them apart).  My post also offered several suggestions of 
how the task would be best handled within a perl script.
 
 Sometimes people like to pose a challenge to themselves and see if it can be
 done.
 

Right, which is what my last paragraph alluded to, I suggested splitting on each of 
the lines and catching the vhost, then printing that line to a separate file, but 
without first trying and then asking further questions about how to do this I am not 
going to offer up a solution, as that would defeat the purpose of a challenge.

 Instead of being counterproductive and referring people to other things, help
 the man!
 

See above. This message hasn't been all that productive for the original poster, and 
while I am not easily discouraged, this could suggest that the help offered (by myself 
and the others) (freely - beer and speech) was not appreciated, and may keep other 
posters from offering advice on how best to do something rather than on how a poster 
*is* doing something, which would be the most counterproductive.

http://danconia.org



Re: Search for a string

2003-02-03 Thread wiggins


On Mon, 03 Feb 2003 13:09:47 -0500, Jeremy Schwartz [EMAIL PROTECTED] wrote:

 Not trying to reinvent the wheel.
 
 I am using Analog for the analysis.
 
 I am trying to split the server combined log into individual vhost logs. I
 can then run each through Analog to produce individual reports.

  Don't reinvent the wheel.  There are a number of fine log analysis
  utilities, such as analog.
  
  xoa

Out of curiosity, is there a reason why you are not handling this at the Apache level? 
Each vhost can have its own set of logs at the start that then would not need to be 
pulled apart.  Is this a possible scenario for you going forward? (Granted it doesn't 
help now.)  It would seem that your task would be better handled with a shell script, 
possibly, since you already have the command line for creating the file(s) from the 
main log, so then just wrap that command in a foreach that takes your directory names 
as input. 

Something along the lines of:

#!/bin/sh

for dir in `ls -1 /webroot/`; do
  cat /var/log/httpd/access_log | grep $dir > /var/log/httpd/access_log_$dir
done

I am no shell hacker and the above is untested, but you get the idea.  In general Perl 
would not be a good choice for performing something so simple that already has a 
command line solution available. 

If you were going to do it in Perl, rather than looking for each vhost in the log 
file, you would be better off unpacking or splitting, etc. the log line and storing 
that line to an array that is associated with the particular vhost in the log line, and 
then printing each vhost's array to a file; or you would have to open a filehandle for 
each vhost at the beginning of the script and then just print the line to whichever 
filehandle is associated with a particular vhost (see the sketch below).  Stepping 
through every line of the log file foreach of the vhosts in Perl would probably be a 
really bad way to handle things.
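A rough sketch of that filehandle-per-vhost version (untested, and it
assumes the vhost is the first whitespace-separated field of the combined
log; adjust the split to your actual LogFormat):

  #!/usr/bin/perl
  use strict;
  use warnings;

  my %fh;
  open(my $log, '<', '/var/log/httpd/access_log') or die "open: $!";
  while (my $line = <$log>) {
      my ($vhost) = split ' ', $line;
      unless (exists $fh{$vhost}) {
          # open each vhost's output file once, on first sight
          open($fh{$vhost}, '>>', "/var/log/httpd/access_log_$vhost")
              or die "open for $vhost: $!";
      }
      print { $fh{$vhost} } $line;
  }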

I would still suggest letting Apache do the splitting by not storing one main log with 
all vhost content; it is much easier to put the logs back together to get a complete 
picture than it is to dissect them after the fact.

http://danconia.org



Re: Apache sessions

2003-01-31 Thread wiggins


On Fri, 31 Jan 2003 11:10:00 +0100, Florian Helmberger [EMAIL PROTECTED] wrote:

 On Donnerstag, Jänner 30, 2003, at 11:57  Uhr, Jeff Kolber wrote:
 
snip

 I wouldn't attach a session to an IP address as it is quite common that 
 a visitor's IP address changes frequently if he's behind some load 
 balanced proxies (I debugged the resulting strange behaviour for 
 quite a long time a few years ago).
 

Or in the other direction, where you have multiple users coming in from the same 
address from very different locations through the same proxy. This is very common if 
your site is at all geographically relevant, or if you have a lot of AOL users, or if 
your site is at all interesting ;-) and say it might be shared among co-workers or 
students at the same organization...

http://danconia.org



RE: Mission Critical Perl

2002-12-06 Thread wiggins


On Fri, 6 Dec 2002 00:03:43 -0700, John-Michael Keyes [EMAIL PROTECTED] wrote:

 
 Awright, I'm gonna float this one, even if it == $off_topic++;
 
 And I float to this group first, because I'm writing mostly 
 ActiveState-deployed Perl on a pilfered Mac running Jag.2.2 and a 
 vanilla Perl install in an MSUBERALL shop. (And BBEdit is still the 
 envy, and bane, of my Windows brethren: Regex this... studio-boy!) 
 Yes, my days are weird, but entertaining.
 
 So the reality that I live with is:
   a. Perl is the Language of Last Resort culturally in the 
 organization.
   b. Despite the fact that Perl is running mission-critical in EVERY 
 aspect of our business.
   c. And because of (a.) MANAGEMENT doesn't know how much (b.) is going 
 on.
   d. if this $ENV is atypical, let me know ([EMAIL PROTECTED]), and stop 
 reading, but if this resonates, maybe a Barnumesque maneuver is 
 mandated...
 
 Because, I think there is a growing perception out there that is 
 depreciating the value of Perl, and we need to counterfud/act it (I'm 
 still not sure which) now or soon.
 
 My notion is that collectively, we add some vocabulary to the gestalt 
 of development... Maybe we call it Mission Critical Perl. You know, 
 it's not com, it's dotnet (I mean .NET). So, it's not Perl it's .MCP.
 
 What is Mission Critical Perl?
   It's code that gets the job done.
   It's strongly commented, both in architecture and execution.  (Yeah, I 
 know. Perl is self commenting. Comment more anyway dammmit.)
   It logs actions. (No matter how trivial, without metrics - it ain't 
 important. This is essential, if EVERY script you've written isn't 
 logging, write a .pm, go back and include it. We can't show 'em how 
 much we're doing with Perl unless we have a bar graph.)
   It's code reviewed.
   It's heralded: Mission Critical Perl. Maybe even .MCP
   
  .MCP: When what you need is RAD, Reliable, Reporting and Robust.
 
 So there's my teaser. If I'm alone, cool. I've been meaning to polish 
 the bytes on my resume. But if you're in the same scenario, let's start 
 a movement.
 
 JMK
 
 PS. Tim O., think about "A Manager's Guide to Leveraging Corporate Perl 
 Assets", the Killer Whale Book.
 

<soapbox>
I would say you are correct about the general attitude towards Perl, and the number of 
environments similar to yours is probably staggering; I am in one.  However I disagree 
with the actions you suggest and whether any need be taken at all.  Adding just 
another acronym that means nothing, and adds no value, turns Perl into the very thing 
those other tools/languages, etc. have turned into. I also feel that the very reason 
so many of us love Perl is because it does "Just get the job done, NOW", but if you 
add a bunch of vaporware marketing then it becomes some tangible item that the 
business side (yeh I know, but I am not just some dumb developer, like so many of us I 
have a business degree) throws around over their bagel at Starbucks, and then we spend 
hours in meetings talking about the latest innovations in MCP (which I think is 
already taken, M$ Certified Pro or something dumb like that) rather than doing the 
real work we have been doing while our VB buddies go and talk about whatever the new 
project (of course starting with those 4 little letters, "Open") is going to waste 
time and money for our companies today.  I agree with the need to express to our 
colleagues (not peers) the benefits of Perl and that it is *Enterprise*, *Mission 
Critical*, *Metric Oriented*, etc., but I also feel that there is a flight to quality 
happening right now and Perl will be embraced on its merits, not on some conjured 
hype.  I think the same goes/went for unix; just because M$ owns the desktop, we all 
know the majority of the mission critical stuff is being done on unix, and right now 
there is a flight to quality going on, and it is a movement from windoze to Linux and 
OS X, (Open|Free)BSD, etc. After reading it, I think that this quote from drieux sums 
things up nicely:

"Perl is required when management is confused about the goals."

But I could be wrong. ;-)
</soapbox>

I think the biggest reason Perl faces what you suggest is because it was so 
overwhelmingly adopted by the CGI/Web community, and no one does anything *important* 
on the web ;-).

http://danconia.org



Re: Math::Pari -- anyone using it on MacOS X?

2002-12-03 Thread wiggins
Convenient that I just ran into this problem. It appears that the module will attempt 
to download the source for PARI if it can't find it, but it appears that the module 
can't find that source from the PARI site.  As to why I have no idea, but if you go to 
the ftp site manually (ftp://megrez.math.u-bordeaux.fr/pub/pari/unix/) and download 
the source tar ball and then unpack it somewhere near where your perl is installed it 
will find it and build the module for you.  For instance I was building my modules in 
my home directory and installing them to ~/rearch/lib, I unpacked PARI in my home 
directory ~/pari-2.1.4 and it successfully found the headers it needed and installed 
properly.

As to whether this will work on Mac OS X I can't say; I was installing on a solaris 
box and my poor little laptop is without a net connection right now, so I can't test 
it.  Mac OS X is BSDish enough that there is a good chance it will work, but the 
readme in the Mac dir at the FTP site says Macs are no longer maintained; whether this 
includes X is probably up in the air.  However it is my understanding that it is only 
using the headers as reference material, so there is a chance perl will be smart enough 
to correct itself for the X platform.

Good luck,

http://danconia.org



On Tue, 3 Dec 2002 10:34:18 -0600, Christopher D. Lewis [EMAIL PROTECTED] 
wrote:

 
 On Monday, December 2, 2002, at 10:59  PM, Ken Williams wrote:
  On Monday, December 2, 2002, at 04:47  PM, Christopher D. Lewis wrote:
  On Sunday, December 1, 2002, at 05:49  PM, Ken Williams wrote:
  The error messages below aren't helpful, they just say that there  
  were error messages in a previous run.  In the CPAN shell, do 'clean  
  Math::Pari' and then 'test Math::Pari' to see the real error  
  messages.
  This may look lame, but I am new enough to lack even rudimentary
  troubleshooting (besides installing modules when an error says can't
  find Module X).  The error I get when following your prescription is:
  [looking good up to ...]
  Getting GP/PARI from ftp://megrez.math.u-bordeaux.fr/pub/pari/unix/
  Cannot list ():  at utils/Math/PariBuild.pm line 167,  line 1.
  Running make test
Make had some problems, maybe interrupted? Won't test
 
  I still don't understand your message - what do you mean [looking  
  good up to ...]?
 
 Sorry, the output from the command to test follows.
 Many thanks for looking,
   Chris
 
 ---begin copy---
 cpan> test Math::Pari
 Running test for module Math::Pari
 Running make for I/IL/ILYAZ/modules/Math-Pari-2.010305.tar.gz
 Checksum for  
 /Volumes/Storage/cpan/sources/authors/id/I/IL/ILYAZ/modules/Math-Pari- 
 2.010305.tar.gz ok
 Math-Pari-2.010305
 Math-Pari-2.010305/utils
 Math-Pari-2.010305/utils/Math
 Math-Pari-2.010305/utils/Math/PariBuild.pm
 Math-Pari-2.010305/utils/paridoc_to_pod
 Math-Pari-2.010305/utils/notes
 Math-Pari-2.010305/utils/README
 Math-Pari-2.010305/utils/inc.h
 Math-Pari-2.010305/utils/chap3_to_pod
 Math-Pari-2.010305/utils/comp_funcs.pl
 Math-Pari-2.010305/utils/foncpari.pl
 Math-Pari-2.010305/typemap
 Math-Pari-2.010305/libPARI
 Math-Pari-2.010305/libPARI/extract_codes.pl
 Math-Pari-2.010305/libPARI/codes_2014
 Math-Pari-2.010305/libPARI/expected_codes
 Math-Pari-2.010305/libPARI/gphelp
 Math-Pari-2.010305/libPARI/Makefile.PL
 Math-Pari-2.010305/Pari.xs
 Math-Pari-2.010305/test_eng
 Math-Pari-2.010305/test_eng/ex.t
 Math-Pari-2.010305/test_eng/Testout.pm
 Math-Pari-2.010305/Makefile.PL
 Math-Pari-2.010305/PariInit.pm
 Math-Pari-2.010305/README
 Math-Pari-2.010305/patches
 Math-Pari-2.010305/patches/diff_2.1.3_interface
 Math-Pari-2.010305/patches/diff_2.2.2_interface
 Math-Pari-2.010305/patches/diff_pari-2.1.3-ix86-divl
 Math-Pari-2.010305/patches/diff_2.1.2_gccism
 Math-Pari-2.010305/t
 Math-Pari-2.010305/t/Pari.t
 Math-Pari-2.010305/t/PlotRect.t
 Math-Pari-2.010305/TODO
 Math-Pari-2.010305/Pari.pm
 Math-Pari-2.010305/MANIFEST
 Math-Pari-2.010305/INSTALL
 Math-Pari-2.010305/Changes
 Removing previously used /Volumes/Storage/cpan/build/Math-Pari-2.010305
 
CPAN.pm: Going to build I/IL/ILYAZ/modules/Math-Pari-2.010305.tar.gz
 
 Did not find GP/PARI build directory around.
 Do you want to me to fetch GP/PARI automatically?
(If you do not, you will need to fetch it manually, and/or direct me  
 to
 the directory with GP/PARI source via the command-line option  
 paridir=/dir)
 Make sure you have a large scrollback buffer to see the messages.
 Fetch? (y/n, press Enter) y
 Getting GP/PARI from ftp://megrez.math.u-bordeaux.fr/pub/pari/unix/
 Cannot list ():  at utils/Math/PariBuild.pm line 167,  line 1.
 Running make test
Make had some problems, maybe interrupted? Won't test
 ---end copy---
 



RE: -d seems to disable rename(old,new)

2002-11-26 Thread wiggins
Why open a filehandle and then test the filehandle? I believe you should just test the 
file path with -d, and only open the handle afterwards.
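In other words, something like this sketch (same variable names as the
code quoted below):

  if (-d $test_if_dir) {
      # it's a folder; don't rename it
  }
  else {
      rename($full_path_dropped, $new_path)
          or print REPORT "\n**FAILED TO RENAME FILE**\n";
  }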

http://danconia.org



On Tue, 26 Nov 2002 13:10:27 -0800, Matthew Galaher [EMAIL PROTECTED] wrote:

 I've written a Dropscript droplet in perl that is meant to rename files 
 based on a specific naming convention. As a safety measure I wish to 
 check if they drop a folder. I check this using if(-d FILEHANDLE){ 
 which works, but it also seems to make the rename fail. What follows 
 is what I hope is the relevant code from the script. If I comment out 
 the part that checks whether it's a directory, the rename function 
 works. Any ideas?
 Thanks in advance.
 
 
 
  open(FILEHANDLE, $test_if_dir);

  if (-d FILEHANDLE) {    # IF IT'S A FOLDER DON'T RENAME IT
      $write = "false";
      $because_is_dir .= "Will not rename the following item because "
          . "it is a folder:\n $full_path_dropped\n\n";
      $print_switch = 1;  # if there is a warning this sets it to show report.
  }
  else {
      # don't enter block if found to be directory
      if (($new_path =~ m/NOT FOUND!/) && ($write eq "true")) {
          $because_something_not_found .= "Will not rename the following "
              . "item because some element not found (job number, client "
              . "name, message type, etc.):\n$full_path_dropped\n\n";
          $write = "false";
          $print_switch = 1;
      }
      # IF parse_path AND check_dir functions set $write to "false" then:
      elsif ($write eq "false") {
          $because_exists .= "Will not rename the following item because "
              . "a file with that name already exists\n$new_path\n"
              . "$full_path_dropped\n\n";
          $print_switch = 1;
      }
      elsif ($write eq "true") {
          $yes = "true";
          # THE PART THAT FAILS:
          rename($full_path_dropped, $new_path)
              or print REPORT "\n**FAILED TO RENAME FILE**\n";
          $will_rename .= "path was \n" . $full_path_dropped
              . "\nnew path would be \n" . $new_path . "\n\n";
          $print_switch = 1;  # comment this out when done; this way the
                              # report won't open if no warnings are written.
      }
      else {
          $print_switch = 1;
      }
  }
  close(FILEHANDLE);
 



Re: unix or mac-style text files?

2002-11-19 Thread Wiggins d'Anconia
There is some discussion of this issue in the docs, check out:

perldoc perlport

And page through to a section called Newlines...
I guess the real question I have is: does Perl on OS X qualify as MacPerl 
or Unix perl?  I defer to the mac os x experts, but would guess Unix perl.

http://danconia.org


Heather Madrone wrote:
I've already encountered a few text file anomalies on OS X. Most GUI 
applications seem to default to Mac-style text files (carriage returns 
only), but shell programs such as vi do not handle Mac-style text files 
gracefully.

Is perl on the Mac going to care whether source files are Mac-style or 
Unix-style?
Is it going to have difficulty reading and operating on either kind of 
file?  What
kind of text files will it write?

Thanks in advance for any illumination.

-hmm
[EMAIL PROTECTED]






Re: hard links on HFS+ (now even further off topic...)

2002-11-18 Thread Wiggins d'Anconia


Ken Williams wrote:


On Monday, November 18, 2002, at 06:13  AM, Wiggins d'Anconia wrote:


Heather Madrone wrote:


Most of my career was spent as a C/C++ systems programmer.
The damage I can do with a command line as root is nothing
compared to the damage I can do with a C compiler.



This makes no sense? Compiling as a non-root user can cause more 
damage than a root-enabled user can?


She's saying that she's writing systems programs, which (when run) can 
cause a great deal of damage if they contain errors or malicious code.


But then are we to assume that the programs are getting written in the 
production environment and put into place for execution without testing 
or code audits?  Again, the discussion was about running as a privileged 
user for everyday activities (granted, we are way off the original topic, 
which didn't start out as this, but that is where it had been taken), 
and naturally a program can cause damage when run in a privileged 
manner, but that damage should be prevented several phases before it is 
put into a place where damage can be caused.

http://danconia.org



Re: hard links on HFS+ (now even further off topic...)

2002-11-17 Thread Wiggins d'Anconia


Heather Madrone wrote:

At 03:29 PM 11/17/2002 -0500, William H. Magill wrote:


We're saying much of the same thing; however, this problem which you
describe is not an OS or vendor-level problem, and not even an ACL problem.
It's a programmer/admin attitude problem, exemplified by the constant stream
of questions asking "how to login as root under OS X," or "why they can't su
to root anymore." It's a basic mentality; a way of thinking about the
problem/issue.


I'm dyed with that mentality from head to toe; I like having a
root password in my pocket.  On personal systems, I always use
an account with full administrator privileges.  It seems silly
to have one account for me as a human being and another for me
as God.



You've never typed a wrong command?


In a corporate environment, I can certainly understand wanting
layers of protection, but, in many cases, the layers of
protection seem much more complicated than they need to be.
You can waste a lot of time if you have to wait for someone
with the right password to move a file or install a printer.
People, being people, almost invariably configure systems
like ClearCase so that they are more trouble than they are
worth.



I think this is where the real distinction comes in, if we are talking 
about corporate environments that are a 30 person single office company 
that uses a 386 linux system to run their laser printer that is one 
thing, how many people could actually have access anyways, but it is 
entirely different in a large company that is more focused on security 
and has to be, or in a college environment where you may have ten 
thousand or more students on a system, 10% of which are trying to see if 
they can crack root.



The concepts of distributed authority are simply foreign to the Unix (and
Linux) community. And the problems are exacerbated by the fact that the
traditional Unix System Administrator still expects to do everything as
root. The vendors are just responding to customer demand -- or more
accurately, the lack thereof -- for security features. Tru64 Unix (aka OSF/1
aka Digital Unix) has supported a C2 environment out-of-the-box since its
first release back in about 1990. But is it used? No. The few who wanted
enhanced security only wanted a shadow password file, because that's all
that BSD and Sun offered. They were not interested in taking the time to
learn the ins and outs of C2 because "we don't need that level of security."


Well, do they?  Are the reduced risks worth the increased
administrative costs?


Depends on what is at stake. Again, if it is a printer that won't be used
for a couple of hours, who cares; if it is several billion dollars of
transfers that won't happen for a day or two, then it is a real problem,
and it is worth the extra admin costs to know that the only people dorking
with your systems *should* know what they are doing, and even they may
make mistakes occasionally.


I worked in hard and soft crash recovery systems for years.  My job
was to be able to get database systems back online fast if someone
ran a forklift through the machine room.  I spent my time devising
systems that wouldn't crash, and, when they did crash, would come
back up quickly without losing a scrap of data.

Aside from enterprise-critical database operations, most installations
didn't care.  If their disks crashed, they could hire a bank of
secretaries to type their data back in.  


This is the wrong logic to use. Why would anyone use a computer at all?
I mean, why talk on the phone when someone could just meet in person?
Why use a database when you could just hire a million secretaries to
remember 10 phone numbers?

I can't imagine many Mac installations that justify the sorts of
protections you're suggesting.  Protect the servers, sure, but
don't wall the users off from their own systems so they have to
call ops in every time they insert a CD.



Not now, but in the future... Apple is trying to enter this space, and,
as I mentioned earlier, the higher-education space.

Personally I keep my account with sudo shell access so that when I need 
to do something as root it is a conscious effort. And I will admit it 
has still come back to bite me on occasion.

Which brings me to a new point in the discussion: I am surprised no one
has mentioned sudo. I like it as a method of control, that is, controlling
what a user can do rather than what files they can and can't read/write.
Obviously this requires knowledge about the relationships between the
files and the applications, and it allows for a different kind of access.
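For instance, a minimal sudoers sketch (hypothetical user and commands;
always edit /etc/sudoers with visudo):

  # let user 'heather' control only the print system as root,
  # after entering her own password
  heather  ALL = /usr/sbin/lpc, /usr/bin/lprm

  # give members of the 'admin' group everything
  %admin   ALL = (ALL) ALL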

But then again I am biased I come from the linux side of things rather 
than windoze or classic mac.

http://danconia.org



Re: hard links on HFS+ (now even further off topic...)

2002-11-17 Thread Wiggins d'Anconia
Heather Madrone wrote:

At 10:38 AM 11/17/2002 -0500, Wiggins d'Anconia wrote:


Heather Madrone wrote:


At 03:29 PM 11/17/2002 -0500, William H. Magill wrote:


We're saying much of the same thing; however, this problem which you describe is not an OS or vendor-level problem, and not even an ACL problem. It's a programmer/admin attitude problem, exemplified by the constant stream of questions asking "how to login as root under OS X," or "why they can't su to root anymore." It's a basic mentality; a way of thinking about the problem/issue.


I'm dyed with that mentality from head to toe; I like having a
root password in my pocket.  On personal systems, I always use
an account with full administrator privileges.  It seems silly
to have one account for me as a human being and another for me
as God.


You've never typed a wrong command?



Most of my career was spent as a C/C++ systems programmer.
The damage I can do with a command line as root is nothing
compared to the damage I can do with a C compiler.



This makes no sense. Compiling as a non-root user can cause more damage
than working as a root-enabled user?

I'm careful, and I install safety nets when I need them.  I'm fanatical
about backups.  There isn't anything I could do to my Mac on the command
line that would cause any permanent harm. 


Safety nets? Sounds like running in a non-privileged environment. How
often do you run backups: every ten seconds? Every ten minutes? One
hour? Eight hours? My place of business easily crawls through $300,000
worth of transactions in an 8-hour period, and there are networks doing
a whole lot more than we are. Backups are great for yesterday's data,
but demanding clients seem to want all of their data; I know they are
rather pesky that way ;-).

Permissions are not much of a hedge against sloppiness.  If you're
careless, then how much difference is it going to make if you have
to log into an administrative account before you start typing commands?


This is true. But having to enter a password every five minutes or so, or
even each time you log in to do those specific tasks, reinforces the fact
that you are working in such a way that you can do damage.



I worked in hard and soft crash recovery systems for years.  My job
was to be able to get database systems back online fast if someone
ran a forklift through the machine room.  I spent my time devising
systems that wouldn't crash, and, when they did crash, would come
back up quickly without losing a scrap of data.
Aside from enterprise-critical database operations, most installations
didn't care.  If their disks crashed, they could hire a bank of
secretaries to type their data back in.  

This is the wrong logic to use. Why would anyone use a computer at all? I
mean, why talk on the phone when someone could just meet in person? Why use a
database when you could just hire a million secretaries to remember 10 phone
numbers?


It's not my logic.  I had a platter on my wall with most of the
oxide scraped off from a head crash with a little sign that said
"Your data was here."  It was the logic of the people making the
budget decisions for large telecommunications firms.

I spent a lot of time evangelically promoting backups and mirrored
safe disks and whatnot.  A lot of installations only did backups
sporadically and played head crash roulette.  Almost all of them
won.



I can't imagine many Mac installations that justify the sorts of
protections you're suggesting.  Protect the servers, sure, but
don't wall the users off from their own systems so they have to
call ops in every time they insert a CD.


Not now, but in the future... Apple is trying to enter this space, and, as I mentioned earlier, the higher-education space.

Personally I keep my account with sudo shell access so that when I need to 
do something as root it is a conscious effort. And I will admit it has still 
come back to bite me on occasion.


I haven't done any appreciable damage to a system since 1983,
when I accidentally formatted a hard drive (criminally easy on
DOS).

Come to think of it, it was criminally easy to do serious damage to
every computer I worked on before 1995 or so.  And programmers have
fought every change to make systems more secure as long as I can
remember.  The true hacker (in the old sense) doesn't want anything
between himself and the hardware.



A true hacker wants his job to be as hassle-free as possible while getting
the most done; otherwise we would all still be writing assembler, and C,
Java, and most certainly Perl would never have come to be. But getting the
job done means the proper methodologies must be put in place: security,
backups as you stated, and other such processes. I am most certainly glad
the programmers you speak of don't have access to my machines. Security
should rank right up there with efficiency for a programmer; it is the
only way to predict a sane environment for our concoctions.


The challenge, I think, is to design security systems that are
as simple

Re: hard links on HFS+

2002-11-16 Thread Wiggins d'Anconia
I suppose that is better than one word. RTFM ;-)

http://danconia.org

Lou Moran wrote:

Please understand this is no flame... but I got two words for you:

Goo Gle

Look it up.


On Saturday, Nov 16, 2002, at 23:17 America/New_York, Joseph Kruskal wrote:


On 11/1/02 3:47 AM, William H. Magill at [EMAIL PROTECTED] wrote:


... journaled file system ...


What is a journaled file system?


... the user level -- REAL ACLs being one of particular interest ...


What are ACLs?
What are REAL ACLs?


... especially for C2 type enterprise applications...


What are C2 type enterprise applications?

Thanks, Joe
--
Joseph B Kruskal  [EMAIL PROTECTED]







--
Lou Moran
[EMAIL PROTECTED]
http://ellem.dyn.dhs.org:5281/resume/







RE: deprecated?

2002-11-15 Thread wiggins
"Deprecated" is a general term in any programming language, OS, etc.
indicating that a particular piece of functionality is no longer suggested
for new use, that it will still work for an indeterminate amount of time,
and that at some point in the future it will no longer work.

In the case of arrays, the one time I have seen it is when using
if(defined(@array)), which is the same as just saying if(@array) and taking
the array in scalar context as true or false. The first form is deprecated
because the second is preferable anyway; it still works, but it draws a
deprecation warning whenever warnings are enabled.
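For example, a minimal sketch:

  my @array = ();

  # deprecated: draws a warning under -w / use warnings
  # if (defined(@array)) { ... }

  # preferred: the array in boolean (scalar) context is true
  # if it holds any elements
  if (@array) {
      print "array has elements\n";
  }
  else {
      print "array is empty\n";
  }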

To which particular instance were you referring?

http://danconia.org



On Fri, 15 Nov 2002 14:45:00 -0500, Deshazer, Earl (GEAE) [EMAIL PROTECTED] 
wrote:

 What does the word "deprecated" mean as it relates to an array or hash? I have
 gotten this error before and I don't know how to correct it. Thanks.
 
 William DeShazer