Re: handling undef better

2005-12-17 Thread Larry Wall
On Sat, Dec 17, 2005 at 12:12:15PM -0800, Ashley Winters wrote:
: Explicitly nil values wouldn't warn or fail on activities which undef
: does. nil is basically a value which is simultaneously '' and 0 and
: 0.0 and false *and* defined, and allows itself to be coerced with the
: same rules as perl5 undef, but without an 'uninitialized' warning.

Hmm, the '' value is already simultaneously '' and 0 and 0.0 and false
*and* defined, and allows itself to be coerced with the same rules as
perl5 undef, but without an 'uninitialized' warning.  I don't really
think we need another one.

And replying to the thread in general, I'm not in favor of stricter
default rules on undef, because I want to preserve the fail-soft
aspects of Perl 5.  And that is better served by promotion of
undef to "" or 0 with warnings than it is by letting one undef rot
all your inputs.  When they put multiple actuators on an airplane
control surface, they do so with the assumption that some subset of
the actuators might freeze in some position or other.  It would be
ridiculous in such a situation to throw away the authority of the
other actuators even though "tainted" by the undefined values of some
of the actuators.  That's failsoft behavior.

Strict typing is all very well at compile time when you can do
something about it, but you do not want your rocket control software
throwing unexpected exceptions just because one of your engine
temperature sensors went haywire.  That's a good way to lose a rocket.

All that being said, it's really good to know when and if your
reliability is suffering.  But you don't just give up in disgust, and
that's what NULL propagation is essentially doing.  Real engineering
adds knowns to unknowns and comes up with a good guess about how much
redundancy to build into the system to compensate.  I don't want
Perl 6 to be *brittle* at run time.

I do think that if you want brittle undefs, it'll be easy to enforce
by catching warning exceptions and promoting them to fatal exceptions.
But remind me not to ride on your airplane.
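(In Perl 5 terms, that warning-to-fatal promotion is already a one-liner -- a sketch:)

```perl
use warnings FATAL => 'uninitialized';   # promote the warning to a fatal exception

my $x;
my $y = $x + 1;   # now dies ("Use of uninitialized value ...") instead of warning
```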

Larry


Re: handling undef better

2005-12-17 Thread Uri Guttman
> "DD" == Darren Duncan <[EMAIL PROTECTED]> writes:

  DD> At 9:30 AM + 12/17/05, Luke Palmer wrote:
  >> 
  >> You're actually saying that undef either compares less than or greater
  >> than all other objects, which contradicts your earlier point.  I'd say
  >> it just fails.

  DD> At the time I wrote this, I had been thinking that having a list of
  DD> array values where some were undefined was still not unreasonable to
  DD> be sorted.  And in that case, since undef's can't sort by normal means
  DD> (value comparisons don't work on them), we have to do something with
  DD> them so the sorted array has all the elements of the original, hence
  DD> group them at one end.

  DD> However, perhaps it does make better sense for wider consistency that
  DD> a sort needs to have an explicit handler that says what to do with
  DD> undefs, or otherwise the sort fails.

sorting in p6 is not at all like in p5. instead of coding up an explicit
comparison code block and duplicating all the key access code (for $a
and $b), you will specify how to extract/generate each key for a given
record. this new syntax was posted by damian (who else) and it is very
similar to the api in my p5 module Sort::Maker (we did discuss this
api). i don't know if any A/E/S doc covers it but it is definitely in
the archives. 

so you could easily handle undefs by converting them to the sort value
you want. using // it would be trivial to do (assume an array ref record
is passed in $_ and the key is the second element). these are code
blocks to extract and generate a key.

{ $_->[1] }         # sort undef as 0 with lotsa warnings
{ $_->[1] // 0 }    # sort undef as 0 with no warnings
{ $_->[1] // -Inf } # sort undef to bottom
{ $_->[1] // Inf }  # sort undef to top

damian take note!

i dunno if +/- Inf are available/useable for sorting. it would be a
useful feature regardless of how undef behaves. sometimes you need to
force certain values to the top or bottom of sorts and this would make
it very easy to do.
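a concrete p5 sketch of the same idea (using a huge negative power in
place of -Inf, which plain perl 5 lacks; needs 5.10 for //):

```perl
# sort array-ref records by their second field, sinking undef keys
# to the bottom; -9**9**9 overflows to -inf on most perls
my @recs   = ( [ 'a', 3 ], [ 'b', undef ], [ 'c', 1 ] );
my @sorted = sort { ( $a->[1] // -9**9**9 ) <=> ( $b->[1] // -9**9**9 ) } @recs;
# order: b (undef key), c (1), a (3)
```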

so again, i am on the side of leaving undef's default behavior alone and
using a stricture to get your desired behavior.

uri

-- 
Uri Guttman  --  [EMAIL PROTECTED]   http://www.stemsystems.com
--Perl Consulting, Stem Development, Systems Architecture, Design and Coding-
Search or Offer Perl Jobs    http://jobs.perl.org


handling undef - second draft

2005-12-17 Thread Darren Duncan
Considering all the feedback and discussion I've seen so far, I 
hereby retract my earlier proposals in the 'handling undef better' 
messages and offer new ones instead, which hopefully address the 
issues you collectively have raised.


At the root of the issues I see here is that the meaning of 'undef' 
is overloaded, and perhaps should be split into multiple concepts 
that are addressed with different keywords or class names.


Here are some concepts that undef either has been in practice or was 
previously proposed to be used for:


1. The state that a new container has after it has been declared (or 
allocated) but before any value has been assigned to it, such as with 
"my $x;".


That concept is probably the most befitting of terminology like 
'undef', as those words sound like verbs which say what was not 
*done* with a container.  Aliases for the same concept would be 
'unset' or 'unassigned'.


2. A flag that says "we know that some value is supposed to go here, 
but we don't know what that value is yet, and we're holding a place 
for it".  This flag would normally be used in the place of an actual 
value in a value expression.


That concept is potentially in line with what SQL's NULL means.

3. A flag that says we know that some operation failed, such as would 
be exploited in the " err " 
situations.


This concept is like an exception which isn't thrown but returned.

4. A flag that says "we will become the context-derived 'none' value 
as needed, since we weren't yet told to be something else".


This concept is like what programmers expect when they simply use 
"empty" variables in expressions and expect them to DWIM, like become 
0 or '' or false.


So, in an effort to have different meanings look different, I put 
forward these suggestions:


1. I accept the proposal that we just make another class that 
implements the SQL concept of a null value, perhaps named Null or 
SQL::Null, rather than having this behaviour in the core language, so 
that should simplify the rest of the discussion.  If someone wants 
undefs to propagate through expressions like SQL's NULLs do, rather 
than failing or defaulting, then this can be done with the new class. 
A Null object would be defined but false.  It would overload standard 
operators so that most expressions involving it would propagate a 
Null object, or compare unequally as desired.  Therefore, this sort 
of behaviour will be isolated and standard data types won't behave 
this way by default.
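A minimal Perl 5 sketch of such a class (the SQL::Null name comes from the 
proposal; the operator set shown is illustrative only):

```perl
package SQL::Null;
use overload
    '+'    => \&_propagate,
    '.'    => \&_propagate,
    '=='   => sub { 0 },        # Null compares unequal to everything
    'bool' => sub { 0 },        # defined but false
    '""'   => sub { 'Null' };

sub new        { my $class = shift; bless {}, $class }
sub _propagate { __PACKAGE__->new }    # arithmetic yields another Null

package main;
my $null = SQL::Null->new;
my $sum  = $null + 42;                           # propagates: another Null
print defined($null) ? "defined\n" : "undef\n";  # "defined"
```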


2. Modify the 'Exceptions' section of S04 so that built-in functions 
return an 'error', 'failure', or 'exception' when they fail, instead of 
an "interesting value of undef".  The behaviour and handling of or 
response to these is essentially unchanged, but the value is called 
something more distinctive and it is not called 'undef'.  Instead of 
testing for .defined(), invoke a different method like .failed() or 
.error() instead; invoking .defined() on an error should perhaps 
return true instead of false.  Perhaps update err() to activate on 
the error rather than or in addition to undef.


3. Have the functionality of 'use fatal' turned on by default in 
large programs, though not in one-liners, with the option to turn it 
off explicitly.  It is safer to have failures brought to your 
attention, where you can make a conscious effort to respond to or 
dismiss them.


4. An expression or statement having an 'err' would impose a 
try-block around the expression it is attached to, so the right thing 
still happens when errors are thrown by default.  And using 'err' is a 
conscious effort to deal with them.


5. Autovivification in larger programs should not happen by default, 
unless you have something like 'use autovivify;'.  But with 
one-liners it would work by default.


6. Attempts to use undef where a defined value is expected, such as 
wherever they currently generate warnings, should be upgraded to a 
stricture and fail by default in larger programs.  But like with 
other strictures, this would be turned off by default in one-liners. 
If the DWIM behaviour is wanted in larger programs, one can say 'no 
strict undefs;' or such.


There may be remaining holes in my suggestions, but hopefully I dealt 
with the worst ones from the preceding first draft.


-- Darren Duncan


Re: Change 26165 broke ext/threads/t/stress_re.t test on Win32 (and patch to t/test.pl and/or Test::Harness)

2005-12-17 Thread demerphq
On 12/17/05, chromatic <[EMAIL PROTECTED]> wrote:
> On Saturday 17 December 2005 08:23, demerphq wrote:
>
> > It seemed to me that
> > a better patch would be to change the way harness handles directives
> > so it recognizes TODO & SKIP as being a valid directive.
>
> What would that mean?  SKIP tests don't run.  TODO tests do.
>
> If the test doesn't run, I think it's a SKIP and nothing else.

Well, Test::Harness takes the opposite point of view and considers it a
TODO and not a SKIP at all, as the TODO comes first.

And Test::More::todo_skip() outputs "not ok" for the test, but
test.pl::todo_skip() emits "ok" for the test, which Test::Harness
treats as an unexpected success. If Test::Harness realized that TODO &
SKIP meant both then it could differentiate between the case of
todo_skip() and the case of a real todo that unexpectedly succeeds or
fails. After all, it needs to support scenarios where the output isn't
being manufactured by either Test::Builder or test.pl, so you can't
just rely on fixing either. (IMO anyway)

BTW, none of this is my logic; it's legacy code. If you want to
question whether it makes sense to have something be both TODO and
SKIP at the same time ask the person that wrote todo_skip() in the
first place.

:-)

cheers,
yves


--
perl -Mre=debug -e "/just|another|perl|hacker/"


load_bytecode now insists on ASCII.

2005-12-17 Thread Bob Rogers
   This looks like a consequence of r10458, is still present in r10568,
and is easy to reproduce:

[EMAIL PROTECTED]> cat load-test.pir
.sub _main :main
    .local string file_name
    file_name = iso-8859-1:"foo.pbc"
    load_bytecode file_name
.end
[EMAIL PROTECTED]> ./parrot load-test.pir
Cross-charset index not supported
[EMAIL PROTECTED]> 

It's easy enough for me to work around, because I don't need ISO-8859-1
file names, but shouldn't load_bytecode be a bit more liberal?  I could
probably hack something together along those lines, but it might be
better to have a solution that takes account of what charset(s) the OS
would insist upon for file names, true?

   The other mystery is that I still don't know why I'm generating
ISO-8859-1 strings in the first place.  Could it be picking this up from
the source file of the code that builds the string?  If so, how?  And
why -- the code looks like plain ASCII to me (and to emacs).

   TIA,

-- Bob Rogers
   http://rgrjr.dyndns.org/


Re: Change 26165 broke ext/threads/t/stress_re.t test on Win32 (and patch to t/test.pl and/or Test::Harness)

2005-12-17 Thread chromatic
On Saturday 17 December 2005 08:23, demerphq wrote:

> It seemed to me that
> a better patch would be to change the way harness handles directives
> so it recognizes TODO & SKIP as being a valid directive.

What would that mean?  SKIP tests don't run.  TODO tests do.

If the test doesn't run, I think it's a SKIP and nothing else.

-- c


Re: Three more shoot outs

2005-12-17 Thread Leopold Toetsch


On Dec 17, 2005, at 18:48, Joshua Isom wrote:
> If I make my version and the perl version print out the final
> sequence, they're identical.


Ah. Sorry. I missed that it replaces from the end.

leo



Re: handling undef better

2005-12-17 Thread chromatic
On Friday 16 December 2005 22:25, Darren Duncan wrote:

> At 10:07 PM -0800 12/16/05, chromatic wrote:

> >This is fairly well at odds with the principle that users shouldn't have
> > to bear the burden of static typing if they don't want it.

> This matter is unrelated to static typing.  The state of whether a
> variable is defined or not is orthoganal to its container type.

I didn't say container typing.  As I see it, your concern is what happens when 
trying to *coerce* something containing the undefined value.

> But more to the point, if you assign your default values at strategic
> places, you are not writing very much extra code at all.

Objection: "not very much" extra code is asymptotically greater than no extra 
code.

A change this great from Perl 5 seems like it ought to provide a whole heap of 
benefit to make up for the whole big heap of inconvenience everyone now has 
to live with.  So far, I'm not even seeing a little heap of benefit.  

Mathematical-theoretic purity is a nice idea, but I'm usually too busy trying 
to do actual work to appreciate anything beyond "hey, can I write robust, 
maintainable working code without too much effort and time?"

> Those few characters are nothing considering the amount of hassle
> they can save.

I didn't buy that argument from the "static manifest typing everywhere" folks 
either.

What happens if you have a sparse array with the greatest index of 10 and you 
want to assign something with an index of 100?  Do you have to give the array 
an explicit default value?  What if you create it in a module somewhere?  
What if it's a generic array and you don't know when you create it what type 
of default value it should contain?  What if 0 is a valid value that means 
something entirely different from "default initialized but unassigned"?
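For reference, Perl 5's current fail-soft answer to the sparse-array 
question, sketched:

```perl
my @a = (1) x 11;        # indices 0..10
$a[100] = 'x';           # extends the array; 11..99 spring up as undef
print scalar @a, "\n";                            # 101
print defined $a[50] ? "defined\n" : "undef\n";   # "undef"
```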

All of a sudden, am I going to have to care about the default value of every 
container I create or receive from somewhere, just in case its notion of 
truth and definedness doesn't match mine?

If so, how inconvenient is the code?

If not, why not?

-- c


Re: handling undef better

2005-12-17 Thread Ashley Winters
On 12/17/05, Sebastian <[EMAIL PROTECTED]> wrote:
>
> Obviously there are mixed opinions of how undef should be treated and
> some won't be happy with what becomes final, so implementing some
> intelligent defaults and simple pragmas, but not excluding the ability
> to *really* control your undefs, sounds like a win-win.

If we want to have our cake and eat it too, let's create a new value: nil.

my $dog = nil;
ok !$dog;
ok $dog eq '';
ok $dog ne 'foo';
ok $dog == 0;
ok $dog != any(1..100);
ok $dog === nil;
ok ++$dog == 1;
ok not $dog === nil;

Explicitly nil values wouldn't warn or fail on activities which undef
does. nil is basically a value which is simultaneously '' and 0 and
0.0 and false *and* defined, and allows itself to be coerced with the
same rules as perl5 undef, but without an 'uninitialized' warning.

Uninitialized variables would remain undef, and would have Larry's
unthrown-exception failure rule. The nil value is completely defined
as above. And, huffmanly speaking, most people who want to explicitly
initialize a variable to an empty state are going to want nil, not
undef. C is redundant, after all -- as is C<$var //
undef>. When used as a value, undef returns C or something. Thus completes the circle of
definedness.

Ashley Winters


[perl #37969] make svnclobber is not working properly

2005-12-17 Thread via RT
# New Ticket Created by  Alberto Simoes 
# Please include the string:  [perl #37969]
# in the subject line of all future correspondence about this issue. 
# https://rt.perl.org/rt3/Ticket/Display.html?id=37969 >


Basically, done:

   $ svn up
   At revision 10568.

   $ make svnclobber
   perl "-MExtUtils::Manifest=filecheck" -le 'sub
   ExtUtils::Manifest::_maniskip{sub{0}};$ExtUtils::Manifest::Quiet=1; do
   { unlink $_ unless $_ =~ m!(?:\.svn)! } for filecheck()'

   $ svn up
   Restored 'debian/parrot-doc.install'
   Restored 'debian/libparrot-dev.install'
   Restored 'debian/compat'
   Restored 'debian/libparrot.install'
   Restored 'debian/control.in'
   Restored 'debian/changelog'
   Restored 'debian/libparrot-dev.lintian-overrides'
   Restored 'debian/copyright'
   Restored 'debian/rules'
   Restored 'debian/parrot.install'
   Restored 'debian/parrot.docs'
   At revision 10568.



-- 
Alberto Simões - Departamento de Informática - Universidade do Minho
  Campus de Gualtar - 4710-057 Braga - Portugal


Re: [perl #37969] make svnclobber is not working properly

2005-12-17 Thread Joshua Hoblitt
Perhaps the svnclobber target should just invoke `svn revert -R .`?

-J

--
On Sat, Dec 17, 2005 at 08:09:45AM -0800, Alberto Simoes wrote:
> # New Ticket Created by  Alberto Simoes 
> # Please include the string:  [perl #37969]
> # in the subject line of all future correspondence about this issue. 
> # https://rt.perl.org/rt3/Ticket/Display.html?id=37969 >
> 
> 
> Basically, done:
> 
>$ svn up
>At revision 10568.
> 
>$ make svnclobber
>perl "-MExtUtils::Manifest=filecheck" -le 'sub
>ExtUtils::Manifest::_maniskip{sub{0}};$ExtUtils::Manifest::Quiet=1; do
>{ unlink $_ unless $_ =~ m!(?:\.svn)! } for filecheck()'
> 
>$ svn up
>Restored 'debian/parrot-doc.install'
>Restored 'debian/libparrot-dev.install'
>Restored 'debian/compat'
>Restored 'debian/libparrot.install'
>Restored 'debian/control.in'
>Restored 'debian/changelog'
>Restored 'debian/libparrot-dev.lintian-overrides'
>Restored 'debian/copyright'
>Restored 'debian/rules'
>Restored 'debian/parrot.install'
>Restored 'debian/parrot.docs'
>At revision 10568.
> 
> 
> 
> -- 
> Alberto Simões - Departamento de Informática - Universidade do Minho
>   Campus de Gualtar - 4710-057 Braga - Portugal




Re: handling undef better

2005-12-17 Thread Sebastian
I still think it'd be neat to have a special Undef class of some sort
which can be subclassed and further defined to really DWIM rather than
be stuck with whatever pragmas Perl has graciously built in. Something
like this would require more thinking and speculation -- and it may
hurt performance too much to be practical, though.

Obviously there are mixed opinions of how undef should be treated and
some won't be happy with what becomes final, so implementing some
intelligent defaults and simple pragmas, but not excluding the ability
to *really* control your undefs, sounds like a win-win.

- sebastian

On 12/17/05, Uri Guttman <[EMAIL PROTECTED]> wrote:
> > "LP" == Luke Palmer <[EMAIL PROTECTED]> writes:
>
>   LP> Actually, you can think of undef pretty much as defining
>   LP> autovivification.  "If you use it as a number, it becomes a number; if
>   LP> you use it as a string, it becomes a string; if you use it as a hash,
>   LP> it becomes a hash; ..."
>
>   LP> However, that's not really accurate, because:
>
>   LP> # perl 5
>   LP> my $x;
>   LP> $x->{4} = 1;
>   LP> print $x;   # "HASH(...)"
>
>   LP> my $x;
>   LP> my $y = $x + 1;
>   LP> print $x;   # not "0"
>
> those aren't the same either. in p5 only undef when used as a ref gets
> autovivified to the appropriate anon ref. undef when used as a regular
> scalar value stays undef. the deref thing was created to handle
> assigning to multilevel structures without needing to explicitly set
> each of the upper levels (think about how much extra code this one
> little feature has saved us all!). since in p5 undef coerces to 0 or ''
> as needed (wherever the undef came from), it doesn't change the value of
> the undef.
>
> and i agree with luke that the idea is interesting but it should be a
> stricture. it is not a good idea for default as it ruins
> autovivification. also it would ruin many one liners and short scripts
> which don't even use regular strict. perl's ability to dwim undef and
> not carp or croak is a good default. just use pragmas to make it
> stricter in larger programs.
>
> uri
>
> --
> Uri Guttman  --  [EMAIL PROTECTED]   http://www.stemsystems.com
> --Perl Consulting, Stem Development, Systems Architecture, Design and Coding-
> Search or Offer Perl Jobs    http://jobs.perl.org
>


Re: Change 26165 broke ext/threads/t/stress_re.t test on Win32 (and patch to t/test.pl and/or Test::Harness)

2005-12-17 Thread demerphq
On 12/16/05, Steve Hay <[EMAIL PROTECTED]> wrote:
> The real bummer, though, is that I'm now away until Jan 3rd and I'm
> switching my machine off now, so you can't see the fruits of your
> efforts in my overnight smokes until next year :-(

If it's any help to you guys I built and tested just now on Win32
(VC7+Win2k) and it passes all tests + 2 todo tests.  This is at
patchlevel 26386

I get warnings on sv.c (i mention this because i saw that sv.c was
changed between 26384 and 26386 during the time i was messing about
with this):

sv.c
..\sv.c(812) : warning C4307: '+' : integral constant overflow
..\sv.c(813) : warning C4146: unary minus operator applied to unsigned
type, result still unsigned
..\sv.c(818) : warning C4307: '+' : integral constant overflow
..\sv.c(819) : warning C4146: unary minus operator applied to unsigned
type, result still unsigned
..\sv.c(834) : warning C4307: '+' : integral constant overflow
..\sv.c(835) : warning C4146: unary minus operator applied to unsigned
type, result still unsigned
..\sv.c(840) : warning C4307: '+' : integral constant overflow
..\sv.c(841) : warning C4146: unary minus operator applied to unsigned
type, result still unsigned

The todo tests are as follows:

op/local.ok
1/85 unexpectedly succeeded
op/pat...ok
1/1195 unexpectedly succeeded

The op/pat one seems to suggest that perl #37038 is resolved. (Patch
attached to de-TODO this test).

ok 1195 - # TODO assigning to original string should not corrupt match vars
ok
1/1195 unexpectedly succeeded


The op/local one seems to actually indicate a failure in the logic of
test.pl and harness. todo_skip from the harness/builder framework
emits 'not ok' for todo_skip, but test.pl emits 'ok', so the harness
logic assumes that this is an unexpected success when it is in fact
just a skip. Patch is attached.

BTW, i don't know if this is the right fix really. It seemed to me that
a better patch would be to change the way harness handles directives
so it recognizes TODO & SKIP as being a valid directive. Currently the
logic is that the first non-space sequence following a hash mark is
considered the test's directive. So for todo tests it's # TODO. I was
thinking that maybe this logic could be changed slightly so that
directives can be listed. Then harness could tell # TODO & SKIP was
both a todo and a skip, and not just read it as a todo.

I've attached a patched Test::Harness::Straps and Test::Harness::Point
that supports multiple directives, with the presumption that they will
be supplied in an &-separated list, so that a test can be TODO and
SKIP at the same time.

Anyway, with either patch applied the misleading unexpected success
message goes away.

With all three patches applied, all tests pass with no unexpected successes.

Yves

--
perl -Mre=debug -e "/just|another|perl|hacker/"
diff -wur -F'^sub' sync/lib/Test/Harness/Point.pm sync_patched/lib/Test/Harness/Point.pm
--- sync/lib/Test/Harness/Point.pm	2005-04-24 17:53:28.0 +0200
+++ sync_patched/lib/Test/Harness/Point.pm	2005-12-17 15:18:05.859125000 +0100
@@ -112,15 +112,19 @@ sub set_directive   {
 $directive =~ s/\s+$//;
 $self->{directive} = $directive;

-my ($type,$reason) = ($directive =~ /^\s*(\S+)(?:\s+(.*))?$/);
+my ($type,$reason) = ($directive =~ /^\s*([^&\s]+(?:\s*&\s*[^\&\s]+)*)(?:\s+(.*))?$/);
 $self->set_directive_type( $type );
 $reason = "" unless defined $reason;
 $self->{directive_reason} = $reason;
 }
 sub set_directive_type {
 my $self = shift;
-$self->{directive_type} = lc shift;
-$self->{type} = $self->{directive_type}; # History
+my $type = lc shift;
+my %type = map { $_ => 1 } split /\s*&\s*/, $type;
+$type = join " & ",sort keys %type;
+$self->{directive_types} = \%type;
+$self->{directive_type} = $type;
+$self->{type} = $type; # History
 }
 sub set_directive_reason {
 my $self = shift;
@@ -130,15 +134,18 @@ sub directive_type  { my $self = shift;
 sub type{ my $self = shift; $self->{directive_type} }
 sub directive_reason{ my $self = shift; $self->{directive_reason} }
 sub reason  { my $self = shift; $self->{directive_reason} }
+sub is_directive_type {
+my $self = shift;
+my $type = lc shift;
+return $self->{directive_types}{$type};
+}
 sub is_todo {
 my $self = shift;
-my $type = $self->directive_type;
-return $type && ( $type eq 'todo' );
+return $self->is_directive_type('todo');
 }
 sub is_skip {
 my $self = shift;
-my $type = $self->directive_type;
-return $type && ( $type eq 'skip' );
+my $type = $self->is_directive_type('skip');
 }

 sub diagnostics {
diff -wur -F'^sub' sync/lib/Test/Harness/Straps.pm sync_patched/lib/Test/Harness/Straps.pm
--- sync/lib/Test/Harness/Straps.pm	2005-10-10 15:46:04.0 +0200
+++ sync_patched/lib/Test/Harness/Straps.pm	2005-12-17 15:

Re: Test::Harness spitting an error

2005-12-17 Thread Ian Langworth
No one else has replied, so here's a shot in the dark: Try setting the
PERLIO environment variable to "crlf" (without quotes).

--
Ian Langworth


Re: Three more shoot outs

2005-12-17 Thread Joshua Isom
Not that I can tell from the code...  Starting from the beginning, push 
the substr location onto the end, then in another loop, pop off that 
ending and use it for replacement.  Only the locations at the end are 
affected which is why it starts at the end.  An iterator is used to 
provide a wrapper around each iub key.  If I make my version and the 
perl version print out the final sequence, they're identical.


On Dec 17, 2005, at 9:47 AM, Leopold Toetsch wrote:


> Joshua Isom wrote:
>> Commented out is code to use capturing regex to do it for the final
>> substitution.  PGE seems faster with the coroutine.
>
> Doesn't it now substitute on wrong positions after the first
> replacement?
>
> leo






Re: handling undef better

2005-12-17 Thread Uri Guttman
> "LP" == Luke Palmer <[EMAIL PROTECTED]> writes:

  LP> Actually, you can think of undef pretty much as defining
  LP> autovivification.  "If you use it as a number, it becomes a number; if
  LP> you use it as a string, it becomes a string; if you use it as a hash,
  LP> it becomes a hash; ..."

  LP> However, that's not really accurate, because:

  LP> # perl 5
  LP> my $x;
  LP> $x->{4} = 1;
  LP> print $x;   # "HASH(...)"

  LP> my $x;
  LP> my $y = $x + 1;
  LP> print $x;   # not "0"

those aren't the same either. in p5 only undef when used as a ref gets
autovivified to the appropriate anon ref. undef when used as a regular
scalar value stays undef. the deref thing was created to handle
assigning to multilevel structures without needing to explicitly set
each of the upper levels (think about how much extra code this one
little feature has saved us all!). since in p5 undef coerces to 0 or ''
as needed (wherever the undef came from), it doesn't change the value of
the undef.
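the distinction, sketched in p5:

```perl
my $x;
$x->{4} = 1;             # used as a hash ref: $x autovivifies
print ref $x, "\n";      # "HASH"

my $z;
my $sum = $z + 1;        # used as a number: coerces to 0 in the expression
print defined $z ? "defined\n" : "still undef\n";   # "still undef" -- $z unchanged
```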

and i agree with luke that the idea is interesting but it should be a
stricture. it is not a good idea for default as it ruins
autovivification. also it would ruin many one liners and short scripts
which don't even use regular strict. perl's ability to dwim undef and
not carp or croak is a good default. just use pragmas to make it
stricter in larger programs.

uri

-- 
Uri Guttman  --  [EMAIL PROTECTED]   http://www.stemsystems.com
--Perl Consulting, Stem Development, Systems Architecture, Design and Coding-
Search or Offer Perl Jobs    http://jobs.perl.org


Re: Three more shoot outs

2005-12-17 Thread Leopold Toetsch

Joshua Isom wrote:
> Commented out is code to use capturing regex to do it for the final
> substitution.  PGE seems faster with the coroutine.


Doesn't it now substitute on wrong positions after the first replacement?

leo





Re: handling undef better

2005-12-17 Thread Rob Kinyon
On 12/17/05, Darren Duncan <[EMAIL PROTECTED]> wrote:
[snip]
> 2. Until a value is put in a container, the container has the
> POTENTIAL to store any value from its domain, so with respect to that
> container, there are as many undefs as there are values in its
> domain; with some container types, this is an infinite number.
>
> Only a container that can have exactly one possible value can be
> equated with; but then you have a constant.
>
> In a manner of speaking, an undef is like a quantum superposition, in
> that it has no specified value, but rather all possible domain values
> at once, so you can not absolutely say it is equal to anything.

So, in essence, you're saying that undef === one( #the domain of the
type# ) ... I'm not sure I'm comfortable with that. If I have an undef
of a constrained type and I compare it to a value of some other
constrained type whose domains don't overlap, then, by this
definition, I -can- say something about the truth value. For example,
if I define EvenInt and OddInt in the obvious ways, then the following
should hold:

my EvenInt $e;
my OddInt $o;

if ( $e != $o ) { say "This should print out." }

I'm not sure that works with the Principle of Least Surprise. While I
cannot say what it is, you're saying that I can now say what it isn't.
While that follows from typing, that doesn't follow from the common
understanding of undef.

Rob


Re: handling undef better

2005-12-17 Thread David Green


On 12/16/05, Darren Duncan wrote:

> The root question of the matter is, what does "undef" mean to you?


To me it means nothing.  (I'm so callous.)

> The fact is, that in any normal program, using an undefined value as
> if it were a defined one is a bug.  Normally there will be a point
> where such a variable should be tested for definedness and either be
> given a default value explicitly or fail.  Checking your input at
> the gates is good programming practice.


Funny, I feel just the opposite: if a normal[sic] program has to 
define initial values for its variables, then it probably isn't 
designed right.  Sure, sometimes you need to start with your 
$countdown=10, but most of the time I expect my strings to be empty 
until I put something in them, my booleans to be false until proven 
true, etc.  Uninitialisation warnings are the bane of my existence, 
and I usually start my programs with C<no 
warnings("uninitialized");> (unless I'm feeling brazen enough to 
forgo warnings altogether, hah!  And once I tried doing the crossword 
in ink, so there!!).


> But still, the default action should be that undef never becomes
> anything magically, which aids in avoiding bugs.  Such as when
> you're using a value that you thought you tested but actually didn't.


I can't say it's never bitten me (I make enough mistakes for anything 
to happen), but undef problems certainly don't stick out in my 
memory.  As opposed to SQL's null<>null, which has caught me many 
times in the past, and probably will catch me many more times to come.


> Having users explicitly set defaults with //, or by setting a
> defaults method, makes the program more self describing, as you can
> see what it is doing in the code, and it isn't doing anything
> without saying so.


I was mostly serious in saying that programs should be able to assume 
variables start out as nothing.  Having to define every last thing 
actually makes code harder to read and write, like trying to breathe 
consciously, or deliberately move every leg muscle when trying to 
walk.  It's just too low a level.


Having an unknown/danger value, as well as the good ol' undef/none 
value, could be useful though.  But I think you should have to 
deliberately use it ("my $x=unknown;") to show that something 
unnatural is going on.



-David


Re: handling undef better

2005-12-17 Thread Ruud H.G. van Tol
Gordon Henriksen schreef:

> I find it useful to distinguish between unassigned and undefined
> (null).

I am not sure that you need this distinction, but it should be no
problem to have it, just like 'tainted' and 'NaN' and 'zero/empty' and
'error'.


> I find null propagation frustrating; it's more useful that my code
> keep data rather than to throw it away on the theory that "undef
> means maybe, and anything combined in any fashion with maybe results
> in more maybe". I just wind up writing defined(expr)?expr:0 over and
> over to avoid throwing away the other part of the expression.

It should be a lexical mode, so that you can choose when to have it and
when not.
I prefer the "anal retentive 99 + undef -> die" mode for almost
everything.


> An unassigned variable is very different, and is a compile-time
> concept. Static flow control can find accesses of not definitely
> assigned local variables

AFAIK that is not possible in Perl. (eval etc.)

-- 
Grtz, Ruud



Re: handling undef better

2005-12-17 Thread Ruud H.G. van Tol
Darren Duncan schreef:

> A variable whose value is undefined is still a typed container; eg:
>
>   my Int $z; # undef since not assigned to
>   my Str $y; # undef since not assigned to

If 'undef' becomes 'fail' while those base types lack default
start-values such as 0 and '' (and the start-status defined), then
most coders will find a way to make Ints start as defined and 0, and
Strs as defined and '', and will put the line that does that at the
start of every snippet they create. Which would bring us back to
nothing.

So if 'undef' becomes 'fail', also give base types the start-status
'defined' and a normal start-value like 0 and +0.0 and '' and false. It
should be made easy though to minimize what the constructors of the base
types do, for clean coders (use fatal).

So I almost agree with what Luke said: 'make undefs and exceptions the
same thing, and to do away with "use fatal"', but see "use fatal" as the
switch to disable (or minimize) the base type constructors.


> For all practical purposes, an undef is unuseable random garbage

I don't agree. 'undef' is a status. Status is orthogonal to value.


> The odds of two undefined values in such primitive data types being
> equal is 1/Inf, which is zero.

Why gamble?

-- 
Grtz, Ruud



RE: handling undef better

2005-12-17 Thread Gordon Henriksen
I find it useful to distinguish between unassigned and undefined (null).


"None" is very often a valid value, especially for primitive types, and
especially where databases are involved. i.e., the range of a variable might
be {undef, -2^31..2^31-1}. In my experience:

  99 + undef -> 99         # Permissive. Stable. Useful. [Perl]
  99 + undef -> undef      # Pedantic. Error-prone. Annoying. [SQL, C# 2.0]
  99 + undef -> die        # Anal retentive. Crash-prone. Infuriating. [Obj-C]
  99 + undef is impossible # Ill-advised. Unusable. [C#, C]
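In Python terms (an analogy of mine, not anything from the thread), the built-in behavior matches the third row: arithmetic with None dies. The permissive Perl 5 row has to be recovered with an explicit coercion:

```python
def add(a, b):
    return a + b

# The "99 + undef -> die" model: Python raises TypeError outright.
try:
    add(99, None)
    outcome = "no error"
except TypeError:
    outcome = "die"

def undef_to_zero(v):
    """Coerce an undefined (None) value to 0, as Perl 5 does (with a warning)."""
    return 0 if v is None else v

# The permissive Perl 5 model, recovered explicitly:
permissive = add(99, undef_to_zero(None))  # 99
```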

I find null propagation frustrating; it's more useful that my code keep data
rather than to throw it away on the theory that "undef means maybe, and
anything combined in any fashion with maybe results in more maybe". I just
wind up writing defined(expr)?expr:0 over and over to avoid throwing away
the other part of the expression.

The third and fourth options are just progressively more destructive
forms of the same logic. Succinctly, 'use crash_on_every_use_of_undef' is a
pragma I'd want to opt out of almost globally.


An unassigned variable is very different, and is a compile-time concept.
Static flow control can find accesses of not definitely assigned local
variables, like this:

  my Animal $pet;
  given $kind {
  when 'dog': $pet = new Dog;
  when 'cat': $pet = new Cat;
  when 'none': $pet = undef;
  }
  return $pet;

Static flow control analysis can see that, where $kind not in ('dog', 'cat',
'none'), $pet will not be definitely assigned in the return statement. To
ensure definedness, there must be a default case. Perhaps $pet's
compiler-supplied default value is okay, but the programmer's intent isn't
explicit in the matter. Note that in the case of $kind == 'none', $pet IS
assigned: it's assigned undef.

While flow control analysis requires some additional work to avoid reliance
on default values, I find that work to be less than the work debugging the
bugs introduced because such checks aren't performed in the first place. It
also allows for very strong guarantees; i.e., "I know this variable cannot
be undefined because I never assign undef to it, and the compiler would tell
me if I accessed it without assigning to it."
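For contrast, here is a sketch of the same example in Python (my analogy, not from the thread). Python has no compile-time definite-assignment check, so the not-definitely-assigned path only surfaces at run time as UnboundLocalError, which is exactly the gap the static analysis described above would close:

```python
def pick_pet(kind):
    # 'pet' is assigned on only some paths, mirroring the $pet example.
    if kind == "dog":
        pet = "Dog"
    elif kind == "cat":
        pet = "Cat"
    elif kind == "none":
        pet = None   # assigned, and assigned None: "none" is a real value here
    return pet       # not definitely assigned when kind matches no branch

assert pick_pet("none") is None   # assigning undef/None is fine

try:
    pick_pet("fish")              # no branch assigns pet
    failed = False
except UnboundLocalError:
    failed = True                 # caught only at run time, not compile time
```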

This is what 'use strict' should evolve toward, in my mind.

-Original Message-
From: Darren Duncan [mailto:[EMAIL PROTECTED] 
Sent: Sat, Dec 17, 2005 1:26 AM
To: perl6-language@perl.org
Subject: Re: handling undef better

At 10:07 PM -0800 12/16/05, chromatic wrote:
>On Friday 16 December 2005 18:15, Darren Duncan wrote:
>  > 0. An undefined value should never magically change into a defined
>>  value, at least by default.
>
>This is fairly well at odds with the principle that users shouldn't 
>have to bear the burden of static typing if they don't want it.

This matter is unrelated to static typing.  The state of whether a variable
is defined or not is orthogonal to its container type.

>It sounds like you want to override the coercion of undef to fail, at 
>least in a lexical scope.  Go for it.

Yes.  Or have that be the default behaviour.

Variables magically springing into existence on the first reference to
them should not happen by default, and indeed Perl 6 defaults to having
that stricture turned on.

Likewise, it should have the stricture of no undef coercion on by default,
but developers can turn it off locally as they turn off strict locally.

>I can't see it as a default behavior though.  Sure, the literal 
>expression "6
>+ undef" is pretty silly, but I don't really want to write "6 + Maybe
>$variable" (sorry, Haskellers) everywhere when the compiler is 
>perfectly capable of DWIMming in the 98% of cases where $variable is 
>undefined because I like being so lazy as not to initialize explicitly 
>every possible variable I could ever declare, being very careful to 
>distinguish between 0, '', and undef in boolean context.
>
>I suspect the remaining two percent of cases, I won't write "6 + undef"
>either.

I think you're over-stating the frequency of situations where people
actually want that auto-coercion; I'm thinking it is more like 50% at best.

But more to the point, if you assign your default values at strategic
places, you are not writing very much extra code at all.

I find the argument against assigning explicit values to be only slightly
stronger than the argument against using 'my' etc.

Those few characters are nothing considering the amount of hassle they can
save.

-- Darren Duncan



Re: handling undef better

2005-12-17 Thread Darren Duncan

At 9:30 AM + 12/17/05, Luke Palmer wrote:

On 12/17/05, Darren Duncan <[EMAIL PROTECTED]> wrote:
 > Undef, by definition, is different from and non-equal to everything
 > else, both any defined value, and other undefs.


You said "by definition", but where is this definition?


Maybe "definition" was the wrong word; I meant "by reasonable 
assumption" partly.


Also, since I consider undef and SQL's NULL to mean the same thing, 
namely "I don't know what value should go here", it is worth noting 
that SQL explicitly defines NULL to be unequal to all other values, 
including other NULLs, which seems a fine precedent.



 > 2b. As a pseudo-exception, while undef/unknown values are
 > conceptually all unequal to each other, they should all sort
 > together; eg, calling sort() on an array of values where some are
 > defined and some not, should group all the undefs together.  I leave
 > it up to discussion as to whether they should sort before or after
 > all the defined values, but one of those choices should be picked for
 > predictability.


You're actually saying that undef either compares less than or greater
than all other objects, which contradicts your earlier point.  I'd say
it just fails.


At the time I wrote this, I had been thinking that sorting an array 
in which some values are undefined was still not unreasonable.  In 
that case, since undefs can't be sorted by normal means (value 
comparisons don't work on them), we have to do something with them so 
that the sorted array keeps all the elements of the original; hence, 
group them at one end.


However, perhaps it does make better sense for wider consistency that 
a sort needs to have an explicit handler that says what to do with 
undefs, or otherwise the sort fails.
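As a concrete illustration (a Python sketch of mine, not from the thread), both policies are easy to express: group the undef-like values at one end with a sort key, or refuse to sort unless the caller supplies a handler:

```python
values = [3, None, 1, None, 2]

# Policy 1: undefs compare unequal but sort together, here after the
# defined values.  The key maps None to (True, 0) so Nones are never
# compared to numbers (or to each other) directly.
grouped = sorted(values, key=lambda v: (v is None, 0 if v is None else v))
# grouped == [1, 2, 3, None, None]

# Policy 2: fail unless told what to do with undefs.
def sort_strict(xs, undef_key=None):
    if undef_key is None and any(x is None for x in xs):
        raise ValueError("sort needs an explicit handler for undefined values")
    return sorted(xs, key=undef_key or (lambda x: x))

try:
    sort_strict(values)
    strict_failed = False
except ValueError:
    strict_failed = True
```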



 > 5. In any situation where a developer wants an undefined value to
 > become a zero or empty string or something else, they should say so
 > explicitly, such as with:
 >
 >   $foo = undef // 0;
 >   $bar = undef // '';
 >   $baz = undef // $MY_DEFAULT;
 >
 > The fact is, that in any normal program, using an undefined value as
 > if it were a defined one is a bug.  Normally there will be a point
 > where such a variable should be tested for definedness and either be
 > given a default value explicitly or fail.  Checking your input at the
 > gates is good programming practice.


But checking input at the gates is also something you'd like to happen
automatically, or declaratively at the very least.  Thus all of Perl
6's type signature nonsense.


Yes, and I'm not proposing we change that.  However, unless Pugs' 
implementation is wrong, the declarative signatures only check that 
subs get the right number of arguments and that each one is of the 
correct container type.  Whether or not an argument has a ? in the 
declaration only toggles whether a container needs to be passed to 
it, not what the container has in it.  If someone provides an 
undefined value as an argument, that is let through, since it could 
be valid input for some subs.  Those subs for which an undef is 
invalid currently need to either put an explicit "where { .defined }" 
trait on their argument or use a //= etc. on it inside the called sub. 
Now maybe this behaviour is wrong, but it's what I observed.



And you're also losing a rather important idiom:

my %seen;
my @q = ($initial);
for @q {
next if $seen{$_}++;
@q.push(.next_nodes);
}

You are also losing autovivification, which is one of Perl's staples.

Actually, you can think of undef pretty much as defining
autovivification.  "If you use it as a number, it becomes a number; if
you use it as a string, it becomes a string; if you use it as a hash,
it becomes a hash; ..."

However, that's not really accurate, because:

# perl 5
my $x;
$x->{4} = 1;
print $x;   # "HASH(...)"

my $x;
my $y = $x + 1;
print $x;   # not "0"


Actually, I don't like autovivification either, and wish there was a 
pragma to make attempts to do it a fatal error; it smacks too much of 
using variables that weren't declared with 'my' etc.  I prefer to put 
in the explicit "$seen{$_} //= 0;" above the ++ line, and "$x = 
hash()" in as well, etc.


This behaviour is more consistent with what is expected if, say, you 
have some random other class being used instead of a hash or array, 
which don't support the same autovivification behaviour.  If you 
tried calling "$seen{$_}->inc()", it would die, not turn into a 
counter object and then increment.
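The same split exists in Python (an analogy of mine, not from the thread): a plain dict refuses to autovivify, forcing the explicit-default style argued for above, while collections.defaultdict opts in to autovivification once, at the container declaration:

```python
from collections import defaultdict

# Plain dict: no autovivification; a missing key dies, much like the
# fatal-autovivification pragma wished for above.
seen = {}
try:
    seen["x"] += 1
    autoviv = True
except KeyError:
    autoviv = False
    seen["x"] = 0        # the explicit "$seen{$_} //= 0;" step
    seen["x"] += 1

# defaultdict: autovivification declared once, at the container.
counts = defaultdict(int)
for word in ["a", "b", "a"]:
    counts[word] += 1    # missing keys spring into existence as 0
```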


I forget if I mentioned autovivification on this list before, but now I have.


While the premise of this proposal is nice, it feels like it's missing
the big picture.  Undef is what subs use when they fail if the caller
is not under "use fatal".  However, people have been requesting this
sort of dwimmery:

open "foo" err die "Hey, I couldn't open it";
open "foo";  # dies if it fails anyway

It would be nice to see a proposal of undef that handles this sort of
thing cleanly (saying "if a sub returns undef at the top statement
level without being handled then it throws an error" is not clean).

I guess what I'm saying is that it would be cool to make undefs and
exceptions the same thing, and to do away with "use fatal".  That may
be an impossible hope.

Luke

imcc, macros, and tcl

2005-12-17 Thread Leopold Toetsch
We were struggling with some memory corruption seen mainly in tcl [1] 
since quite a time.


I think, I've found it now, thanks to an example Matt has pasted this 
morning.


The reason is:
- there is a hard limit of 256 macros
- this was marked with XXX but *not checked*
- each .include 'foo' with macros inside does *append* to the macro list 
and worse:

- even if the macro already exists, it's just appended
- the list of macros is AFAIK never cleared

For now I've put in a check for this limit and throw an 
internal_exception, if necessary (r10568).


Before putting further effort into that macro stuff I'd rather have 
clarified:

- what is the scope of macros
- shouldn't we have an external macro pre-processor

Compiler writers can currently work around by:
.include 'file with .const  = ..'   always
.include 'file with .constant'.pasm   once
.include 'file with .macro'   once

(yes that is a mess)

leo

[1] guess which tcl tests are failing now - yes:

cmd_break.t  1-2
cmd_continue.t   1-2



Re: handling undef better

2005-12-17 Thread Luke Palmer
On 12/17/05, Darren Duncan <[EMAIL PROTECTED]> wrote:
> An undefined value is NOT the same as zero or an empty string
> respectively; the latter two are very specific and defined values,
> just like 7 or 'foo'.

I definitely agree with you in principle.  This is something that has
been bugging me, too.  However, there are some holes in your argument
and proposal, so I'll poke at them.

> Undef, by definition, is different from and non-equal to everything
> else, both any defined value, and other undefs.

You said "by definition", but where is this definition?

> 1. Any expression that expects a defined value as an argument, such
> as typical mathematical or string operations, and gets an undefined
> argument, will as a whole have undef as its value, or it will fail.
> Examples are the expressions "$anything + undef" and "$anything ~
> undef".

Hmm.  Maybe this has some relation to "use fatal".

> 2. Any boolean-returning expression should return undef or false or
> fail if given an undef.

So:

undef == 3  # false
undef != 3  # false!?

undef < 3   # false
undef > 3   # false
undef == 3  # false!?

Failure seems to be the better option here.

However, I don't think that's a good option either.  When you say:

if $x === "hello" {...}

You kinda want that just not to execute, because $x is not the same as "hello".

Your proposal is trying to make undef less magical.  Well, a good way
to do that would be to make it less magical; i.e. simply make it a
singleton value (it has class Undef, and the only thing that can be of
class Undef is undef).  Therefore:

3 + undef   # fails because infix:<+>(Int, Undef) isn't defined

> 2a. At the very least, "undef  undef" should NEVER
> return true, because an unknown quantity can not be claimed to be
> equal to an unknown quantity.  Rather, the defined() method, which is
> analogous to 'IS NOT NULL', and such things are the proper way to
> test if a variable is unknown.

  undef == 3 # fails (undef is not a number, so it can't be compared to one)
  undef eq "foo" # fails (undef is not a string)
  undef === 3# false (the undef object is not the same as the 3 object)

> 2b. As a pseudo-exception, while undef/unknown values are
> conceptually all unequal to each other, they should all sort
> together; eg, calling sort() on an array of values where some are
> defined and some not, should group all the undefs together.  I leave
> it up to discussion as to whether they should sort before or after
> all the defined values, but one of those choices should be picked for
> predictability.

You're actually saying that undef either compares less than or greater
than all other objects, which contradicts your earlier point.  I'd say
it just fails.

> 5. In any situation where a developer wants an undefined value to
> become a zero or empty string or something else, they should say so
> explicitly, such as with:
>
>   $foo = undef // 0;
>   $bar = undef // '';
>   $baz = undef // $MY_DEFAULT;
>
> The fact is, that in any normal program, using an undefined value as
> if it were a defined one is a bug.  Normally there will be a point
> where such a variable should be tested for definedness and either be
> given a default value explicitly or fail.  Checking your input at the
> gates is good programming practice.

But checking input at the gates is also something you'd like to happen
automatically, or declaratively at the very least.  Thus all of Perl
6's type signature nonsense.

And you're also losing a rather important idiom:

my %seen;
my @q = ($initial);
for @q {
next if $seen{$_}++;
@q.push(.next_nodes);
}

You are also losing autovivification, which is one of Perl's staples.

Actually, you can think of undef pretty much as defining
autovivification.  "If you use it as a number, it becomes a number; if
you use it as a string, it becomes a string; if you use it as a hash,
it becomes a hash; ..."

However, that's not really accurate, because:

# perl 5
my $x;
$x->{4} = 1;
print $x;   # "HASH(...)"

my $x;
my $y = $x + 1;
print $x;   # not "0"

While the premise of this proposal is nice, it feels like it's missing
the big picture.  Undef is what subs use when they fail if the caller
is not under "use fatal".  However, people have been requesting this
sort of dwimmery:

open "foo" err die "Hey, I couldn't open it";
open "foo";  # dies if it fails anyway

It would be nice to see a proposal of undef that handles this sort of
thing cleanly (saying "if a sub returns undef at the top statement
level without being handled then it throws an error" is not clean).

I guess what I'm saying is that it would be cool to make undefs and
exceptions the same thing, and to do away with "use fatal".  That may
be an impossible hope.

Luke


Re: Three more shoot outs

2005-12-17 Thread Joshua Isom
I applied the changes to the code, using capture for the initial strip. 
 I did use \> instead of  but I didn't notice any real difference, 
even when I profiled it.  For the matching, using a capturing regex 
didn't work well because it'd have to backtrace, which slowed it down 
too much for the simplicity.  I just stuck to the coroutine.  Commented 
out is code to use capturing regex to do it for the final substitution. 
 PGE seems faster with the coroutine.


There's a marked improvement in speed.  The one benchmark file that 
took 13 minutes now finishes in under 8.  I haven't tried the full 
data set yet, which is a file five times larger.


This is my first real attempt at anything to do with perl6 rules.  I'm 
just learning as I go, and using synopsis 5 for reference.




regexdna.pir
Description: Binary data



On Dec 16, 2005, at 11:35 PM, Patrick R. Michaud wrote:


I don't know all of the details and restrictions of the benchmark,
and I'll be the first to claim that PGE can be slow at times (it has
very few optimizations built-in).  But we may have a few tricks
available to try.

First, note that matching via a subrule involves extra
subroutine call overhead (with a lot of setup and take-down).
Using C<< \> >> should be much much much faster, as it's a simple
string comparison.

Instead of repeatedly calling the pattern via "next", I'd
just use a quantified capture and get all of the things to be
stripped all at once.  Thus perhaps something like:

pattern = '[ ( [ \> \N*: ] \n ) | \N*: (\n) ]*'
rulesub = p6rule_compile(pattern)
match = rulesub(seq)

This gives us a single match object, with match[0] as an array
of the captured portions.  We can then just walk through the
captured portions (in reverse order) and remove the substrings--
something like:

.local pmc capt
capt = match[0]# capt is an array of Match
  stripfind:
unless capt goto endstripfind
$P0 = pop capt # remove last capture
$I0 = $P0."from"() # get starting pos
$I1 = $P0."to"()   # get ending pos
$I1 -= $I0 # convert to length
substr seq, $I0, $I1, ''   # remove unwanted portion
goto stripfind
  endstripfind:

Hope this helps at least a little bit.  It's still likely
to be somewhat slow.  We may also be able to get some improvements
by implementing the :g modifier for the repeated captures, and
being able to compile (or use) whole substitutions as opposed to
just rules.

Pm



RE: handling undef better

2005-12-17 Thread Darren Duncan

At 2:27 AM -0500 12/17/05, Gordon Henriksen wrote:

I find it useful to distinguish between unassigned and undefined (null).


I think that the whole point of having "undef" in the first place was 
to indicate when a container wasn't assigned to yet, and hence has no 
useable value.  Also, that explicitly assigning undef to a variable 
is returning it to the state it was before it was first assigned a 
value.  When an undef is used in a larger expression, that is an 
anonymous container which is undefined.


Put another way, both undef and SQL's NULL name the state of a 
container for which we don't yet know the value we want to store 
in it.


Undef means "don't know", which is distinct from "zero", because in 
the latter case we explicitly have a value of zero.


The fact that we have undef as distinct from zero is a huge plus of 
Perl and friends over C, where you have to use some actual number 
(often -1) to mean "this isn't actually a number".  But undef is, by 
design, distinct from and outside the domain of numbers and 
everything else.  Very, very useful, and I hate to see that 
compromised.



"None" is very often a valid value, especially for primitive types, and
especially where databases are involved. i.e., the range of a variable might
be {undef, -2^31..2^31-1}.


Yes, and I have no problem with 'none' aka 'no explicit value'.  What 
I have a problem with is undef being considered equal to zero or the 
empty string.



In my experience:

  99 + undef -> 99         # Permissive. Stable. Useful. [Perl]
  99 + undef -> undef      # Pedantic. Error-prone. Annoying. [SQL, C# 2.0]
  99 + undef -> die        # Anal retentive. Crash-prone. Infuriating. [Obj-C]
  99 + undef is impossible # Ill-advised. Unusable. [C#, C]

I find null propagation frustrating; it's more useful that my code keep data
rather than to throw it away on the theory that "undef means maybe, and
anything combined in any fashion with maybe results in more maybe".


Well, that theory seems the most logical in practice.  The whole 
point of having different words for undef and zero is that they 
mean different things.



I just
wind up writing defined(expr)?expr:0 over and over to avoid throwing away
the other part of the expression.


Your example shows non-strategic means of defaulting.  It's much more 
concise to say "expr // 0" instead of "expr.defined ?? expr !! 0"; 
the defaulting adds only 3 characters plus spaces.  The // and //= 
operators were created intentionally so that explicit defaulting can 
be done in a very concise manner.  In fact, they were even 
back-ported to Perl 5.9.x+.


FYI, SQL:2003 has something similar, but with a few more characters: 
"COALESCE".  Perl's "expr // 0" is SQL:2003's "COALESCE(expr, 0)", 
except that COALESCE takes N args and returns the first non-null one, 
like a chain of //; Oracle calls that (or the two-arg variant, anyway) 
NVL().


The key thing is that elegant defaulting solutions exist, elegant 
because the expression appears exactly once, so they can be taken 
advantage of.
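A Python sketch of the same idea (my analogy, not from the thread): a COALESCE-style helper that tests definedness, not truth, so the defined-but-false values 0 and '' win over the default, which Python's own `or` gets wrong:

```python
def coalesce(*args):
    """SQL COALESCE / a chain of Perl's //: return the first defined argument."""
    for a in args:
        if a is not None:
            return a
    return None

defined_zero  = coalesce(None, 0, 42)    # 0: defined, so it wins
defined_empty = coalesce(None, "", "x")  # "": defined, so it wins
truthy_or     = None or 0 or 42          # 42: 'or' tests truth, not definedness
```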



The third and fourth options are just progressively more destructive
forms of the same logic. Succinctly, 'use crash_on_every_use_of_undef' is a
pragma I'd want to opt out of almost globally.


No, it's just crash_on_dirty_code.

And there is the 'no strict undef' pragma if you really want it.


An unassigned variable is very different, and is a compile-time concept.
Static flow control can find accesses of not definitely assigned local
variables, like this:

  my Animal $pet;
  given $kind {
  when 'dog': $pet = new Dog;
  when 'cat': $pet = new Cat;
  when 'none': $pet = undef;
  }
  return $pet;

Static flow control analysis can see that, where $kind not in ('dog', 'cat',
'none'), $pet will not be definitely assigned in the return statement. To
ensure definedness, there must be a default case. Perhaps $pet's
compiler-supplied default value is okay, but the programmer's intent isn't
explicit in the matter. Note that in the case of $kind == 'none', $pet IS
assigned: it's assigned undef.

While flow control analysis requires some additional work to avoid reliance
on default values, I find that work to be less than the work debugging the
bugs introduced because such checks aren't performed in the first place. It
also allows for very strong guarantees; i.e., "I know this variable cannot
be undefined because I never assign undef to it, and the compiler would tell
me if I accessed it without assigning to it."

This is what 'use strict' should evolve toward, in my mind.


If you see 'never-assigned' and 'assigned-but-unset' as distinct 
in practical use, then maybe we need to add a method/property to all 
containers that is used like .defined(), such as .unassigned() ... 
but I can't say it's needed.


-- Darren Duncan