chromatic wrote:
> On Thursday 19 February 2009 04:05:37 Ovid wrote:
> 
>> Properly, if we want to report SKIPs for each test (presumably with
>> numbers), then we want to report failing TODOs with each test for
>> consistency's sake.
> 
> I don't like that.  You can already get this behavior with the --directives 
> flag from prove, and it subverts the point of the TODO directive: run the 
> test, but expect it to fail and don't make a big deal of it.
> 
> If you report failing TODOs (sorry, *passing* TODOs), you might as well 
> report passing oks.

I agree.

The same argument can be made for skips: they are passing tests, and passing
tests should be quiet.  I suspect that decision was made sometime in the past
and we've since forgotten it.

OTOH normal TODO tests have no user-serviceable parts.  The author expects
them to fail.  They fail.  There's nothing the user can do about that.
Informing the average user prompts no new action.  It's just clutter.

Skipped tests can sometimes be turned on by the user.  For example...

$ prove -l t/is_deeply_with_threads.t
t/is_deeply_with_threads....skipped: many perls have broken threads.  Enable
with AUTHOR_TESTING.

Or

$ prove -l t/pod_coverage.t
t/pod_coverage....skipped: Install Pod::Coverage to run

Those messages are informative for the user and suggest an action to be taken.
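For reference, both kinds of message come from a whole-file skip.  A minimal
sketch using the standard Test::More API (the env-var name and skip reasons
are just the examples above):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Test::More;

# Skip the whole file unless the user opts in via an environment
# variable -- the AUTHOR_TESTING example above.
plan skip_all => 'many perls have broken threads.  Enable with AUTHOR_TESTING'
    unless $ENV{AUTHOR_TESTING};

# Skip the whole file when an optional prerequisite is missing --
# the Pod::Coverage example above.
plan skip_all => 'Install Pod::Coverage to run'
    unless eval { require Pod::Coverage; 1 };

plan tests => 1;
pass('prerequisites available');
```

When either skip_all fires, Test::More emits "1..0 # SKIP <reason>", and
prove displays the reason next to the file name, as shown above.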

Currently prove shows the skip message when the whole test file is skipped,
but not when individual tests are skipped.  If there's a consistency problem,
that's it.
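For contrast, an individual skip looks like this (same hypothetical
Pod::Coverage reason as above); prove currently swallows this reason:

```perl
use strict;
use warnings;
use Test::More tests => 2;

pass('this test always runs');

# A per-test skip inside a SKIP block.  The reason goes into the TAP
# stream ("ok 2 # skip ..."), but prove does not display it.
SKIP: {
    skip 'Install Pod::Coverage to run', 1
        unless eval { require Pod::Coverage; 1 };

    pass('runs only when Pod::Coverage is installed');
}
```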

OTGH I have had more than one novice Perl user wonder what to do about those
skip messages.  Which brings us back to the principle that passing tests
should be quiet.


-- 
...they shared one last kiss that left a bitter yet sweet taste in her
mouth--kind of like throwing up after eating a junior mint.
    -- Dishonorable Mention, 2005 Bulwer-Lytton Fiction Contest
           by Tami Farmer
