Re: What to do....

2003-11-14 Thread chromatic
On Fri, 2003-11-14 at 22:23, Rod Adams wrote:

> (If there are others working in the shadows back there, please make 
> yourselves heard.)

Allison Randal, Dan Sugalski, Hugo van der Sanden, and I usually help
out.

> Can apocalypses be something more along the lines of scratches on the wall, 
> that then go through some level of deciphering or translation into 
> something closer to English? Are there topics that need brainstorming that 
> this list could take over?

Probably not as such.  The Perl 6 RFC process demonstrated fairly
convincingly that there still needs to be one coherent design that takes
into account all of the various desires and uses.

Larry is shockingly good at that synthesis.  (Just ask Piers Cawley;
he'll wax eloquent on the subject.)  On the other hand, after every
Apocalypse and Exegesis, the discussion here exposes certain confusing
spots and improvements to the vision.  It has to be synthesized first
though.

Or syncretized.

> I certainly don't want the language to lose the internal cohesiveness that 
> all languages need, and am suitably scared of "design by committee"... but 
> I'd like to think that there's something that could be done to help matters.

I'd really like to see people start turning the existing design
documents into story cards and programmer tests for Perl 6.  That'll
make it much easier to implement the thing.

Design decisions have to be broken into individual tasks at some point. 
Sure, some of them will change as we go along.  There's enough there
that can be implemented now, though, without waiting for the big thud of
specifications.   There's plenty of useful work to go around.

Running test cases are *much* easier to implement against than anything
else.

(Hey, it's been working fairly well on the Perl XP Training Wiki: 
http://xptrain.perl-cw.com/).

-- c



What to do....

2003-11-14 Thread Rod Adams
So I've been lingering around p6-language for a few months now, and have 
noticed the following two trends:

1) All of the work forward on p6 design seems to come from either Larry or 
Damian. (If there are others working in the shadows back there, please make 
yourselves heard.) Most, if not all, of the recent discussions have been of 
the form "How does [feature X] work in relation to [feature Y] mentioned in 
[Apocalypse Z]?". While meaningful and worthwhile topics all, they do not 
drive the language forward terribly fast.

2) Reality is constantly interrupting Larry and Damian's efforts in rather 
nasty ways.

Taken separately, either of these trends is bothersome.
Taken together, this feels like a problem.
So the next question is, is there anything that can be done to improve matters?

I'm moderately certain that everyone wishes they could do something about 
#2; I'm also moderately sure that the p6 community has done as much as it 
can on that account.

So my real question is, is there any way for the community to get together 
and help take some of the load off these two in the design, or is the 
current process the Best We Can Do (tm), and we just need to practice that 
most unvirtuous of things, "patience"?

Can apocalypses be something more along the lines of scratches on the wall, 
that then go through some level of deciphering or translation into 
something closer to English? Are there topics that need brainstorming that 
this list could take over?

I certainly don't want the language to lose the internal cohesiveness that 
all languages need, and am suitably scared of "design by committee"... but 
I'd like to think that there's something that could be done to help matters.

Comments?
Suggestions?
-- Rod Adams

PS -- I'm willing to commit several hrs a week to the effort.


Re: This week's summary

2003-11-14 Thread Piers Cawley
Leopold Toetsch <[EMAIL PROTECTED]> writes:

> Piers Cawley wrote:
>
>>   "newsub" and implicit registers
>> [...] ops [...] that IMCC needed to
>> track. Leo has a patch in his tree that deals with the issue.
>
> Sorry, my posting seems to have been misleading. The register tracking
> code is in the CVS tree.

I seem to be doing rather well at misrepresenting you in the summaries
recently. Maybe we should make that the new running joke.



Re: Refactoring a test program: advice sought

2003-11-14 Thread Michael G Schwern
On Sat, Nov 15, 2003 at 02:51:26PM +1100, Andrew Savige wrote:
> Michael G Schwern wrote:
> > I use t/lib so the top level t/ directory doesn't get cluttered (and for
> > compatibility with the Perl core which may be important later for A::T).
> 
> Yes, I like that. Should I call it:
>   t/lib/Test/Archive/Tar...
> or:
>   t/lib/Archive/Tar/Test...
> or something else?

It'll never get installed so it doesn't matter.  The only real concern
is that the name doesn't clash with another module that Archive::Tar
might use.

Don't drive yourself nuts over it; it's only for AT internal use, so the
naming isn't that important.  You'll notice ExtUtils::MakeMaker uses
MakeMaker::Test instead of ExtUtils::MakeMaker::Test because I didn't want
to type as much.  I might have even gone with MM::Test.  t/lib/AT/Test would 
be fine.  You can always change it later if there's a problem.


> I took a quick look at mod_perl and Template Toolkit (TT).
> TT has a:
>   lib/Template/Test.pm
> which looks wrong to me (should that not be under t/lib instead?).

*shrug*  Maybe they want it to be installed.  Maybe it's useful for people
testing programs that use TT.  Maybe they didn't think of t/lib.  It's at 
least a fully formed module with docs and all rather than 
just a little test utility thing or bit of dummy data.


> Not sure, but mod_perl seems to have unbundled the test suite
> into a separate Apache-Test distribution. Again, why should
> that be called Apache-Test rather than Test-Apache?

Apache::Test was originally bundled with mod_perl (still is) but was recently
dual-lifed as a stand-alone dist.  Because everything else in mod_perl is 
in the Apache:: namespace they probably figured it made sense to call it 
Apache::Test instead of Test::Apache.  That's what I figure.


-- 
Michael G Schwern[EMAIL PROTECTED]  http://www.pobox.com/~schwern/
Playstation?  Of course Perl runs on Playstation.
-- Jarkko Hietaniemi


Re: Refactoring a test program: advice sought

2003-11-14 Thread Andrew Savige
Michael G Schwern wrote:
> I use t/lib so the top level t/ directory doesn't get cluttered (and for
> compatibility with the Perl core which may be important later for A::T).

Yes, I like that. Should I call it:
  t/lib/Test/Archive/Tar...
or:
  t/lib/Archive/Tar/Test...
or something else?

I took a quick look at mod_perl and Template Toolkit (TT).
TT has a:
  lib/Template/Test.pm
which looks wrong to me (should that not be under t/lib instead?).

Not sure, but mod_perl seems to have unbundled the test suite
into a separate Apache-Test distribution. Again, why should
that be called Apache-Test rather than Test-Apache?

/-\


http://personals.yahoo.com.au - Yahoo! Personals
New people, new possibilities. FREE for a limited time.


Re: Darwin issues

2003-11-14 Thread Jeff Clites
Hi Chris:

I haven't had any problems such as this on Mac OS X--either 10.2.6 or 
10.3. What are the contents of your "myconfig" file? Here are the 
contents of mine, for comparison:

Summary of my parrot 0.0.13 configuration:
  configdate='Fri Nov 14 18:23:39 2003'
  Platform:
osname=darwin, archname=darwin
jitcapable=1, jitarchname=ppc-darwin,
jitosname=DARWIN, jitcpuarch=ppc
execcapable=1
perl=perl
  Compiler:
cc='cc', ccflags='-g -pipe -pipe -fno-common -no-cpp-precomp 
-DHAS_TELLDIR_PROTOTYPE  -pipe -fno-common -Wno-long-double ',
  Linker and Libraries:
ld='cc', ldflags='  -flat_namespace ',
cc_ldflags='',
libs='-lm'
  Dynamic Linking:
so='.so', ld_shared=' -flat_namespace -bundle -undefined suppress',
ld_shared_flags=''
  Types:
iv=long, intvalsize=4, intsize=4, opcode_t=long, opcode_t_size=4,
ptrsize=4, ptr_alignment=4 byteorder=4321,
nv=double, numvalsize=8, doublesize=8

I'm thinking that there must be something else causing your problem, as 
there are, in general, no problems building parrot on Mac OS X--for me, 
parrot builds without modification on Mac OS X, and (currently) passes 
all tests via the standard "perl Configure.pl; make; make test" 
sequence. (I'm currently using gcc 3.3 on Panther, but I believe I was 
using gcc 3.1 previously on Jaguar. I see your gcc is marked 
prerelease--it may be that you need an updated version of the dev. 
tools.)

Jeff

On Nov 14, 2003, at 5:29 PM, [EMAIL PROTECTED] wrote:

> Apple shipped a linker that doesn't work well with a lot of projects 
> unless they recognize it.  It requires that the link phase of any 
> c/c++ compilation add a -lcc_dynamic flag.  I was able to do a manual 
> compilation of many things by adding that flag, but that gets to be 
> tedious.  When I was looking for a good spot to type the flag in once 
> and then have it just work, I tried the main Makefile's LD_FLAGS setting 
> and others to no avail...  I have just started with parrot, so I don't 
> know my way around well enough yet to get a feel for that.
> 
> After finishing the main `make` process, I went on to try `make 
> test`... It passed most of them except the ones in t/src/ because they 
> also needed -lcc_dynamic.
> 
> If you need more information about my system so you can write tests 
> with darwin in mind, just holler.
> 
> btw:
> gcc (GCC) 3.1 20020420 (prerelease)
> Mac OS version 10.2.8 (iBook 800Mhz G3, 512MB RAM)
> Chris




Darwin issues

2003-11-14 Thread mooresan
Apple shipped a linker that doesn't work well with a lot of projects 
unless they recognize it.  It requires that the link phase of any c/c++ 
compilation add a -lcc_dynamic flag.  I was able to do a manual 
compilation of many things by adding that flag, but that gets to be 
tedious.  When I was looking for a good spot to type the flag in once 
and then have it just work, I tried the main Makefile's LD_FLAGS setting 
and others to no avail...  I have just started with parrot, so I don't 
know my way around well enough yet to get a feel for that.

After finishing the main `make` process, I went on to try `make 
test`... It passed most of them except the ones in t/src/ because they 
also needed -lcc_dynamic

If you need more information about my system so you can write tests with 
darwin in mind, just holler.

btw:
gcc (GCC) 3.1 20020420 (prerelease)
Mac OS version 10.2.8 (iBook 800Mhz G3, 512MB RAM)
Chris



Re: Calling conventions. Again

2003-11-14 Thread Tim Bunce
Does C++ style 'name mangling' have any relevance here?

I also had some half-baked thought that a HLL could generate
two entry points for a prototyped sub...

one with the mangled name encoding the expected arguments and types
(p/s/i) for high-speed no-questions-asked-nothing-checked use, and...

also generate a non-mangled entry point that would check/massage
the params and then call the name-mangled entry point.

Am I totally mad?

Tim [ducking]


[Commit] String iterator

2003-11-14 Thread Peter Gibbs
First draft of a string iterator has been committed. This is currently
only used by the hash_string_equal function; usage will be extended
shortly to other character loops.

Performance enhancement ended up less than preliminary tests indicated,
but anything is better than nothing!

The hash-utf8 benchmark went from approx. 4.01/5.88 seconds to 3.97/5.31
seconds on my system.

The behaviour of the decode_and_advance functions in the various
encodings still needs proper testing; this will follow during the
weekend.

Regards
Peter Gibbs
EmKel Systems



Re: Calling conventions. Again

2003-11-14 Thread Matt Fowles
All~

This might have already been suggested, or there might be a good reason 
not to do this, but here is my idea.

Why not have unprototyped calls pass an array in P5 and a hash in P6?  The 
array can hold the first n positional arguments (possibly 0, for which 
Null could be passed to avoid creating the array) and the hash can hold 
the remaining m positional or name based arguments (also possibly 0).

Prototyped calls then pass things in the exact registers they need: 
I5-I15 for integers, S5-S15 for strings, etc...  and prototyped calls can 
only be made when the full types are EXACT (i.e. no default values, or 
anything else...).

This allows the speed freaks to have a fast prototyped call with no 
runtime checks, and us mere mortals can go the slow way...
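The unprototyped scheme above can be sketched in Python terms (names hypothetical): one array of leading positional arguments plus one hash of remaining positional-or-named arguments, either of which may be empty (Null in the actual proposal, to avoid building the aggregate at all):

```python
# Hypothetical sketch: the callee binds its parameters from the two
# aggregates an unprototyped call would pass (the "P5" array and the
# "P6" hash).

def unprototyped_greet(positional, named):
    # take the name positionally if present, else by name, else default
    name = positional[0] if positional else named.get("name", "world")
    return "Hello, %s!" % name

# Unprototyped call sites just build the two aggregates:
unprototyped_greet(["Matt"], {})          # positional form
unprototyped_greet([], {"name": "Matt"})  # named form
```

A prototyped call would bypass both aggregates and use the exact registers directly.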

Matt



Re: [perl #24489] intro.pod contains a slight error.

2003-11-14 Thread chromatic
On Thu, 2003-11-13 at 01:04, [EMAIL PROTECTED] (via RT) wrote:

> I hope this is the correct place to send this.
> 
> intro.pod contains an error in one of the examples.

Thanks, applied!

-- c



Re: Calling conventions. Again

2003-11-14 Thread Melvin Smith
At 05:23 PM 11/14/2003 +0100, Leopold Toetsch wrote:
>Dan Sugalski <[EMAIL PROTECTED]> wrote:
>> ... It happens, in some cases a *lot*. This is perl,
>> python, and ruby we're talking about, where changing the definition of a
>> sub is as trivial as a reference assignment into a global hash. It's easy,
>> people do it. Often, in some cases. (Heck, I've done it)
>
>It's definitely simpler than changing libc, yes. So if it's necessary to
>die nicely instead of segfaulting, then I'm for a compile option, to be
>able to turn these tests off --cause-i-know-what-im-doing.
>
>> Methods also cause significant headaches, since there are *no* signatures
>> available to the calling code, as there's no way for it to look up the
>> signature.
>
>If there are prototypes - which could be optional - we could check at
>least at compile time.
>
>>> e.g. version checking at program load.
>
>> Which doesn't solve the problem.
>
>No, doesn't solve. But param counts don't either. Both may help a bit.
Dan, I think we will still _need_ to support version based deprecation.
Certainly situations will arise with the current scenario when
libraries change in such a way that they don't even detect incompatibilities.
The easy situation is when argument counts change, but the hard situation is
when semantics have changed. In that case we have to have some
sort of version requirement in the bytecode.
-Melvin




Re: Calling conventions. Again

2003-11-14 Thread Leopold Toetsch
Dan Sugalski <[EMAIL PROTECTED]> wrote:
> ... It happens, in some cases a *lot*. This is perl,
> python, and ruby we're talking about, where changing the definition of a
> sub is as trivial as a reference assignment into a global hash. It's easy,
> people do it. Often, in some cases. (Heck, I've done it)

It's definitely simpler than changing libc, yes. So if it's necessary to
die nicely instead of segfaulting, then I'm for a compile option, to be
able to turn these tests off --cause-i-know-what-im-doing.

> Methods also cause significant headaches, since there are *no* signatures
> available to the calling code, as there's no way for it to look up the
> signature.

If there are prototypes - which could be optional - we could check at
least at compile time.

>> e.g. version checking at program load.

> Which doesn't solve the problem.

No, doesn't solve. But param counts don't either. Both may help a bit.

>   Dan

leo


Re: Calling conventions. Again

2003-11-14 Thread Dan Sugalski
On Fri, 14 Nov 2003, Leopold Toetsch wrote:

> Dan Sugalski <[EMAIL PROTECTED]> wrote:
>
> > I've seen it with some depressing regularity over the years. It generally
> > takes the form of an upgrade to a library that breaks existing
> > executables, something we're going to have to deal with as we're looking
> > to encourage long-term use of bytecode-compiled programs.
>
> This seems to me the same as saying that strcpy(3) should be guarded
> against a wrong argument count at runtime. But swapped destination/source can't be
> detected anyway ;)

We can't detect bugs like that, true. But we can detect when someone calls
us with two arguments and someone has, in the mean time, "helpfully" added
an optional length arg to strcpy.
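The guard Dan describes amounts to the caller passing its argument count so the callee can fail cleanly when the signature has drifted. A rough Python stand-in (all names hypothetical; Parrot would do this against the passed-in register counts):

```python
# Hypothetical sketch of a runtime arg-count guard. The caller states how
# many arguments it supplied; the callee compares that with what the
# *current* signature accepts, so an old two-arg caller of a sub that
# grew a mandatory third parameter gets a clean error instead of
# reading garbage registers.

def checked_strcpy(argc, dest, src, length=None):
    # the current signature accepts 2 or 3 arguments
    if not (2 <= argc <= 3):
        raise TypeError("strcpy called with %d args, expected 2 or 3" % argc)
    data = src if length is None else src[:length]
    dest[:] = data          # copy characters into the destination buffer
    return dest

buf = []
checked_strcpy(2, buf, "hi")   # old two-arg caller still works
```

If the library later made `length` mandatory, it would tighten the check to `argc == 3`, and the stale two-arg call site fails with a TypeError rather than a segfault.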

> > ... But there are
> > several issues here:
>
> > 1) vararg calls with non-pmc registers involved
>
> I already did propose the syntax:
[Snip]
> at least, so that runtime checks can be omitted for certain cases.

No IMCC syntax that's purely compile-time is of any help here. The code
doing the calling

> > 2) Runtime modification of sub definitions
>
> are evil, forbidden, disallowed. This just can't work.

True, false, false. It happens, in some cases a *lot*. This is perl,
python, and ruby we're talking about, where changing the definition of a
sub is as trivial as a reference assignment into a global hash. It's easy,
people do it. Often, in some cases. (Heck, I've done it)

Methods also cause significant headaches, since there are *no* signatures
available to the calling code, as there's no way for it to look up the
signature. (And yeah, that's a reasonable argument for all method calls to
be unprototyped, but I'm not sure I want to place that restriction in
right now)

> > 3) Drift in interface definitions
>
> needs code adaption and recompiling.

To operate properly, yes. To fail properly, no.

> > ... Because of that we have to pass in sufficient information to
> > validate things at the interface, which means at least arg counts.
>
> If someone changes libc behind the curtain in an incompatible way,
> existing programs probably just segfault.

Yes, they do. For us that's unacceptable--we have to be able to let code
provide at least some boundary guarantees with safe failure modes.

> > If someone wants to propose we have an alternate, more static convention
> > that lends itself better to one-off static linking with link-time
> > signature checking for verification, which is what the complaints all seem
> to allude to, well... go ahead and if you do we'll see where we go from
> > there.
>
> e.g. version checking at program load.

Which doesn't solve the problem. Ask for version 1.20 or higher, get
version 1.33, and find the interface has changed. (An interface that was
fine in versions 1.20 through 1.32) This happens, with some frequency.

Dan

--"it's like this"---
Dan Sugalski  even samurai
[EMAIL PROTECTED] have teddy bears and even
  teddy bears get drunk



Re: Calling conventions. Again

2003-11-14 Thread Leopold Toetsch
Dan Sugalski <[EMAIL PROTECTED]> wrote:

> I've seen it with some depressing regularity over the years. It generally
> takes the form of an upgrade to a library that breaks existing
> executables, something we're going to have to deal with as we're looking
> to encourage long-term use of bytecode-compiled programs.

This seems to me the same as saying that strcpy(3) should be guarded
against a wrong argument count at runtime. But swapped destination/source can't be
detected anyway ;)

> ... But there are
> several issues here:

> 1) vararg calls with non-pmc registers involved

I already did propose the syntax:

  .sub prototyped var_args

If the HLL can't provide this information, we could use the opposite:

  .sub prototyped fixed_args

at least, so that runtime checks can be omitted for certain cases.

> 2) Runtime modification of sub definitions

are evil, forbidden, disallowed. This just can't work.

> 3) Drift in interface definitions

needs code adaption and recompiling.

> ... Because of that we have to pass in sufficient information to
> validate things at the interface, which means at least arg counts.

If someone changes libc behind the curtain in an incompatible way,
existing programs probably just segfault.

> If someone wants to propose we have an alternate, more static convention
> that lends itself better to one-off static linking with link-time
> signature checking for verification, which is what the complaints all seem
> to allude to, well... go ahead and if you do we'll see where we go from
> there.

e.g. version checking at program load. The main program has something
like:

  load_lib "mylib", "0.22"

If the lib version doesn't match, we spit out a warning. If the lib is
compatible the code can get adjusted to read:

  load_lib "mylib", "0.22-0.24"

There could be some implicit rules that state, e.g., that versions with the
same major version number are compatible by default. Module authors and
users have responsibilities which we can't solve with runtime checks.
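The range check load_lib would perform could look something like this (a Python stand-in; the parsing rules for a bare version vs. a "lo-hi" range are my assumption):

```python
# Hypothetical sketch of the load_lib version check. "0.22-0.24" means
# any version in that inclusive range; a bare "0.22" means exactly that
# version.

def version_ok(wanted, actual):
    def parse(v):
        # "0.23" -> (0, 23), so comparisons are numeric, not lexical
        return tuple(int(x) for x in v.split("."))
    if "-" in wanted:
        lo, hi = wanted.split("-")
        return parse(lo) <= parse(actual) <= parse(hi)
    return parse(wanted) == parse(actual)
```

An implicit same-major-version rule would just be another branch before the exact-match fallback.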

>   Dan

leo


Re: Calling conventions. Again

2003-11-14 Thread Dan Sugalski
On Thu, 13 Nov 2003, Pete Lomax wrote:

> I'd be interested to see what sort of signature changes between
> compile and runtime you think are likely to happen, as I have to admit
> I have never encountered such a beast. Doesn't that force
> non-prototyped calls?

I've seen it with some depressing regularity over the years. It generally
takes the form of an upgrade to a library that breaks existing
executables, something we're going to have to deal with as we're looking
to encourage long-term use of bytecode-compiled programs. But there are
several issues here:

1) vararg calls with non-pmc registers involved
2) Runtime modification of sub definitions
3) Drift in interface definitions

There are definite performance issues--there are at least four integer
stores and, for paranoid subs, four integer comparisons. The calling
conventions make it reasonably clear, though, that they're there because
definitions may change, and the engine doesn't place restrictions on how
they change. Because of that we have to pass in sufficient information to
validate things at the interface, which means at least arg counts.

It's easy to lose sight of the characteristics of our target languages
since we don't have any fully-functional compilers for them yet, so we've
got to be careful. Dynamism is fundamental to the engine and the calling
conventions are a recognition of that fact. (Doesn't matter whether I
*like* it or not, it's the reality we have to deal with)

If someone wants to propose we have an alternate, more static convention
that lends itself better to one-off static linking with link-time
signature checking for verification, which is what the complaints all seem
to allude to, well... go ahead and if you do we'll see where we go from
there.

Dan

--"it's like this"---
Dan Sugalski  even samurai
[EMAIL PROTECTED] have teddy bears and even
  teddy bears get drunk



Re: Review of a book about VM

2003-11-14 Thread Stéphane Payrard
On Fri, Nov 14, 2003 at 11:49:51AM -, Peter Cooper wrote:
> "Stéphane Payrard" <[EMAIL PROTECTED]> wrote:
> > I have bought "Virtual Machine Design and Implementation in C++"
> > by Bill Blunden. This book has very positive reviews (see
> > slashdot or amazon.com). It seems to impress people by the
> > apparent width of covered topics. Most of it is off topic. The
> > book gives to the moderately knowledgeable reader no insight
> > about virtual machines or even about language compilation.
> 
> I reviewed it for Slashdot, and posted a mini-review here too. I saw the
> technical and design limitations, but enjoyed it nonetheless. On
> reflection, perhaps a more accurate title for the book would have been
> "Design and Implementation of /A/ Virtual Machine in C/C++".
> [clipped]

My bad opinion was so forceful that I thought it only honest to mention
that it is not widely shared.

You posted your mini-review to London.pm:
http://kochi.etla.org/london.pm/html/2002/06/msg00076.html

Piers Cawley will be happy to notice that Leon Brocard answered in
advance the question on the present thread that asked about a good
book on virtual machines: "No, there's surprisingly little out
there on virtual machine design and development."

--
 stef

> 
> Regards,
> Peter Cooper
> 
> 


Re: Word for the day: Undocumentation

2003-11-14 Thread Dan Sugalski
On Fri, 14 Nov 2003, Harry Jackson wrote:

> I have also been unable to find out if there is any sort of methodology
> to the testing. I have had a look through ./parrot/t/* and found a lot
> of test files but very little actual details on what each test was
> testing. I could infer from the code what each test was trying to
> achieve but some docs would be nice. If there are more docs can someone
> point me at them (I have read most of ./parrot/docs/*.pod).

Many of the tests in the t/ directory are there to test very specific,
low-level things--making sure we can create a PMC of some particular type,
or that an op works as documented. Some of the tests are a bit more
elaborate, trying to exercise full subsystems, but for the most part we're
making sure the tiny pieces work. (Which is good, as the big bits can't
work if the little ones don't)

We still don't cover all the extant op variants, and we could certainly
use larger and more abusive tests, so anything you're interested in
writing would be much appreciated.

Dan

--"it's like this"---
Dan Sugalski  even samurai
[EMAIL PROTECTED] have teddy bears and even
  teddy bears get drunk



Re: Review of a book about VM

2003-11-14 Thread Peter Cooper
"Stéphane Payrard" <[EMAIL PROTECTED]> wrote:
> I have bought "Virtual Machine Design and Implementation in C++"
> by Bill Blunden. This book has very positive reviews (see
> slashdot or amazon.com). It seems to impress people by the
> apparent width of covered topics. Most of it is off topic. The
> book gives to the moderately knowledgeable reader no insight
> about virtual machines or even about language compilation.

I reviewed it for Slashdot, and posted a mini-review here too. I saw the
technical and design limitations, but enjoyed it nonetheless. On
reflection, perhaps a more accurate title for the book would have been
"Design and Implementation of /A/ Virtual Machine in C/C++".

I don't think the book's target audience was those who want to become
authorities on virtual machines, or learn about every advanced topic. I
think Bill wanted to catch the people who come from a totally different area
of computer science, who want to get an insight into virtual machine design
and implementation at a basic level. The book certainly achieves something
for those people, as it did for me, and whets the appetite to go and learn
more from real 'hands on' projects like Parrot.

The knowledge you can get from the book, however, is more than enough to
have a far better understanding of how Parrot works and operates than before
you read it. At least, I found this to be the case. I also found the lack of
information on jitting to be a shame, but it's not a hard topic to pick up
externally after reading and understanding the rest.

As a view into how one man chose to design and implement /a/ virtual machine, I
consider the book invaluable. He certainly does quite a few things wrong,
and has limitations in his design, but someone who is competent will
recognize this, as you have, and use it as a learning and mind-expanding
experience. Not everyone who has studied computer science will have covered
all of the topics demonstrated in this book.

Regards,
Peter Cooper




Re: Word for the day: Undocumentation

2003-11-14 Thread Harry Jackson
Forgive me if I am looking in the wrong place for some of this stuff. I
am quite new to this.

--- Michael Scott <[EMAIL PROTECTED]> wrote:
> 
> I'm fine with that, I understand why - this is not a rant - but I do 
> think that Parrot has a steep learning curve and that good 
> documentation is essential if we want to lower it. The potential 
> benefits seem obvious.

I had a read through the intro.pod and decided that I might be able to
write some tests but I am having a hell of a time trying to find out
what tests have been written and which ones have not. I have written a
few _simple_ tests and deliberately broken a few others and I would
like to contribute some but I have no idea what needs doing. 

I have also been unable to find out if there is any sort of methodology
to the testing. I have had a look through ./parrot/t/* and found a lot
of test files but very little actual details on what each test was
testing. I could infer from the code what each test was trying to
achieve but some docs would be nice. If there are more docs can someone
point me at them (I have read most of ./parrot/docs/*.pod).

After all that I suppose I should volunteer for something. I have some
time on my hands at the moment and would like to get involved in some
fashion. Unfortunately I am not a C guru but I am quite happy to write
tests[0] in assembler or do documentation. In which areas do people
think documentation or tests are most needed? I would be happy to start
with the docs first until I am more comfortable with the code. Ideas,
advice?

H

[0] As soon as I am comfortable with the assembler, most of the easier
tests seem to have been written.

__
Do you Yahoo!?
Protect your identity with Yahoo! Mail AddressGuard
http://antispam.yahoo.com/whatsnewfree


Re: Word for the day: Undocumentation

2003-11-14 Thread Harry Jackson

Forgive me if I am looking in the wrong place for some of this stuff. I
only started looking at this today.

--- Michael Scott <[EMAIL PROTECTED]> wrote:
> 
> I'm fine with that, I understand why - this is not a rant - but I do 
> think that Parrot has a steep learning curve and that good 
> documentation is essential if we want to lower it. The potential 
> benefits seem obvious.

I had a read through intro.pod (found a very minor error; patch
submitted) and decided that I might be able to write some tests but I
am having a hell of a time trying to find out what tests have been
written and which haven't. I have written a few _simple_ tests and
deliberately broken a few others and I would like to write a few but I
have no idea what needs doing. 

I have also been unable to find out if there is any sort of methodology
to the testing. I have had a look through ./parrot/t/* and found a lot
of test files but very little actual details on what each test was
testing. I could infer from the code what most of the tests are trying
to achieve but some docs would be nice. If there are more docs can
someone point me at them (I have read most of ./parrot/docs/*.pod) and
any other pod I have been able to find.

After all that I suppose I should volunteer some time. I have some time
on my hands at the moment and would like to get involved in some
fashion. Unfortunately I am not a C guru but I am quite happy to write
tests[0] in assembler or do documentation. In what areas do people
think documentation or tests are most needed? I would be happy to start
with the docs first until I am more comfortable with the code. Ideas,
advice?

H

[0] As soon as I am comfortable with the assembler, most of the easier
tests seem to have been written.



Re: Calling conventions. Again

2003-11-14 Thread Leopold Toetsch
Pete Lomax <[EMAIL PROTECTED]> wrote:
> On Fri, 14 Nov 2003 08:12:26 +0100, Leopold Toetsch <[EMAIL PROTECTED]>
> wrote:

>>> _u_fred:
>>> I5=P3[1]
>>> S5=P3[2]
>>> _fred:
>>
>>There is no P3[] involved. "_fred" just starts with whatever is in
>>registers I5/S5.
> Yes, "_fred" wades straight in, expecting everything to be set up. It
> is _u_fred which is sucking them out of P3, and falling through.

P3 (*overflow* array) is only used if there are more than 11 arguments
of one kind. For an unprototyped call P5, P6 would be used in your
example.

> Pete

leo


Re: Calling conventions. Again

2003-11-14 Thread Pete Lomax
On Fri, 14 Nov 2003 08:12:26 +0100, Leopold Toetsch <[EMAIL PROTECTED]>
wrote:

>>  _u_fred:
>>  I5=P3[1]
>>  S5=P3[2]
>>  _fred:
>
>There is no P3[] involved. "_fred" just starts with whatever is in
>registers I5/S5.
Yes, "_fred" wades straight in, expecting everything to be set up. It
is _u_fred which is sucking them out of P3, and falling through.
If the function is always called prototyped, _u_fred won't be
referenced and imcc will strip that code (according to the Pirate
document, iirc). If it is called non-prototyped from several places,
it makes for a smaller footprint.
>
>Please read my proposal WRT default params posted elsewhere in this thread.
>The default value could be the result of an arbitrary expression, so it's
>not that simple.
On re-reading your post it seems I was trying to say the same thing.

Pete


Re: Calling conventions. Again

2003-11-14 Thread Leopold Toetsch
Pete Lomax <[EMAIL PROTECTED]> wrote:
> Hi,
> New to this list, so please excuse any glaring stupidity.

Welcome here.

> I'd be interested to see what sort of signature changes between
> compile and runtime you think are likely to happen, as I have to admit
> I have never encountered such a beast. Doesn't that force
> non-prototyped calls?

*Me* thinks Dan is considering calling into old and possibly outdated
libraries. Trying to catch such errors at subroutine level seems
strange.

>   procedure fred(integer x, string y)

>   _u_fred:
>   I5=P3[1]
>   S5=P3[2]
>   _fred:

There is no P3[] involved. "_fred" just starts with whatever is in
registers I5/S5.

> For optional parameter handling, I will do something like:

Please read my proposal WRT default params posted elsewhere in this thread.
The default value could be the result of an arbitrary expression, so it's
not that simple.

> Pete

leo