On Tue, 20 May 2008 08:45:47 +0100
James Urquhart <[EMAIL PROTECTED]> wrote:

> Alban,
> 
> On 18 May 2008, at 15:00, Alban Bedel wrote:
> 
> > On Fri, 16 May 2008 16:03:47 +0100
> > James Urquhart <[EMAIL PROTECTED]> wrote:
> >
> >> Hi all,
> >>
> >> I've been playing about with the haXe-based SCUMM interpreter
> >> recently, and i've pretty much come to the point where i have to
> >> check to see if i am actually processing everything correctly.
> >> Otherwise i'll probably bump into a nasty bug later down the road.
> >>
> >> Currently i am trying to fathom out how variables in SCUMM are
> >> stored at runtime. From what i have been able to gather, they
> >> should be word's, possibly 16 bits in length? However i have not
> >> been able to figure out if they should be signed or unsigned. Now
> >> i know there are quite a few other places in SCUMM where this is
> >> obvious, but i am quite simply stumped on this.
> >
> > Good question, honestly I haven't yet investigated that. My gut
> > guess would be that the stack and variables both use signed 16 bit.
> > However that should be tested, I'll look at that.
> 
> Looking at the ScummVM code, i see:
> int _vmStack[150];
> And...
> int32 *_scummVars;
> Which would make all of the variables *32* bits signed.
> In addition i cannot find any explicit reference to chopping bits
> off so i can only assume the ScummVM developers figured everything
> in SCUMM uses full-size 32bit signed integers. Perhaps.
> (Either that or no LEC SCUMM game ever relies on integer overflow.  
> Which may be so)

I did some tests and, as I suspected, the original VM uses signed 16-bit
values. For example:

  int a;
  a = 0x7000;
  a = a*4/5;

will give a different result in the DOTT interpreter than in ScummVM,
because 0x7000*4 overflows with 16 bits but not with 32. I'll submit a
patch for ScummVM to correct this.

I also tested the arrays, and it seems they are always signed 16-bit
too, no matter what type you declare (yes, even the bit arrays). But I
think I will still keep the "proper" sizes in scvm.

> >> I really need to figure this out as the only integer type i have
> >> available to me in haXe is an Int, which is 32 bits. So i have to
> >> do things such as perform an additional subtraction if i want to
> >> convert an unsigned number into a signed one (after &'ing with
> >> 0xFFFF). :(
> >>
> >> Any suggestions on how to properly handle this would be
> >> appreciated.
> >
> > This need a bit more investigations, but basically it will only
> > matter when converting between types of different size. From a
> > larger to a smaller type truncating (ie. &'ing) is enough. But from
> > a smaller type to a larger one sign extension might be needed.
> 
> Doing a few sanity tests in my browser, i can see that...
> 
> (-100 & 0xFFFF) > (100 & 0xFFFF) == true
> (100 & 0xFFFF) > (-100 & 0xFFFF) == false
> (-100 & 0xFFFF) == (100 & 0xFFFF) == true
> 
> So logically speaking, blindly &'ing the value isn't sufficient as  
> future comparisons will become invalid. Darn.

You are right: what is needed is to reduce the value modulo 0x10000 and
then sign-extend it back into the signed 16-bit range. The AND trick
alone only works for unsigned values. As the VM can only load 16-bit
values from the code, you only need to apply this wrapping after ops
that might overflow (+, -, *, inc, dec).

> If anything i should just store everything with signed 32bit
> integers. But there arises the "What if" with regards to deliberate
> overflow.

Seeing as the bug went unnoticed in ScummVM for years, it is probably
not a big deal.

> Regardless, i appreciate you looking into this. My mind is already a  
> mess trying to figure out the in's and out's of this stuff. :)

No problem :)

        Albeu


_______________________________________________
ScummC-general mailing list
[email protected]
https://mail.gna.org/listinfo/scummc-general
