On Tue, 16 Nov 2010, Vasiliy Kulikov wrote:
> On Mon, Nov 15, 2010 at 15:23 +0100, Julia Lawall wrote:
> > On Sun, 14 Nov 2010, Vasiliy Kulikov wrote:
> >
> > > On Sat, Nov 13, 2010 at 14:41 +0100, Julia Lawall wrote:
> > > > On Thu, 4 Nov 2010, Vasiliy Kulikov wrote:
> > > > > "any" is all combinations of possible sizeof()s (I mean not breaking
> > > > > 2 <= sizeof(short) <= sizeof(int) <= sizeof(long)).
> > > > >
> > > > > The default is some sane value for the current architecture, e.g.
> > > > > gcc's choice.
> > > >
> > > > Actually, this was all already implemented. It's even in the manual :)
> > > > -int_bits n and -long_bits n
> > >
> > > But not -size_t_bits: on Win64, sizeof(long) == 4 while
> > > sizeof(size_t) == sizeof(void*) == 8.
> >
> > What would be done with this information? -int_bits etc. are being
> > used to classify constant integers. Is the goal that, e.g., 12 should
> > also be inferred to be a size_t if it is within the right number of
> > bits?
>
> Actually, an integer constant is long long if it overflows long. I
> thought that (s)size_t may be equal to int, long or long long depending
> on the current arch and command-line arguments.
Sorry, but I'm still not sure what Coccinelle is supposed to do with the
information about the size of size_t. For example, suppose you have the
following rule:
@@
size_t x;
@@
- x
+ FOUND_A_SIZET(x)
And the following code:
int main () {
return 27;
}
Is there any circumstance in which you would expect this to generate:
int main () {
return FOUND_A_SIZET(27);
}
e.g. with a very small value of -size_t_bits? But this does not seem
desirable, because most constant integers are not sizes.
julia
_______________________________________________
Cocci mailing list
[email protected]
http://lists.diku.dk/mailman/listinfo/cocci
(Web access from inside DIKUs LAN only)