On Sat, Apr 10, 2010 at 05:15:18PM, Alan Millington wrote:
> In C, if I declare a short variable shortvar, I can write an
> assignment shortvar = 1. The literal value 1 is an int, but the
> compiler will implicitly convert it to a short. Similarly, if I write
> a function func() which takes a short argument, then provided I use
> function prototypes, I can write func(1), and again the int will be
> implicitly converted.
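For comparison, PostgreSQL draws precisely the distinction at issue:
the int-to-smallint cast applies in assignment contexts but is not
considered during function-call resolution. A minimal sketch, assuming
a scratch table t and a SQL function func invented for illustration:

    CREATE TABLE t (shortvar smallint);
    INSERT INTO t VALUES (1);      -- accepted: assignment cast int -> smallint

    CREATE FUNCTION func(s smallint) RETURNS smallint
        AS 'SELECT $1' LANGUAGE SQL;

    SELECT func(1);                -- ERROR: function func(integer) does not exist
    SELECT func(1::smallint);      -- accepted once the cast is spelled out

The INSERT succeeds because pg_cast marks int4-to-int2 as an
assignment cast; function resolution considers only implicit casts,
hence the explicit spelling in the last call.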
On 9 Apr 2010, at 16:57, Alan Millington wrote:
> However, every language that I have ever used will implicitly convert an
> integer 1 to a smallint (or short) 1 when required to do so. How can such a
> cast be called "surprising behaviour", or produce "misleading results", to
> quote the first
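Part of the caution around this cast is the narrowing itself: a C
compiler will typically truncate an out-of-range int assigned to a
short (the result is implementation-defined), whereas PostgreSQL
range-checks the cast. An illustrative one-liner:

    SELECT 70000::smallint;        -- ERROR: smallint out of range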
I am performing some array membership operations (namely @> or <@)
on large arrays.

One of the arrays in each pair being compared is stored in a database
field. The other array in the pair is generated dynamically, from an
array intersection performed in another part of the larger query.
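For concreteness, the shape of the comparison described above, using a
hypothetical items table in place of the real schema:

    CREATE TABLE items (id serial PRIMARY KEY, tags int[]);
    INSERT INTO items (tags) VALUES ('{1,2,3}'), ('{2,3,4,5}');

    -- rows whose stored array contains the dynamically built array
    SELECT id FROM items WHERE tags @> ARRAY[2,3];

    -- rows whose stored array is contained in the built array
    SELECT id FROM items WHERE tags <@ ARRAY[1,2,3,4];

    -- on large tables a GIN index can service @> / <@ lookups
    CREATE INDEX items_tags_gin ON items USING gin (tags);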