Hi
I found some strange column alias behaviour:
select val1+val2 as val
from some_table
group by val;
result - OK
select val1+val2 as val
from some_table
order by val;
result - OK
select val1+val2 as val
from some_table
group by val having val1+val2 > 1;
result - OK
select val1+val2 as val
from
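Putting the cases together, the behaviour is consistent with how PostgreSQL resolves output-column aliases: GROUP BY and ORDER BY may reference the alias, while HAVING has to repeat the underlying expression. A combined sketch using the poster's table and columns (the `> 1` condition is illustrative):

```sql
-- the alias "val" resolves in GROUP BY and ORDER BY,
-- but HAVING must spell out the expression again
SELECT val1 + val2 AS val
FROM some_table
GROUP BY val                 -- OK: alias accepted
HAVING val1 + val2 > 1       -- alias "val" would not be accepted here
ORDER BY val;                -- OK: alias accepted
```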
Hi all,
I have the following firewall connection data.
      datetime       | protocol | port | inside_ip | outside_ip | outbound_count | outbound_bytes
---------------------+----------+------+-----------+------------+----------------+----------------
2004-05-05 05:00:00 |
On Thu, May 27, 2004 at 11:14:58 +,
Willem de Jong [EMAIL PROTECTED] wrote:
If I do a sum(time), the result is like this: '1 day 18:00:00'. But I'd
like to get a result like this: '42:00:00'.
How can I achieve this with a query?
You can do something like the following: (not completely
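One common approach, sketched here assuming an interval column named `time` in a table `log` (both names are illustrative), is to go through the total number of seconds with EXTRACT(EPOCH ...) and rebuild the HH:MM:SS string by hand, so hours are not folded into days:

```sql
-- total seconds of the summed interval, reassembled as hours:minutes:seconds
SELECT lpad((EXTRACT(EPOCH FROM sum(time))::int / 3600)::text, 2, '0')
       || ':' ||
       lpad(((EXTRACT(EPOCH FROM sum(time))::int % 3600) / 60)::text, 2, '0')
       || ':' ||
       lpad((EXTRACT(EPOCH FROM sum(time))::int % 60)::text, 2, '0') AS total
FROM log;
```

With this, a sum of '1 day 18:00:00' comes out as '42:00:00', since the hours field is computed from the full epoch rather than wrapping at 24.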
PostgreSQL 7.4.2 ...
Background: I'm attempting to migrate tables which were created in the
pre-schema days to a sensible schema setup. I'm using the uniqueidentifier
column in some of these tables. When I created the new schema, I created an
instance of uniqueidentifier and its supporting
Chris Gamache [EMAIL PROTECTED] writes:
I'm using the uniqueidentifier column in some of these tables. When
I created the new schema, I created an instance of uniqueidentifier
and its supporting functions and casts within the new schema. When I
try to INSERT INTO myschema.mytable ... SELECT
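Since the message is cut off, this is only a guess, but one thing worth checking in this situation is whether the new schema is on the search_path when the INSERT runs, because the uniqueidentifier type and its support functions live inside `myschema`. A sketch using the names from the message (`old_table` and its columns are hypothetical):

```sql
-- make objects in the new schema resolvable without qualification
SET search_path TO myschema, public;

-- the type can also be schema-qualified explicitly in the cast
INSERT INTO myschema.mytable (id)
SELECT old_id::myschema.uniqueidentifier
FROM public.old_table;
```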
Hi,
After adding a primary key to one of the participating tables,
the query never finishes. The live table has a primary key,
so it cannot be removed. I made a copy of the live table
using create table t_a as select * from tab; the query works
fine. When I add the pkey like I have in the live
[EMAIL PROTECTED] writes:
tradein_clients=# explain analyze select email_id ,email ,contact from
t_a a join email_source f using(email_id) join email_subscriptions h
using(email_id) where 1=1 and f.source_id =1 and h.sub_id = 3 ;
Runs forever.
So what does plain explain say about it?
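Plain EXPLAIN prints the estimated plan without executing the query, so it returns immediately even when EXPLAIN ANALYZE never finishes. It is also worth running ANALYZE on a table freshly created with CREATE TABLE ... AS, since it starts with no planner statistics. Using the query from the message:

```sql
-- collect statistics for the freshly copied table
ANALYZE t_a;

-- estimated plan only; the query itself is not run
EXPLAIN
SELECT email_id, email, contact
FROM t_a a
JOIN email_source f USING (email_id)
JOIN email_subscriptions h USING (email_id)
WHERE f.source_id = 1 AND h.sub_id = 3;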
Sorry if this is confusing; it is somewhat difficult to explain.
I find myself frequently creating solutions to the same problem. I'm not
yet happy with the way I've done any of them and I'd like to find a purely
SQL way of doing this if possible.
Here's what I have. For a contrived
Ideally, I'd like to figure out a single SQL query that can be run
afterwards to clean up the dsply_order to make sure that each number occurs
only one time and that there are no gaps.
Well... by far the easiest way to approach this is not to clean up the
gaps. Removing gaps will only make
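For what it's worth, if the gaps do need to go, one purely-SQL way (a sketch assuming a table `items` whose `dsply_order` values are unique; on 7.4-era PostgreSQL there are no window functions) is to set each row's position to the count of rows at or below it:

```sql
-- renumber dsply_order to 1..n with no gaps or duplicates
UPDATE items i
SET dsply_order = (SELECT count(*)
                   FROM items i2
                   WHERE i2.dsply_order <= i.dsply_order);
```

This works in a single statement because each row's new value depends only on the old ordering, which is read consistently within the one UPDATE.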