On 7/20/16 1:14 PM, Mark Lybarger wrote:
> This leads me to think I need to create 2^5, or 32, unique constraints to
> handle the various combinations of data that I can store.

Another option would be to create a unique index on a bit varying field
that sets a bit to true for each field that is NULL.
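
For concreteness, here is a rough sketch of what those two ideas could look
like in SQL; the index names, the empty-string sentinel, and the exact bit
encoding are illustrative guesses, not something spelled out in the thread.

-- One of the 2^5 = 32 partial unique indexes the first approach would
-- need; this particular one covers rows where only item_code is NULL.
create unique index order_item_uniq_item_code_null on order_item
    (order_id, make, model, reason, size)
    where item_code is null
      and make is not null and model is not null
      and reason is not null and size is not null;

-- The bit varying alternative as a single expression index: coalesce the
-- nullable columns so the indexed values are never NULL, and append a bit
-- string recording which columns really were NULL, so that an empty
-- string and a NULL remain distinct.
create unique index order_item_uniq_nullsafe on order_item (
    order_id,
    coalesce(item_code, ''),
    coalesce(make, ''),
    coalesce(model, ''),
    coalesce(reason, ''),
    coalesce(size, ''),
    (   (item_code is null)::int::bit
     || (make      is null)::int::bit
     || (model     is null)::int::bit
     || (reason    is null)::int::bit
     || (size      is null)::int::bit)
);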

On Wed, Jul 20, 2016 at 1:48 PM, David G. Johnston wrote:
> On Wed, Jul 20, 2016 at 2:14 PM, Mark Lybarger wrote:
>> Another solution I can think of is to just use a trigger to
>> prevent the duplicate rows.

If you go that route you will need to use serializable
transactions, explicit locking, or some other mechanism to
prevent race conditions between concurrent inserts.
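
To make the race concrete, here is a sketch of the kind of check-style
trigger being discussed (the function and trigger names are invented for
the example). The trigger's query sees only committed rows, so two
sessions inserting the same combination at the same time can each pass the
EXISTS test and both commit, unless they run at the serializable isolation
level or serialize themselves with an explicit lock.

create or replace function order_item_check_dupes() returns trigger as $$
begin
    -- IS NOT DISTINCT FROM treats two NULLs as matching, which is
    -- exactly the comparison a plain unique constraint will not do.
    if exists (select 1
                 from order_item o
                where o.order_id = new.order_id
                  and o.item_code is not distinct from new.item_code
                  and o.make      is not distinct from new.make
                  and o.model     is not distinct from new.model
                  and o.reason    is not distinct from new.reason
                  and o.size      is not distinct from new.size
                  and o.id <> new.id) then
        raise exception 'duplicate order_item for order %', new.order_id;
    end if;
    return new;
end;
$$ language plpgsql;

create trigger order_item_check_dupes
    before insert or update on order_item
    for each row execute procedure order_item_check_dupes();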

I have a relation such as

create table order_item (
    id              uuid not null primary key,
    order_id        numeric not null,
    item_code       text,
    make            text,
    model           text,
    reason          text,
    size            text,
    expiration_date timestamp
);

where the combination of the columns order_id, item_code, make, model,
reason, size must be unique. Everything other than id and order_id is
nullable, and a unique constraint never treats two NULLs as equal, so
identical rows that contain NULLs are all accepted.
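
A quick demonstration of the problem, assuming the multicolumn unique
constraint were simply added as below (the constraint name and sample
values are made up): both inserts succeed, because the NULLs in item_code
and reason are never considered equal to each other.

alter table order_item
    add constraint order_item_uniq
    unique (order_id, item_code, make, model, reason, size);

-- Both rows go in, even though they are identical apart from their ids.
insert into order_item (id, order_id, item_code, make, model, reason, size)
values ('00000000-0000-0000-0000-000000000001', 1, null, 'acme', 'x1', null, 'large');

insert into order_item (id, order_id, item_code, make, model, reason, size)
values ('00000000-0000-0000-0000-000000000002', 1, null, 'acme', 'x1', null, 'large');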