DaNieL wrote:
Hi guys, I'm trying to optimize a simple table meant to contain users.
So, my table at the moment is:

-----
CREATE TABLE contacts(
 id BIGSERIAL PRIMARY KEY NOT NULL UNIQUE,
 company_id BIGINT,
 code varchar(10),
 company_name varchar(120),
 name varchar(120),
 surname varchar(120),
 phone varchar(80),
 email varchar(80),
 kind varchar(8)
);
-----

I use this layout in order to store 3 kinds of users: Private, Company
and Company Employee. The column 'kind' will in fact contain just
'private', 'company' or 'employee', but it is unnecessary; I can tell
what kind a user is from these rules:
Private users are stand-alone; company_name and company_id are always
NULL.
Companies have the company_name.
Employees have the company_name and the company_id (holding the id of
the company's row).

The layout looks fine, although I would not use BIGSERIAL unless you expect to exceed 2.1 billion records.
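
Since 'kind' can be derived from the other columns, you could also make
those rules explicit in the schema so inconsistent rows can't slip in.
A rough sketch, assuming the columns above (the constraint name is
illustrative):

-----
ALTER TABLE contacts
 ADD CONSTRAINT contacts_kind_consistency CHECK (
  (kind = 'private'  AND company_name IS NULL     AND company_id IS NULL) OR
  (kind = 'company'  AND company_name IS NOT NULL AND company_id IS NULL) OR
  (kind = 'employee' AND company_name IS NOT NULL AND company_id IS NOT NULL)
 );
-----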

Example:

id | company_id | code | company_name | name | surname | phone | email            | kind
 1 | NULL       | C001 | Sunday Inc.  | John | Doe     | 88888 | j...@sunday.com  | company
 2 | 1          | E001 | Sunday Inc.  | Paul | Smith   | 77777 | sm...@sunday.com | employee
 3 | NULL       | P001 | NULL         | Rose | Mary    | 66666 | r...@mary.com    | private

So, first of all, does this layout look good?
Before, I used to keep employees in a different table, because usually
employees have just a few fields (name, surname, direct email and
direct phone; all the addresses, bank data, etc. belong to the
company), but now I prefer this layout to avoid constant inner joins.
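
For comparison, that separate-table layout would look roughly like this
(a sketch only; table and column names are illustrative):

-----
CREATE TABLE companies(
 id BIGSERIAL PRIMARY KEY,
 code varchar(10),
 company_name varchar(120),
 phone varchar(80),
 email varchar(80)
);

CREATE TABLE employees(
 id BIGSERIAL PRIMARY KEY,
 company_id BIGINT NOT NULL REFERENCES companies(id),
 name varchar(120),
 surname varchar(120),
 phone varchar(80),
 email varchar(80)
);

-- every lookup of an employee's company data then needs a join
SELECT e.name, e.surname, c.company_name
FROM employees e
JOIN companies c ON c.id = e.company_id;
-----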

Now, I expect that my users will search the contacts table only on the
fields company_name, name, surname, email and code.
That kind of query can't be written as
WHERE company_name = '$x'
but will be much more like
WHERE company_name LIKE '$x%',
both because I use an autocomplete field for the quick search and
because... well, that's how users search data (in my experience).

What I have done with searches on small strings, where the user is unsure of what they are looking for or of the spelling, is something like this:

where substr(company_name, 1, length($searchtext$UserSearchString$searchtext$)) ilike $searchtext$UserSearchString$searchtext$

and mix it with soundex. This way the user gets a list of possible matches after typing only a handful of characters.

The drawback is that this type of search can't be indexed.
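
A rough sketch of that combined query, assuming the fuzzystrmatch
extension for soundex() and using the literal 'sun' to stand in for the
user's search string:

-----
CREATE EXTENSION IF NOT EXISTS fuzzystrmatch;

-- case-insensitive prefix match on the typed characters,
-- plus soundex to catch misspellings
SELECT id, company_name, name, surname, email
FROM contacts
WHERE substr(company_name, 1, length('sun')) ILIKE 'sun'
   OR soundex(company_name) = soundex('sun')
ORDER BY company_name
LIMIT 20;
-----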
So I created indexes on those fields, to let the query planner use them
in the LIKE queries.
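
Something along these lines, sketched with illustrative index names;
the varchar_pattern_ops operator class is what allows the planner to
use a b-tree index for anchored LIKE 'x%' patterns in a non-C locale:

-----
CREATE INDEX contacts_company_name_idx ON contacts (company_name varchar_pattern_ops);
CREATE INDEX contacts_name_idx         ON contacts (name varchar_pattern_ops);
CREATE INDEX contacts_surname_idx      ON contacts (surname varchar_pattern_ops);
CREATE INDEX contacts_email_idx        ON contacts (email varchar_pattern_ops);
CREATE INDEX contacts_code_idx         ON contacts (code varchar_pattern_ops);
-----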

My doubt is: am I using too many indexes?
Will my insert/delete/update queries be too slow, and is the select
optimization worth the price? (And does this approach really speed up
the select queries?)

The more indexes you have, the slower updates will be, though not by a horrible amount. The answer here is that it depends on the workload: can the system absorb the overhead of the indexes and still give adequate performance on queries?
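
A quick way to check is to run a representative query under EXPLAIN
ANALYZE and see whether the planner chooses an index scan or a
sequential scan on contacts; for example (column and pattern are just
an illustration):

-----
EXPLAIN ANALYZE
SELECT id, company_name
FROM contacts
WHERE company_name LIKE 'Sun%';
-----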
Consider that my application won't do many insert/delete/update
queries in a row.


