Good to know, thanks.
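
For the record, a minimal sketch of issuing that ALTER from the model file via 
db.executesql() instead of by hand in psql. This assumes a PostgreSQL backend, 
that unique=True has been removed from the 'country' field, and that the old 
single-column constraint has already been dropped as in the statements quoted 
below; the constraint name city_name_country_region_key is only an illustrative 
choice (the thread reuses city_country_key):

# Runs after define_table('city', ...); skips the DDL once the constraint exists.
exists = db.executesql(
    "SELECT 1 FROM pg_constraint "
    "WHERE conrelid = 'city'::regclass "
    "AND conname = 'city_name_country_region_key';"
)
if not exists:
    db.executesql(
        "ALTER TABLE city ADD CONSTRAINT city_name_country_region_key "
        "UNIQUE (name, country, region);"
    )
    db.commit()  # apply the DDL immediately rather than at end of request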

On Monday, June 10, 2013 5:35:04 PM UTC+2, Niphlod wrote:
>
> sorry to interrupt your monologue but, BTW, web2py's table definition has 
> no notion of compound unique indexes, so the only way to enforce them (if 
> you need those) is to alter the table manually (as you did).
>
> On Monday, June 10, 2013 2:51:05 PM UTC+2, lesssugar wrote:
>>
>> OK, found a way around. I executed the following commands in postgres and 
>> the CSV file imported correctly:
>>
>> ALTER TABLE city DROP CONSTRAINT city_country_key;
>> ALTER TABLE city ADD CONSTRAINT city_country_key UNIQUE (name, country, region);
>>
>> Cheers.
>>
>> On Monday, June 10, 2013 1:36:47 PM UTC+2, lesssugar wrote:
>>>
>>> Further with my monologue... Seems I need to implement this in my table 
>>> definition:
>>>
>>> UNIQUE KEY `country` USING BTREE (`country`,`name`,`region`)
>>>
>>> How do I do it in web2py?
>>>
>>>
>>> On Monday, June 10, 2013 1:24:20 PM UTC+2, lesssugar wrote:
>>>>
>>>> Yeah, the question is: how can 'country' be a UNIQUE KEY if countries 
>>>> have hundreds of cities? In the table, hundreds of cities will belong to 
>>>> the same country, so no wonder the country code (which is two letters) 
>>>> will not be unique. I'm confused.
>>>>
>>>> On Monday, June 10, 2013 12:59:26 PM UTC+2, lesssugar wrote:
>>>>>
>>>>> OK, I'm starting to think it's a wrongly defined table. The table I 
>>>>> would like to import has the following structure:
>>>>>
>>>>> CREATE TABLE `cities` (
>>>>>   `ID` int(8) unsigned NOT NULL auto_increment,
>>>>>   `country` char(2) NOT NULL,
>>>>>   `region` char(3) NOT NULL,
>>>>>   `url` varchar(50) NOT NULL,
>>>>>   `name` varchar(50) NOT NULL,
>>>>>   `latitude` double NOT NULL,
>>>>>   `longitude` double NOT NULL,
>>>>>   PRIMARY KEY  (`ID`),
>>>>>   UNIQUE KEY `country` USING BTREE (`country`,`name`,`region`)
>>>>> ) ENGINE=InnoDB AUTO_INCREMENT=207637 DEFAULT CHARSET=utf8;
>>>>>
>>>>> My 'city' table definition looks like this:
>>>>>
>>>>> db.define_table('city',
>>>>>     Field('id', length=8),
>>>>>     Field('country', length=2, unique=True),
>>>>>     Field('region', length=3),
>>>>>     Field('url', length=50),
>>>>>     Field('name', length=50),
>>>>>     Field('latitude', 'double'),
>>>>>     Field('longitude', 'double'),
>>>>>     format=lambda r: r.name
>>>>> )
>>>>>
>>>>> Does someone see an error here?
>>>>>
>>>>> On Monday, June 10, 2013 12:35:23 AM UTC+2, lesssugar wrote:
>>>>>>
>>>>>> I'm aware this might be more of a PostgreSQL question, but maybe 
>>>>>> someone knows how to deal with it.
>>>>>>
>>>>>> I defined the 'city' table:
>>>>>>
>>>>>> db.define_table('city',
>>>>>>     Field('id', length=8),
>>>>>>     Field('country', length=2, unique=True),
>>>>>>     Field('region', length=3),
>>>>>>     Field('url', length=50),
>>>>>>     Field('name', length=50),
>>>>>>     Field('latitude', 'double'),
>>>>>>     Field('longitude', 'double'),
>>>>>>     format=lambda r: r.name
>>>>>> )
>>>>>>
>>>>>> The table is empty and I'm trying to import data from a CSV file using 
>>>>>> web2py's appadmin (the CSV provides fixed IDs).
>>>>>>
>>>>>> Every time I import the file I get the following error:
>>>>>>
>>>>>> duplicate key value violates unique constraint "city_country_key"
>>>>>>
>>>>>> I found many similar problems on the net, and one of the reasons for 
>>>>>> this error was a lack of sync between (in my case) 'city' and 
>>>>>> 'city_id_seq'. However, the sync seems to be OK, because:
>>>>>>
>>>>>> A. select max(id) from city; -- returns nothing (the table 'city' is 
>>>>>> empty as I wrote)
>>>>>> B. select nextval('city_id_seq'); -- returns an integer value
>>>>>>
>>>>>> So the result of B is greater than the result of A, which would suggest 
>>>>>> it's not a sync problem. I've been fighting with this all day. If someone 
>>>>>> knows what's going on, please let me know.
>>>>>>
>>>>>>
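
For anyone who lands here from a search: the duplicate key error in the original 
post comes from unique=True on the two-letter 'country' field, so the second city 
in the same country already violates the constraint. Following the resolution 
above, a sketch of the definition with that flag dropped (uniqueness of 
(name, country, region) is then enforced at the database level instead); leaving 
out the explicit 'id' field and relying on the one web2py creates automatically 
is my own simplification:

db.define_table('city',
    Field('country', length=2),  # not unique: many cities share a country code
    Field('region', length=3),
    Field('url', length=50),
    Field('name', length=50),
    Field('latitude', 'double'),
    Field('longitude', 'double'),
    format=lambda r: r.name)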
