I'm exporting the data from MySQL table(s) into a dBase DBF table. The
unique index you're talking about would have to be on the DBF end, if I'm
not mistaken - but I'm not sure how to do that, or whether that would cause
MySQL to get that error and fail the second insert.
Unless I'm not getting this right.
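If the index is actually supposed to go on the MySQL side instead, I'm
guessing it would be something along these lines - "export_source" and the
field names here are only placeholders, not my real schema:

-- unique index on the MySQL source table, so a second insert of the same
-- field1/field2/field3 combination fails with a duplicate-key error
ALTER TABLE export_source
  ADD UNIQUE INDEX no_duplicates (field1, field2, field3);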
On 3/26/07 4:13 PM, "Richard Lynch" <[EMAIL PROTECTED]> wrote:
> On Mon, March 26, 2007 2:28 pm, Rahul Sitaram Johari wrote:
>>> Another option would be to just create a UNIQUE INDEX on the fields
>>> you think "should" be unique, and then your second insert is gonna
>>> fail, and you can just ignore that.
>>
>> Could you possibly elaborate on this?
>> Things I'm trying are still not working out the way I want to, or
>> efficiently. So still looking for a solution.
>
> create unique index no_duplicates on whatever(field1, field2, field3);
>
> $query = "insert into whatever (field1, field2, field3)
> values('$field1_sql', '$field2_sql', '$field3_sql')";
> $insert = mysql_query($query, $connection);
> if (!$insert && mysql_errno($connection) == 1062){
> //this is a duplicate insert that failed. do whatever you want here
> }
> elseif (!$insert){
> //something else went wrong with the insert.
> //provide usual debugging error handling here
> }
> else{
> //everything went fine here
> }
>