On 2/22/2012 12:55 AM, MB Software Solutions, LLC wrote:
> On 2/22/2012 12:42 AM, Michael Oke, II wrote:
>> When dealing with files of that size, I've always done just that. Split the 
>> file into reasonable chunks that I then process for whatever backend it is 
>> destined for.
>
>
> How are you inserting the rows into the final destination backend?
> Let's say you've got a table that's over the 2GB barrier.  You break it
> up into <2GB pieces, append the data through remote views into your
> backend, then do a TABLEUPDATE()?  Here's a potential gotcha...there's no
> clearly defined PK for those rows.  UGH.
>
>
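
On the missing-PK gotcha: if the rows are headed for MySQL anyway, one
workaround (my sketch only -- table and column names are hypothetical,
the thread gives no schema) is to let the backend mint a surrogate key
during the load, so appended chunks don't need a natural key at all:

```python
# Sketch of a staging table whose PK is minted by MySQL itself.
# Hypothetical names; MySQL fills `id` on insert, so chunks appended
# without any natural key still end up with a usable row identity.
CREATE_STAGING = """
CREATE TABLE staging_import (
    id      BIGINT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    cust_no VARCHAR(20),
    amount  DECIMAL(12,2)
);
"""
```

Later, once a real key is agreed on, the surrogate `id` can be joined
out or dropped.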

I was thinking about saving to a CSV file again and then inserting into 
the MySQL table from that, with the previously problematic fields fixed 
first.
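
For what it's worth, a rough sketch of that route (Python rather than
VFP, and every name here is hypothetical): split the export into
row-count chunks, then hand each piece to MySQL's bulk loader instead
of doing per-row inserts:

```python
import csv


def split_csv(lines, max_rows):
    """Yield lists of parsed rows, each at most max_rows long.

    A stand-in for carving a huge CSV export into pieces small enough
    to stage (e.g. under VFP's 2GB table limit).
    """
    chunk = []
    for row in csv.reader(lines):
        chunk.append(row)
        if len(chunk) >= max_rows:
            yield chunk
            chunk = []
    if chunk:  # last, possibly short, chunk
        yield chunk


def load_data_sql(path, table):
    # Build the bulk-load statement; MySQL's LOAD DATA avoids
    # per-row INSERT round trips entirely.
    return (
        f"LOAD DATA LOCAL INFILE '{path}' "
        f"INTO TABLE {table} "
        "FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' "
        "LINES TERMINATED BY '\\n';"
    )
```

Each chunk gets written back out to its own file and loaded with one
statement, so a failed piece can be re-run without touching the rest.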

-- 
Mike Babcock, MCP
MB Software Solutions, LLC
President, Chief Software Architect
http://mbsoftwaresolutions.com
http://fabmate.com
http://twitter.com/mbabcock16

_______________________________________________
Post Messages to: ProFox@leafe.com
Subscription Maintenance: http://leafe.com/mailman/listinfo/profox
OT-free version of this list: http://leafe.com/mailman/listinfo/profoxtech
Searchable Archive: http://leafe.com/archives/search/profox
This message: 
http://leafe.com/archives/byMID/profox/4f448493.9050...@mbsoftwaresolutions.com
** All postings, unless explicitly stated otherwise, are the opinions of the 
author, and do not constitute legal or medical advice. This statement is added 
to the messages for those lawyers who are too stupid to see the obvious.
