I receive the same seg-fault if the headers are removed. I had initially
tried to import without headers, and added them after the import was
unsuccessful.
--
Eric Patton
[EMAIL PROTECTED]
-Original Message-
From: André Goliath [mailto:[EMAIL PROTECTED]
Sent: Wednesday, May 10, 2006 10
First of all, remove the header line from your CSV.
SQLite expects only data, no headers.
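For the archives: one way to keep a header row out of the table is to skip it before inserting. A minimal sketch using Python's stdlib `sqlite3` and `csv` modules (the table name, column names, and sample data are illustrative, not from the original post):

```python
import csv
import io
import sqlite3

# Hypothetical CSV with a header row; names and columns are assumptions.
data = io.StringIO("id,name,value\n1,alpha,10\n2,beta,20\n")

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t(id INTEGER, name TEXT, value INTEGER)")

reader = csv.reader(data)
next(reader)  # discard the header row so only data reaches the table
con.executemany("INSERT INTO t VALUES (?, ?, ?)", reader)

count = con.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(count)  # 2 data rows imported, header skipped
```

The same idea applies to the CLI: strip the header line from the file first, then run `.import`.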
-Original Message-
From: Patton, Eric [mailto:[EMAIL PROTECTED]
Sent: Wednesday, May 10, 2006 3:04 PM
To: 'sqlite-users@sqlite.org'
Subject: [sqlite] Problem importing csv file
I can't seem to
We had a little talk on that a while ago and Dennis and Jay came up with
some nice links and ideas for mx project,
to be found here
http://www.mail-archive.com/sqlite-users@sqlite.org/msg14503.html
I finally went my own way building a small text indexing system based on
Dennis' suggestions since
Heyo,
sorry for the dumb question, but how do I escape % chars in a column LIKE
'%the income was 50%%' statement?
I want the second % treated as a regular character and the first and last %
as wildcards.
I can't seem to figure it out...
Thanks,
André
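For the archives, SQLite's LIKE supports an ESCAPE clause that designates an escape character, so a literal percent sign can sit between wildcards. A small sketch (table and sample rows are made up for illustration):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE notes(body TEXT)")
con.execute("INSERT INTO notes VALUES ('...the income was 50% this year...')")
con.execute("INSERT INTO notes VALUES ('the income was 500 units')")

# ESCAPE '\' makes '\%' match a literal percent sign, while the
# unescaped '%' at the start and end remain wildcards.
rows = con.execute(
    r"SELECT body FROM notes WHERE body LIKE '%the income was 50\%%' ESCAPE '\'"
).fetchall()
print(len(rows))  # only the row containing a literal '50%' matches
```

The second row ('...500 units') is not matched because the pattern requires an actual '%' character after '50'.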
wow, I really have to learn much more about SQL, thanks for your great
help!
-Original Message-
From: Dennis Cote [mailto:[EMAIL PROTECTED]
Sent: Tuesday, April 11, 2006 7:13 PM
To: sqlite-users@sqlite.org
Subject: Re: [sqlite] How to handle large amount of data?
André Goliath
[mailto:[EMAIL PROTECTED]
Sent: Tuesday, April 11, 2006 1:00 AM
To: sqlite-users@sqlite.org
Subject: Re: [sqlite] How to handle large amount of data?
André Goliath wrote:
>Do you (or anyone else) have any experience with those large DBs in terms
>of SELECT execution times?
>ms, seconds o
: Dennis Cote [mailto:[EMAIL PROTECTED]
Sent: Monday, April 10, 2006 10:23 PM
To: sqlite-users@sqlite.org
Subject: Re: [sqlite] How to handle large amount of data?
André Goliath wrote:
>Dennis,
>I didn't know references were possible in SQLite 3.2,
>thanks for pointing that out!
>
>
Subject: Re: [sqlite] How to handle large amount of data?
On 4/10/06, André Goliath <[EMAIL PROTECTED]> wrote:
> Good point ;)
>
> I'm a bit burned when it comes to joins, since I once had a co-worker who
> created
> a MySQL join resulting in the cartesian product of two 1GB tables
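The failure mode described above is worth making concrete: a join with no join condition multiplies row counts, while a constrained join returns only matching pairs. A minimal sketch (schema and data are hypothetical, chosen only to show the row-count difference):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customers(id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders(id INTEGER PRIMARY KEY, customer_id INTEGER);
    INSERT INTO customers VALUES (1, 'a'), (2, 'b'), (3, 'c');
    INSERT INTO orders VALUES (10, 1), (11, 2);
""")

# No join condition: every order pairs with every customer (2 x 3 rows).
cartesian = con.execute("SELECT * FROM orders, customers").fetchall()

# Explicit ON clause: one row per order that has a matching customer.
joined = con.execute(
    "SELECT o.id, c.name FROM orders o JOIN customers c ON o.customer_id = c.id"
).fetchall()

print(len(cartesian), len(joined))  # 6 2
```

On two 1GB tables the unconstrained form is catastrophic, which is presumably what happened in the anecdote.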
: Monday, April 10, 2006 8:54 PM
To: sqlite-users@sqlite.org
Subject: Re: [sqlite] How to handle large amount of data?
André Goliath wrote:
>Hi list,
>
>I'm currently in the design phase of a rather large DB project
>and would love to hear your qualified thoughts on this.
>
Sprenkle [mailto:[EMAIL PROTECTED]
Sent: Monday, April 10, 2006 7:43 PM
To: sqlite-users@sqlite.org
Subject: Re: [sqlite] How to handle large amount of data?
On 4/10/06, André Goliath <[EMAIL PROTECTED]> wrote:
> Hi list,
>
> I'm currently in the design phase of a rather large DB proj
Hi list,
I'm currently in the design phase of a rather large DB project
and would love to hear your qualified thoughts on this.
You can think of the data to be stored as objects with some properties, a
unique ID, and then 21 fields which will contain TEXT data.
The DB acts as a kind of archive
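A minimal sketch of the kind of schema described: one integer primary key plus 21 TEXT fields, with the DDL generated programmatically to keep it short. The table and field names are pure assumptions for illustration; the original post does not name them.

```python
import sqlite3

# Build "field01 TEXT, field02 TEXT, ... field21 TEXT" programmatically.
fields = ", ".join(f"field{i:02d} TEXT" for i in range(1, 22))
ddl = f"CREATE TABLE archive(id INTEGER PRIMARY KEY, {fields})"

con = sqlite3.connect(":memory:")
con.execute(ddl)

# Verify the shape: 22 columns total (id + 21 TEXT fields).
cols = [row[1] for row in con.execute("PRAGMA table_info(archive)")]
print(len(cols))  # 22
```

Whether the 21 fields should instead be normalized into a child table depends on how they are queried, which is exactly the design question the thread goes on to discuss.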