Re: Optimal InnoDB datafile size? 150GB data.

2004-03-26 Thread Kurt Haegeman
Jesper Krogh wrote:

Hi.

I need some recommendations.

What is the optimal InnoDB datafile size? Is 75 * 2GB preferred over 1 *
1G:autoextend? The filesystem (ext3) has no problems handling the
file sizes, so that should not be an issue.
About 100GB of the 150GB is BLOBs. Would it have any impact on the
overall speed of the database to store this data in the filesystem
instead?

Thanks. 

 

If your one large datafile suffers irreparable block corruption, you'll
have to prepare for a restore of a XXX GB file, which will take much
longer than replacing a damaged 2GB datafile. We suffered such a
loss on our Oracle database, and since then we no longer use datafiles
larger than 2GB.
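
For what it's worth, the multiple-smaller-files layout is expressed in my.cnf roughly like this (file names and count are illustrative only; note that autoextend may be given only on the last file):

```ini
[mysqld]
innodb_data_file_path = ibdata1:2G;ibdata2:2G;ibdata3:2G:autoextend
```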

Concerning your LOB question: if the LOBs sit in often-queried tables
alongside other relational data, and you only use the other data for the
query, then yes, the database would be faster without them
(smaller tables, more speed).
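
The "smaller tables, more speed" point amounts to a schema split: keep the narrow relational columns in one table and move the bulky data into a side table fetched only on demand. A minimal sketch in Python, using the standard library's sqlite3 as a stand-in for MySQL (table and column names are invented for illustration):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Frequently queried relational columns stay in a narrow table...
cur.execute("CREATE TABLE article_meta (id INTEGER PRIMARY KEY, title TEXT, pubdate TEXT)")
# ...while the bulky BLOB lives in a side table, joined on demand by id.
cur.execute("CREATE TABLE article_blob (id INTEGER PRIMARY KEY, pdf BLOB)")

cur.execute("INSERT INTO article_meta VALUES (1, 'Datafile sizing', '2004-03-26')")
cur.execute("INSERT INTO article_blob VALUES (1, ?)", (b"%PDF-1.3 fake bytes",))

# Metadata queries never touch the BLOB data at all...
title = cur.execute("SELECT title FROM article_meta WHERE id = 1").fetchone()[0]
# ...and the BLOB is read only when actually needed.
pdf = cur.execute("SELECT pdf FROM article_blob WHERE id = 1").fetchone()[0]
print(title, len(pdf))
```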

Greetings,
Kurt Haegeman
Mediargus.be
--
MySQL General Mailing List
For list archives: http://lists.mysql.com/mysql
To unsubscribe: http://lists.mysql.com/[EMAIL PROTECTED]


Re: Optimal InnoDB datafile size? 150GB data.

2004-03-26 Thread Kurt Haegeman
Jesper Krogh wrote:

In gmane.comp.db.mysql.general, Kurt Haegeman wrote:

What is the optimal InnoDB datafile size? Is 75 * 2GB preferred over 1 *
1G:autoextend? The filesystem (ext3) has no problems handling the
file sizes, so that should not be an issue.
 

If your one large datafile suffers irreparable block corruption, you'll
have to prepare for a restore of a XXX GB file, which will take much
longer than replacing a damaged 2GB datafile. We suffered such a
loss on our Oracle database, and since then we no longer use datafiles
larger than 2GB.
   

So MySQL doesn't care whether it's one block of 100GB or several smaller
ones, as far as speed is concerned?
 

Regarding speed: no, MySQL won't run any faster or slower on one large
datafile compared to several smaller ones. Regarding stability,
availability, and recovery, I'd strongly advise against the use of one
single extremely large datafile.

Just my two cents, you know. Personal opinion and such.
Kurt Haegeman
Mediargus.be


Re: fulltext indices

2004-03-25 Thread Kurt Haegeman
Brandon Carter wrote:

Is it possible to fit an entire article (say, a newspaper article) into one cell of a MySQL database? When I tried LOAD DATA LOCAL INFILE, the file was imported into several rows! Perhaps I just don't understand the use of a fulltext index.

--bhcesl


Absolutely. In an R&D environment, I created a table with more than 7
million newspaper articles. Didn't use LOAD DATA LOCAL INFILE, though,
but a fairly simple Perl script.
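
The splitting Brandon saw happens because LOAD DATA LOCAL INFILE treats each newline as a row terminator by default. Reading the file in a script and binding it as a single value keeps the whole article in one cell. A minimal sketch in Python, with the stdlib sqlite3 module standing in for MySQL (Kurt used Perl; the table name here is invented):

```python
import sqlite3

# A multi-line article, as read from a file on disk.
article = "HEADLINE\n\nFirst paragraph of the story.\nSecond paragraph.\n"

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE articles (id INTEGER PRIMARY KEY, body TEXT)")
# Bound as a single parameter, the newlines are just data here, not the
# row terminators they would be for LOAD DATA LOCAL INFILE.
con.execute("INSERT INTO articles (body) VALUES (?)", (article,))

count, body = con.execute("SELECT count(*), body FROM articles").fetchone()
print(count)
```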

Regards,
Kurt Haegeman
Mediargus.be


Re: Saving file into database

2004-03-11 Thread Kurt Haegeman
Erich Beyrent wrote:

Use the BLOB, Luke!

See your local MySQL manual for details.

We're using BLOBs to store PDFs in our database, and through the use of
HTTP headers, we're able to let users download the PDFs directly from
the database, without having to store a local copy on disk
(Content-Disposition header).

Hi Kurt,

I have been using MySQL to store links to PDFs which live in other
directories.  

Is there an advantage to storing the PDFs directly into the database?

-Erich-

 

Not all PDFs are viewable by all users. Before a user is presented with
a hyperlink, we first determine whether he can be granted access to the
file. This authorization is stored in the database, so our web app can
access and present the PDF using the same protocol it used for
authentication/authorization. It can even use the same open database
connection. There's no need to deal with other protocols, or with
protecting those files at the file-system and web-server level: if
you're not authorized, you can't access them in any way. Back-office
maintenance and content management can be done over a single database
connection. In our case, this resulted in much simpler and safer code.

Other advantages I can think of are, among others, master/slave
replication of the database, and no need to tune your database server
for anything other than MySQL activity. A disadvantage that comes to
mind is a much larger database, which is often more difficult to
maintain than a large file system. To some people, at least. Just
ideas, food for thought.
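
The authorize-then-serve flow described above might be sketched like this in Python (sqlite3 stands in for MySQL; the schema and the fetch_pdf helper are invented for illustration):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE pdfs (id INTEGER PRIMARY KEY, data BLOB)")
con.execute("CREATE TABLE acl (user TEXT, pdf_id INTEGER)")
con.execute("INSERT INTO pdfs VALUES (1, ?)", (b"%PDF-1.3 fake report",))
con.execute("INSERT INTO acl VALUES ('alice', 1)")

def fetch_pdf(con, user, pdf_id):
    # Authorization check and BLOB fetch run over the same open connection.
    ok = con.execute("SELECT 1 FROM acl WHERE user = ? AND pdf_id = ?",
                     (user, pdf_id)).fetchone()
    if ok is None:
        return None          # not authorized: no link, no file, no access
    data = con.execute("SELECT data FROM pdfs WHERE id = ?",
                       (pdf_id,)).fetchone()[0]
    # The Content-Disposition header makes the browser download the bytes;
    # nothing is ever written to the web server's file system.
    headers = {"Content-Type": "application/pdf",
               "Content-Disposition": f'attachment; filename="doc{pdf_id}.pdf"'}
    return headers, data

print(fetch_pdf(con, "alice", 1) is not None, fetch_pdf(con, "bob", 1))
```

In the real app the headers dict would go out with the HTTP response; the point is that one database connection covers both the authorization and the document itself.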

Regards,
Kurt.


Re: Saving file into database

2004-03-11 Thread Kurt Haegeman
Eve Atley wrote:

Is there an advantage to storing the PDFs directly into the database?
   

I'm also curious how large this would make a database. Is there any space
saved through this method, or would they still be the same size as the
original PDF?
- Eve

 

There's a percentage of disk space lost to database overhead (headers
and such). Fragmentation, internal and external, is comparable to
storage at the file-system level. I don't have exact figures for what
this would mean in MySQL, since our app currently runs on Oracle. It's
easy enough to test in a lab environment, though.

Regards,
Kurt


Re: Saving file into database

2004-03-10 Thread Kurt Haegeman
Use the BLOB, Luke!

See your local MySQL manual for details.

We're using BLOBs to store PDFs in our database, and through the use of
HTTP headers, we're able to let users download the PDFs directly from
the database, without having to store a local copy on disk
(Content-Disposition header).
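
A BLOB gives the bytes back exactly as they went in, so a file stored from one client language (Perl, in Isa's case) can be read back from another (C++). A minimal round-trip sketch in Python, with the stdlib sqlite3 module standing in for MySQL (file name and schema are invented):

```python
import sqlite3

# Arbitrary binary data, including NUL bytes and every byte value.
payload = bytes(range(256)) * 4

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE files (name TEXT PRIMARY KEY, data BLOB)")
# A bound parameter stores the raw bytes directly; no text-encoding
# round-trip or linefeed stripping is needed.
con.execute("INSERT INTO files VALUES (?, ?)", ("report.pdf", payload))

(restored,) = con.execute(
    "SELECT data FROM files WHERE name = 'report.pdf'").fetchone()
print(restored == payload)
```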

Hope this helps.
Kurt Haegeman
Mediargus.com
Paul Rigor wrote:

Are you running a web server (or FTP server) as well? Because if you
are, then you can upload the files to a separate directory using Perl
and just store the links to those files in a table in your database...

If you're not running a web server (or FTP)... then let me know if you
get a viable suggestion.

Here's my 2 cents. Since MySQL is a relational database, it would be
difficult to display that particular column/row containing the file
(especially binary). You can use Perl (or another converter) to convert
the binary file into uuencoded text (or another text format)... and
then import that... make sure you remove the linefeeds and store
information about the column widths of the uuencoded data in a table in
your database. But geez, if the file is considerably large... like I
said, it would put a strain on your server (unless you have gigs of RAM
and extra processing speed).

good luck,
paul
At 01:49 AM 3/9/2004, Isa Wolt wrote:

Hi,

I would like to save a binary file into a MySQL database, to be able to
use the file later. I am using a Perl interface. Is this at all
possible?

And would it be possible to then read that file from a C++ interface?

I would be grateful for any help/advice!

Isa



_
Paul Rigor
[EMAIL PROTECTED]
Go Bruins!


Re: Fulltext creation on 4.1: ERROR 1034

2004-01-16 Thread Kurt Haegeman
Hi,

Sergei Golubchik wrote:

Hi!

On Jan 13, Kurt Haegeman wrote:
 

Hi,

When trying to create a fulltext index on my large table, I get the 
following error:

ERROR 1034 (HY000): 121 when fixing table
   

Sorry, I still cannot repeat this :(

Could you try to create a smaller test case?
I would expect that you need only a few rows from your table for this
bug to appear. (Of course, finding those exact rows in your gigabytes
may not be easy :)
Regards,
Sergei
 

A smaller test case (2.5M articles) also failed:

mysql> alter table articles2
    -> add fulltext( text );
ERROR 1034 (HY000): 121 when fixing table

The table check was OK, and the disk space check was OK. I'll try again
with 1M records and let you know the result.

Regards,
Kurt.


Re: Fulltext creation on 4.1: ERROR 1034

2004-01-14 Thread Kurt Haegeman
Hi Eli,

Eli Hen wrote:

Kurt Haegeman [EMAIL PROTECTED] wrote in message
news:[EMAIL PROTECTED]
 

Sergei Golubchik wrote:

   

Hi!

On Jan 13, Kurt Haegeman wrote:

 

Hi,

When trying to create a fulltext index on my large table, I get the
following error:
ERROR 1034 (HY000): 121 when fixing table

   

Hi Sergei,

alter table articles
  add fulltext( text );

After several hours of processing, the error below is generated.

Regards,
Kurt.
   

Did you try to check the table using myisamchk or CHECK TABLE articles;?
It might be that your table is corrupted.
 

mysql> alter table articles
    -> add fulltext( text );
ERROR 1034 (HY000): 121 when fixing table
mysql> check table articles;
+---------------+-------+----------+----------+
| Table         | Op    | Msg_type | Msg_text |
+---------------+-------+----------+----------+
| test.articles | check | status   | OK       |
+---------------+-------+----------+----------+
1 row in set (8 min 33.00 sec)

That's not it, at first sight.

Regards,
Kurt.


Fulltext creation on 4.1: ERROR 1034

2004-01-13 Thread Kurt Haegeman
Hi,

When trying to create a fulltext index on my large table, I get the 
following error:

ERROR 1034 (HY000): 121 when fixing table

I'm using version 4.1.1-alpha of the MySQL database, a source-compiled
version with the --with-raid option. I'm trying to build a newspaper
article search engine. I've built an 'articles' table with the
following DDL:

CREATE TABLE articles (
  filename varchar(40) default NULL,
  source varchar(30) default NULL,
  pubdate varchar(30) default NULL,
  text text
)
TYPE=MyISAM
DEFAULT CHARSET=latin1
MAX_ROWS=1000
AVG_ROW_LENGTH=2366
RAID_TYPE=striped
RAID_CHUNKS=16
RAID_CHUNKSIZE=2048;

I've inserted 7806867 articles in Dutch and French into it, which gives
me a table of about 16GB, leaving 75+GB of free space on my Compaq
ProLiant DL380 G2 with 1.2GB of RAM. The kernel is compiled with
HIGHMEM support, and the MySQL server uses a cnf based on my-huge.cnf.

Does anybody know where to start looking?

Thanks in advance,
Kurt.



Re: Fulltext creation on 4.1: ERROR 1034

2004-01-13 Thread Kurt Haegeman
Sergei Golubchik wrote:

Hi!

On Jan 13, Kurt Haegeman wrote:
 

Hi,

When trying to create a fulltext index on my large table, I get the 
following error:

ERROR 1034 (HY000): 121 when fixing table

I'm using version 4.1.1-alpha of the MySQL database, a source-compiled
version with the --with-raid option. I'm trying to build a newspaper
article search engine. I've built an 'articles' table with the
following DDL:

CREATE TABLE articles (
  filename varchar(40) default NULL,
  source varchar(30) default NULL,
  pubdate varchar(30) default NULL,
  text text
)
TYPE=MyISAM
DEFAULT CHARSET=latin1
MAX_ROWS=1000
AVG_ROW_LENGTH=2366
RAID_TYPE=striped
RAID_CHUNKS=16
RAID_CHUNKSIZE=2048;

I've inserted 7806867 articles in Dutch and French into it, which gives
me a table of about 16GB, leaving 75+GB of free space on my Compaq
ProLiant DL380 G2 with 1.2GB of RAM. The kernel is compiled with
HIGHMEM support, and the MySQL server uses a cnf based on my-huge.cnf.

Does anybody know where to start looking?
   

What is the exact command that generates the error?

Regards,
Sergei
 

Hi Sergei,

alter table articles
  add fulltext( text );

After several hours of processing, the error below is generated.

Regards,
Kurt.