the variable-length field(s), will
speed up significantly.

Thanks for providing some 'real' numbers. That was of interest.
=dn

- Original Message -
From: "Dobromir Velev" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: 25 January 2002 16:11
Subject: RE: Tighly packed table

> Hi,
> If your column is of type VARCHAR, you won't save much space (at least not as
>>>I'm wondering just how much space this 'little' exercise is going to
>>>save, either as a ratio of the size of the
>>>db, or as a ratio of HDD size?
>>>
>>>My glass is half-empty!
>>>=dn
From: DL Neil [mailto:[EMAIL PROTECTED]]
Sent: Saturday, January 26, 2002 5:39 PM
To: Michael Stearne
Cc: Michael Stearne; Roger Karnouk; [EMAIL PROTECTED]
Subject: Re: Tighly packed table

Michael,
Let's round it up to 3 million rows (I'm lazy at math too!)
Let's say you cur
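The back-of-envelope estimate DL Neil starts above can be sketched in a few lines. The 3-million-row figure comes from the thread; the declared and actual column widths below are purely assumed numbers for illustration:

```python
# Hypothetical estimate of the space saved by shrinking a fixed-width CHAR
# column down to the longest value actually stored in it.
# rows comes from the thread; the two widths are invented for the example.

rows = 3_000_000          # "Let's round it up to 3 million rows"
declared_width = 255      # assumed CHAR(255) declaration
longest_value = 40        # assumed result of SELECT MAX(LENGTH(col))

wasted_bytes_per_row = declared_width - longest_value
total_saved = rows * wasted_bytes_per_row

print(f"Saved roughly {total_saved / 1024**2:.0f} MB")  # ~615 MB
```

With those assumed widths the answer to the "glass half-empty" question is a few hundred MB per oversized column, which on 2002-era disks is worth having.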
Christopher,

Friday, January 25, 2002, 1:05:02 AM, you wrote:

[max length of a string]

CT> you may want to do this programmatically in a small C++
CT> (or C, or whatever) program. It _may_ be faster. It may not, there'll be
CT> a lot of network traffic. SQL servers tend to be rather bad at s
From: "Michael Stearne" <[EMAIL PROTECTED]>
To: "Roger Karnouk" <[EMAIL PROTECTED]>
Cc: <[EMAIL PROTECTED]>
Sent: 24 January 2002 22:58
Subject: Re: Tighly packed table

> The problem is, this query really hurts (I don't know if it finishes)
> for unindexed field for 2.9 million rows. But I'm sure it will finish
> eventually.
I actually have all the records in a 1GB text file, so here comes perl
to the rescue!! (Easier than C to me, maybe I'll do it in Java as an
exercise.)
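The flat-file scan Michael describes can be sketched like this (in Python rather than perl; the delimiter, header row, and sample data are all assumptions for the example — a real run would open the 1GB export instead of the in-memory sample):

```python
import csv
import io

# Sketch of the "scan the text export" approach: one pass over a delimited
# file, tracking the longest value seen in each column. The sample data and
# comma delimiter are invented; swap io.StringIO for open(path) in real use.
sample = io.StringIO(
    "firstname,lastname,city\n"
    "Jo,Smith,New York\n"
    "Alexandra,Ng,Poughkeepsie\n"
)

reader = csv.reader(sample)
header = next(reader)               # assumes the export has a header row
widths = [0] * len(header)
for row in reader:                  # single streaming pass, constant memory
    for i, field in enumerate(row):
        widths[i] = max(widths[i], len(field))

print(dict(zip(header, widths)))    # {'firstname': 9, 'lastname': 5, 'city': 12}
```

This avoids the server-side full scan entirely, which is why it finishes in one cheap pass even for 2.9 million rows.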
BTW if anyone has any questions on the ability of MacOS X with medium
size DBs, feel free to ask. This DB is running on a 500MHz iMac G3 with 1GB of RAM and
M
At 05:58 PM 1/24/2002 -0500, Michael Stearne wrote:
>The problem is, this query really hurts (I don't know if it finishes) for
>unindexed field for 2.9 million rows. But I'm sure it will finish eventually.
Yes, it will really hurt. After all, there's no way for MySQL to do this
other than pul
From: Michael Stearne [mailto:[EMAIL PROTECTED]]
Sent: Thursday, January 24, 2002 4:38 PM
To: Christopher Thompson
Cc: [EMAIL PROTECTED]
Subject: Re: Tighly packed table

>Christopher Thompson wrote:
>
> > At 04:10 PM 1/24/2002 -0500, Michael Stearne wrote:
> >
> >> We have a somewh
select max(length(firstname)) from TableName;
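A runnable sketch of that one-liner, generalized to every column: the table and data below are invented, and an in-memory SQLite database stands in for the thread's MySQL server (on MySQL you would list columns via `SHOW COLUMNS`/`information_schema` rather than `PRAGMA table_info`, and note that MySQL's `LENGTH()` counts bytes while `CHAR_LENGTH()` counts characters):

```python
import sqlite3

# Illustration only: a throwaway in-memory table standing in for the real one.
# Table name, column names, and rows are all made up for the example.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE People (firstname TEXT, lastname TEXT)")
con.executemany(
    "INSERT INTO People VALUES (?, ?)",
    [("Jo", "Smith"), ("Alexandra", "Ng"), ("Sam", "Oyelaran-Oyeyinka")],
)

# Build one MAX(LENGTH(col)) query per column from the table's own column list.
cols = [row[1] for row in con.execute("PRAGMA table_info(People)")]
for col in cols:
    (longest,) = con.execute(f"SELECT MAX(LENGTH({col})) FROM People").fetchone()
    print(f"{col}: {longest}")
# firstname: 9
# lastname: 17
```

Each query is still a full scan of an unindexed column, which is exactly the cost Michael complains about below for 2.9 million rows.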
-Original Message-
From: Michael Stearne [mailto:[EMAIL PROTECTED]]
Sent: Thursday, January 24, 2002 4:38 PM
To: Christopher Thompson
Cc: [EMAIL PROTECTED]
Subject: Re: Tighly packed table
Christopher Thompson wrote:
> At 04:10 PM 1
We have a somewhat large read-only table (2.9 million recs). I am wondering
if there is a utility that will look at each row of each column and
come up with a summary of the largest field (in character length) for
each column. For example, scan each row's firstname field and report
that the lon