The vast majority of the cost of using an index is due to reading the data blocks in the table in a random order, rather than sequentially, right? So rather than making estimates of the selectivity of an index when you search for 'W%', why not actually start the index lookup and count how many rows you get? If you know you want to do a table scan whenever more than, say, 500 rows match 'W%', you'd only have to read a few index pages to determine whether or not there are more than 500 matching rows, right?
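To make the idea concrete, here's a minimal sketch (not PostgreSQL code; the names and the 500-row threshold are just illustrative) of probing a sorted key list, standing in for a B-tree index, and bailing out to a table scan as soon as the match count exceeds the threshold:

```python
from bisect import bisect_left

def choose_plan(sorted_keys, prefix, threshold):
    """Probe a sorted key list (a stand-in for B-tree leaf entries) and
    stop counting as soon as more than `threshold` matches are seen."""
    # Seek to the first key >= prefix, as an index range scan would.
    start = bisect_left(sorted_keys, prefix)
    count = 0
    for key in sorted_keys[start:]:
        if not key.startswith(prefix):
            break  # past the end of the matching key range
        count += 1
        if count > threshold:
            # Too many matches: random heap fetches would cost more
            # than reading the table sequentially.
            return "seqscan"
    return "indexscan"
```

The point is that the probe touches only enough index entries to answer "more than N?", never the whole index, so the cost of deciding is bounded by the threshold itself.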
Or am I on crack here?

cjs
--
Curt Sampson <[EMAIL PROTECTED]>  +81 90 7737 2974  http://www.netbsd.org
    Don't you know, in this new Dark Age, we're all light.  --XTC