Georg Thome wrote:

> Hello Holger,
> 
> thank you very much for your attention.
> That seems to be good news.
> In any case, changing "update statistics" to the "estimate sample" 
> version already saves a lot of hours, so we are hopefully not 
> running into production time.
> Is it possible to say how strongly the percentage value affects 
> the performance result?
> If the result varies a lot, does it make sense to use 49% as the 
> highest value, so that it doesn't read the whole data (as the 
> documentation describes for values of 50% and more) but still gets 
> the best performance results?

Hi Georg,

the default sample rate in MaxDB is 20000 rows if the user doesn't 
explicitly define another value at table creation time, and many of 
our customers have no problems with this default.

Only on very large tables with more than 1,000,000 rows does this 
default seem too small, and the distinct-value estimation becomes 
inaccurate.

I would suggest an estimation rate of 10%; this should be a good 
trade-off between accuracy and estimation runtime.
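For illustration, a sketch of how the sample size might be set, 
assuming a large table named "orders" (the table name is just a 
placeholder, and the exact clause syntax should be checked against 
the SQL reference of your MaxDB version):

```sql
-- Store a 10% sample rate in the table definition, so every
-- later statistics run on this table uses it by default:
ALTER TABLE orders SAMPLE 10 PERCENT

-- Then refresh the optimizer statistics using that sample:
UPDATE STATISTICS orders
```

A percentage scales with the table as it grows, whereas a fixed 
row count (like the 20000-row default) covers an ever smaller 
fraction of a growing table.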

Best regards,
Holger
SAP Labs Berlin

-- 
MaxDB Discussion Mailing List
For list archives: http://lists.mysql.com/maxdb
To unsubscribe:    http://lists.mysql.com/[EMAIL PROTECTED]
