Usually "one order of magnitude more" is about 10 times more.
So, increasing from a range around 8 to around 80 is an
increase in an order of magnitude.

It is more debatable, but not uncommon, to treat each number of digits
as its own order of magnitude: 1-9 / 10-99 / 100-999.
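
To make the two readings concrete, here is a quick Python sketch (the
function names and the digit-count rule are just my own framing, not
official definitions):

import math

def orders_of_magnitude_ratio(old, new):
    """Ratio reading: one order of magnitude is roughly a factor of 10."""
    return math.log10(new / old)

def orders_of_magnitude_digits(old, new):
    """Digit reading: each extra digit (1-9 / 10-99 / 100-999) is one order."""
    return len(str(int(new))) - len(str(int(old)))

print(orders_of_magnitude_ratio(8, 80))    # 1.0   -- exactly one order
print(orders_of_magnitude_ratio(8, 110))   # ~1.14 -- a bit over one order
print(orders_of_magnitude_digits(8, 110))  # 2     -- the digit reading disagrees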

Unfortunately, my "whatis" definition reference,
http://whatis.techtarget.com/definition/0,,sid9_gci527311,00.html

doesn't answer the implied range question either. 
[It does mention multipliers from septillionths (10^-24)
to septillions (10^24), a span of 48 orders of magnitude.]

I'd say 8 going to 110 is only a single order of magnitude increase;
my own rough cutoff is 50% of the next step up,
so I wouldn't call it a second order of magnitude
until it was over 400, half of 800.
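
Here is that 50% rule as I would code it up, just a rough sketch of my
own reading, nothing authoritative:

def rough_orders(base, value):
    """Count another order of magnitude only once the value passes 50%
    of the next full power-of-ten multiple of the base (from 8, two
    orders isn't reached until over 400, half of 800)."""
    k = 0
    while value > 0.5 * base * 10 ** (k + 1):
        k += 1
    return k

print(rough_orders(8, 110))  # 1 -- not yet past 400
print(rough_orders(8, 401))  # 2 -- just over half of 800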

Now I am also interested in knowing: what is
"the smallest number that is two orders of magnitude larger
than the original 8 billion estimate"?
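
Just running the arithmetic both ways (my own back-of-the-envelope,
not a settled answer):

estimate = 8e9  # the original $8 billion projection quoted below

# Strict factor-of-ten reading: two orders of magnitude larger means 100x.
print(estimate * 100)        # 8e11, i.e. $800 billion

# My rough 50%-of-the-next-step reading from above: anything over half
# of that, i.e. over $400 billion, would already count as two orders.
print(0.5 * estimate * 100)  # 4e11, i.e. $400 billion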

Tom Grey

> Relying on the adage---the only stupid question is the one not
> asked---I ask for an explanation of "an order of magnitude".  I had
> understood it to mean an approximation of an amount associated with
> whatever subject was under discussion.  However, in reading David
> Levenstam's comment (see related excerpt below) it appears that an
> "order of magnitude" is generally viewed as 10's, 100's, 1000's etc.
> Responses welcome.
>  
> "All my books remain packed in boxes, so I can't look up the figures,
> but I seem to recall that the Congressional proponents of Medicare
> projected a ten-year federal outlay of some $8 billion, as opposed to
> the annual outlay of $110+ billion now.  I can't conceive of the vast
> majority of Americans supporting a program that would have cost two
> orders of magnitude greater than projected."
> 
