Date: Thu, 30 Jun 2005 11:03:58 +1000
From: Raymond <[EMAIL PROTECTED]>
Subject: RE: [LIB] Libretto U-100

At 05:34 PM 29/06/2005 -0700, you wrote:
> Date: Tue, 28 Jun 2005 22:25:32 +0400
> From: Vitaly Pavlenko <[EMAIL PROTECTED]>
> Subject: RE: [LIB] Libretto U-100

> Date: Fri, 24 Jun 2005 12:28:29 -0700 (PDT)
> From: John <[EMAIL PROTECTED]>

> the 723 CPU Intel makes runs at
> under 7 watts (if I remember right), and normally the
> libby itself should run at under 1-2 watts total?

> Well, the system comes with a 60 W power source, which is a good estimate of the total power. Of course, it would charge the battery at the same time, so maybe something like 30 W. My lib SS1000 came with a 30 W PS but runs fine from a third-party 15 W one.

Actually, just because the power source can give X watts doesn't mean it actually GIVES that all the time - case in point: plug it into the wall without plugging in the laptop, and it gives exactly 0 watts. All that rating means is that X watts is the amount of power that can be drawn whilst the supply still guarantees its output voltage and dissipation specifications.

To actually get the amount of power being drawn, you need to measure the current going into the computer (and the voltage too, if you've got particularly poor regulation - though switchmode supplies, which is what all computer PSUs are nowadays, are generally well behaved on that front). A multimeter in series with the power supply (on the low-voltage side, of course!) will give you a good indication, although the purists will argue that you need an oscilloscope, since the current drawn by the computer is hardly likely to be dead flat - it has its own switchmode regulator inside, filtering caps notwithstanding.

Also, just because it "runs fine" from a 15 W supply doesn't mean it isn't drawing MORE than 15 W - those can stand a fair amount of overloading before showing signs of huffing and puffing, and even if the output voltage drops, the switcher in the laptop can often handle being a few volts under spec.

Some laptops, like my current one, even test the adapter when you plug it in and will do things like disable simultaneous charging and running if they detect an adapter that isn't up to spec. I'm not sure how it does this - it could briefly draw a current surge to probe the adapter's capabilities, or it may send some sort of data back down the power cable to query some smarts in the adapter.
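As a quick sketch of the multimeter method above - the readings here are invented for illustration, not from any actual Libretto:

```python
# Hypothetical multimeter readings -- illustrative numbers only.
# The meter sits in series on the low-voltage (DC) side of the adapter;
# the voltage comes off the adapter label or a second meter.
adapter_voltage = 15.0   # volts at the DC plug
measured_current = 1.8   # amps read off the series multimeter

power_drawn = adapter_voltage * measured_current  # P = V * I, in watts
print(f"Instantaneous draw: {power_drawn:.1f} W")  # 27.0 W for these readings
```

A single reading like this is only a snapshot, which is exactly the purists' oscilloscope objection: the draw fluctuates as the laptop's own regulator switches.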

Maybe a simpler test is to use battery capacity: take the Ah rating (or divide the mAh rating by 1000) and multiply it by the battery voltage, and you've got the number of rated watt-hours in the battery. Divide this by the runtime you get on the computer and you've got a rough average for the power drawn during the run.

I say rough because the output voltage generally isn't constant (although lithium is better than most in that respect), and the capacity rating is rarely accurate - firstly because cells deteriorate, and secondly because the internal circuitry will probably cut the power before the battery is fully drained (lithiums have a habit of not waking up again after being drained too far).
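The battery arithmetic above, worked through with made-up numbers (pick your own battery rating and measured runtime):

```python
# Rough average-power estimate from battery capacity, as described above.
# All three input numbers are hypothetical.
capacity_mah = 4400        # battery rating in mAh
battery_voltage = 10.8     # nominal pack voltage (e.g. 3 Li-ion cells in series)
runtime_hours = 3.5        # observed runtime on a full charge

watt_hours = (capacity_mah / 1000) * battery_voltage   # 4.4 Ah * 10.8 V = 47.52 Wh
avg_power = watt_hours / runtime_hours                 # average draw over the run
print(f"Rated capacity: {watt_hours:.2f} Wh")
print(f"Average draw:   {avg_power:.1f} W")            # ~13.6 W here
```

Per the caveats above, treat the result as an upper-ish bound on a lower-ish estimate - the rated watt-hours overstate what a worn pack actually delivers.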



> IMHO Intel is cheating us somehow. They no longer quote the true dissipated power in the specs, but give a "thermal design guideline" instead. This matter is beyond my qualification, so please correct me if I am wrong.

The difficulty with providing "true dissipated power" is that nowadays processors employ all sorts of weird and wonderful ways of saving power. For instance, the transistors on the chip (FETs of various types) are highly capacitive, so they draw (and dissipate) power mainly when they switch. To save power, the processor is divided into a pile of different sectors; if a sector isn't being used, the clock signal to that sector is switched off - nothing in that sector switches any more, so it dissipates almost no power. IIRC this is called "clock gating".

Now if you take all the combinations of ways in which a processor may save power, combined with things like clock throttling and the like, you end up with thousands of combinations, each with a different "true dissipated power". I guess Intel *could* publish all of these but then it'd just confuse the consumer even more - which combinations actually get used more? At the end of the day, it really depends on what you do with it.
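The clock-gating idea can be sketched with the standard CMOS dynamic-power relation, P = alpha * C * V^2 * f (activity factor times switched capacitance times voltage squared times frequency). The sector names and figures below are invented purely to show the shape of it:

```python
# Toy model of per-sector dynamic power: P = alpha * C * V^2 * f.
# Sector names, capacitances, and activity factors are made up.
VOLTAGE = 1.2      # core voltage, volts
FREQ = 1.6e9       # clock frequency, Hz

sectors = {
    # name: (switched capacitance in farads, activity factor 0..1)
    "integer_alu": (2.0e-9, 0.30),
    "fpu":         (3.0e-9, 0.00),  # clock gated: activity ~0, so ~0 W
    "cache":       (5.0e-9, 0.15),
}

def dynamic_power(cap, activity):
    """Dynamic dissipation of one sector in watts."""
    return activity * cap * VOLTAGE ** 2 * FREQ

total = sum(dynamic_power(c, a) for c, a in sectors.values())
print(f"Total dynamic power: {total:.2f} W")  # the gated FPU contributes nothing
```

Every distinct pattern of gated sectors and clock throttling gives a different total, which is exactly why a single "true dissipated power" number is hard to publish.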

Thermal design guidelines are an attempt to combine what Intel knows about the power dissipation of their chips with what they know about which parts of those chips get used in which workloads, to arrive at some sort of sensible average. Basically, they tell the equipment manufacturers how much power they need to source (electrically) and sink (thermally) in which situations to avoid having the chip overheat. Of course, manufacturers get this right to varying degrees - my Dell Inspiron 9300 monster has a Pentium M 2.0 and a GeForce 6800, but even when playing 3D games the base and air vents only get mildly warm. In contrast, we've got a Sharp AL3DU at work (Pentium M 2.0, GeForce 6600) whose base gets uncomfortably hot even when idling.

</rant> :-)


- Raymond


---


/~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\
|                 | "Does fuzzy logic tickle?"                |
|   ___           | "My HDD has no reverse. How do I backup?" |
|  /__/           +-------------------------------------------|
| /  \ a y b o t  |          [EMAIL PROTECTED]             |
|                 |  Need help? Visit #Windows98 on DALNet!   |
| ICQ: 31756092   |              www.raybot.net               |
\~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~/

