Hi,

I just promised to send some additional information about the power
consumption graphs I was just showing.

The "idle" current draw (WiFi tethering on, phone on, cellular idle) in one
of the measurements was about 138mA (this was the zero line in the figures
I showed).

When the cellular interface became active, the current draw increased to
about 323mA (i.e. in this measurement the cellular interface drew about
185mA, or accounted for 57% of the handset's total current draw when
active).
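For anyone who wants to reproduce the arithmetic, this is all it is (the
constants are the measured values from above):

```python
# Arithmetic behind the figures above, using the measured values.
IDLE_MA = 138    # WiFi tethering on, phone on, cellular idle
ACTIVE_MA = 323  # cellular interface active

cellular_ma = ACTIVE_MA - IDLE_MA  # draw attributable to the cellular interface
share = cellular_ma / ACTIVE_MA    # its fraction of the handset's total draw

print(f"cellular interface: {cellular_ma} mA, {share:.0%} of total active draw")
```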

In the three artificial scenarios shown (40s periodic messages), the
average current drawn by the gateway's cellular interface was:

1) 20s difference in packets: 87mA

2) 5s difference in packets: 64mA

3) 1s difference in packets: 37mA
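As a rough illustration of what those averages imply: if one assumes (a
simplification on my part, not part of the measurement) that the interface
simply toggles between 0mA and the 185mA active draw relative to the
baseline, the averages correspond to these duty cycles:

```python
# Implied cellular duty cycle per scenario, under the simplifying assumption
# that the interface toggles between 0 mA and 185 mA relative to baseline.
ACTIVE_MA = 185  # measured draw of the active cellular interface

scenario_avg_ma = {"20s difference": 87, "5s difference": 64, "1s difference": 37}

for name, avg in scenario_avg_ma.items():
    duty = avg / ACTIVE_MA  # fraction of time the interface is active
    print(f"{name}: {avg} mA average -> roughly {duty:.0%} active")
```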

Obviously the achieved power savings range from virtually none to very
significant, depending entirely on how frequently the nodes in a scenario
emit periodic messages and on how much power the local area network
interface consumes. If the local area network were a low-power network
(802.15.4, BT-LE), the relative power consumed by the cellular interface
would be significantly higher than in this measurement, where the local
network was WLAN.

Best regards,

Teemu
--------------------------------------------------------------------
IETF IPv6 working group mailing list
ipv6@ietf.org
Administrative Requests: https://www.ietf.org/mailman/listinfo/ipv6
--------------------------------------------------------------------