On Thursday, 4 February 2016 13:50:24 PST Mitch Kettrick wrote:
> Hi Thiago,
> 
> Thank you for your reply.
> 
> Regarding bandwidth and power consumption, I've never implemented a protocol
> in HW, so you may be right that sending extra payload won't have an effect.
> We all come to this with our direct experience and our assumptions, and
> oftentimes we're wrong no matter how right we think we are. :)

Hello Mitch

You're right, and at this point we're both speculating. In any case, the power 
itself is not the issue here, so let's table it.

> Regarding the fact that in the end, the Client dictates the interface used,
> I agree. But many people who write client applications won't know that if
> the Default is oic.if.baseline, they could save power by adding an oic.if.s
> query to their request; otherwise the Server will be forced to operate in a
> less efficient way.
> 
> If we don't agree on giving Servers the flexibility to set their own Default
> Interface based on the application, one solution, as I said before, is to
> make the Interface that uses the fewest bits the Default Interface wherever
> possible, to ensure that if Servers have to send "the whole package", it's
> because the Client explicitly asked for it.

I agree on having multiple interfaces, and I agree on making sure that 
application developers know about the more efficient ones and choose to use 
them whenever applicable.
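
To make that concrete (a sketch only; the resource path and host below are 
made up for illustration), a client that only needs the sensed value can ask 
for the sensor interface explicitly by adding the "if" query parameter to its 
request, instead of relying on whatever default the server advertises:

  GET coap://sensor.example/a/temperature?if=oic.if.s

versus requesting the full baseline representation:

  GET coap://sensor.example/a/temperature?if=oic.if.baseline

With the first form the Server can reply with just the sensed value; with the 
second it has to send the whole package.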

I don't agree that setting the default is the way to achieve the above. At 
best, it has zero benefit or impact, since a well-written client specifies the 
interface it wants and the default never enters into the decision-making. At 
worst, it's a red herring and a source of confusion, leading to poorly written 
applications that fail to communicate when the default on a given device is 
not what they expected.

(hint: add this to the certification testing; the default should *always* be a 
nonsensical interface that no one should be using)

-- 
Thiago Macieira - thiago.macieira (AT) intel.com
  Software Architect - Intel Open Source Technology Center
