On Thu, Jan 29, 2015 at 4:28 PM, Richer, Justin P. <[email protected]>
wrote:

>  I think you missed my point. I wasn't saying that I *liked*
> canonicalization and normalization but rather that you *need* either it or
> something *equivalent* in order to get the same set of bits out of either
> end. Like you, what I was saying was that you need *something* that makes
> your bit stream stable.
>

Sure, but that does not need to be in the signature format. I would prefer
to simply tell people that they have to get this package from A to B with
absolutely no modifications, or it won't work.
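A minimal sketch of that "no modifications" stance: sign the exact bytes and require the receiver to verify those same bytes, with no canonicalization step in between (the key and payload here are purely illustrative):

```python
import hashlib
import hmac

key = b"shared-secret"           # illustrative shared key
package = b'{"msg":"hello"}'     # the exact bytes carried from A to B

# A computes a MAC over the raw bytes as sent
tag = hmac.new(key, package, hashlib.sha256).hexdigest()

# B verifies over exactly the bytes received -- same bytes, same tag
assert hmac.compare_digest(
    tag, hmac.new(key, package, hashlib.sha256).hexdigest()
)

# Even a whitespace-only change (same JSON value!) breaks verification
altered = b'{"msg": "hello"}'
assert tag != hmac.new(key, altered, hashlib.sha256).hexdigest()
```

The point of the sketch is that the contract is on the bytes, not the JSON value: any intermediary that reserializes the JSON invalidates the signature.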

> I'm actually in favor of the JOSE approach, which is exactly to keep the
> original bits around by protecting them with Base64url over the network so
> that they don't get munched. As an implementor, this is fantastic, as it
> makes my code much simpler in terms of both generation and consumption.
>
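The JOSE approach Justin describes can be sketched roughly as follows: serialize the payload once, base64url-encode it (which is what keeps the signed bits intact over the network), and sign the encoded form, in the style of a JWS compact serialization. The header, payload, and key below are illustrative, not from the thread:

```python
import base64
import hashlib
import hmac
import json


def b64url(data: bytes) -> str:
    # JWS-style base64url without padding (see RFC 7515)
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode("ascii")


key = b"secret-key"                         # illustrative HMAC key
payload = {"iss": "example", "amount": 10}  # illustrative claims

# Serialize exactly once; these bytes never need to be reconstructed
payload_bytes = json.dumps(payload).encode("utf-8")
header_bytes = json.dumps({"alg": "HS256"}).encode("utf-8")

# The signing input is the base64url-armored bytes, so intermediaries
# that would reformat JSON never touch what was actually signed
signing_input = b64url(header_bytes) + "." + b64url(payload_bytes)
sig = hmac.new(key, signing_input.encode("ascii"), hashlib.sha256).digest()
token = signing_input + "." + b64url(sig)
```

A verifier splits the token, recomputes the HMAC over the two encoded segments exactly as received, and only then decodes the payload, so no canonicalization is ever required.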

I don't like that for the general case, though, as the main advantage of JSON
is that it is human-readable, and I can't translate Base64 back and forth in
my head any more.


> I have the JSON objects where I need them and the raw streams where I need
> them, and I can easily keep them separated with a reasonable assumption
> about them not being confused with each other. For heaven's sake, don't do
> canonicalization. But at the same time, don't assume that a JSON blob is
> going to be kept pristine as a string by any JSON-processing system.
>

That is one of the reasons for insisting on a sharp division between 'that
which is signed' and 'the JSON bit'.


> So we're actually in violent agreement on this point. In my view, JOSE's
> approach of keeping the original bits around is the desirable one by far.
>

I thought you might intend something like that, but it's not what I think you
said, and it gave me an opportunity to use the phrase "steaming piles of
stupid" :-)
_______________________________________________
jose mailing list
[email protected]
https://www.ietf.org/mailman/listinfo/jose