Hiya,

First, a caveat - I don't know and am guessing how these
systems are built (if you have pointers, those'd be good
to see), but nonetheless...

On 11/02/2021 01:57, Jack Visoky wrote:
Hi Stephen,

Thanks for elaborating.

From the general standpoint I think we can both agree an attacker can
more easily cause harm if packets can be modified than if they cannot
be, especially in the use case examples.

I don't think that's the interesting question here. We're
only discussing the differences between ciphersuites that
do, or do not, provide confidentiality.

Now of course I can't
and don't claim in all cases of a robotic arm (or any other general
example) an attacker can't cause damage by simply reading data. For
the example you give, it sounds like "changes to the controller"
involve things like packet drops. The attacker can't simply connect
to the controller and reconfigure it due to the TLS authentication,

Ah. There I don't agree. Attackers don't have to play by
our rules. They may have a supply-chain or other attack on
some of the upstream controller components and/or control an
application layer at some point. If they can also see the
fine-grained effects of their changes at the downstream
arm-movement, then I'm arguing that they may be in a much
better position to effect a persistent and agile attack that
results in paint-jobs that pass QA checks, but that later
cause expensive recalls. If OTOH, the active attack code
can't benefit from "visibility" of the fine-grained effects
of its changes on the paint arms, then my assertion is that
QA and other checks should have an easier time being
effective in detecting the dodgy outcome of the borked
painting process.

and can't modify packets due to the same. I'd claim that in general
the "packet dropping attack" becomes at best marginally more
effective, and possibly no more effective, when the attacker knows
the position/speed. In general I'd say it wouldn't be easy to change
a centrifuge's spin rate if all connections have SHA-256 HMACs on
their data and use mutual
authentication via certificates and/or PSKs. Again a generality but
usually these systems need to be designed such that network noise,
unexpected power cycle, etc. can be handled without causing major
harm (although conversely any downtime is usually money lost, but
again you don't need exact position data to cause that). We can
certainly debate the edges on this, but I'd still say that a system
like this makes a giant improvement moving from no communication
security to TLS without encryption, and doesn't gain anything close
to that benefit by then moving to encryption.
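(As a toy illustration of the integrity-only arrangement described
above: not real TLS 1.3 record protection, and with a made-up key and
payload, this sketch shows how the data stays readable on the wire
while tampering is still detectable:)

```python
# Toy sketch of integrity-without-confidentiality using Python's
# stdlib HMAC. Real TLS 1.3 record protection differs (key schedule,
# per-record nonces); the key and payload are invented for the example.
import hmac
import hashlib

key = b"traffic-key-from-the-handshake"  # hypothetical shared key
position = b"arm_x=102.5,arm_y=88.0"     # application data, sent in clear

tag = hmac.new(key, position, hashlib.sha256).digest()
wire = position + tag  # an eavesdropper reads the position; the 32-byte
                       # tag only prevents undetected modification

# Receiver side: verify the tag before trusting the data.
data, recv_tag = wire[:-32], wire[-32:]
expected = hmac.new(key, data, hashlib.sha256).digest()
assert hmac.compare_digest(recv_tag, expected)
```

A flipped bit anywhere in `wire` makes the comparison fail, but
nothing here stops anyone reading the position data, which is exactly
the point in dispute.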

You'll not be surprised that I'm unconvinced:-) Aside from my
almost movie-plot attack (but recalling that Stuxnet really did
happen), I'd be very surprised if specific robot arms
were programmed to use different ciphersuites according to
the actual situational risks involved. And if they could be,
it'd still be easier to just use the recommended ciphersuites
and not bother with being such an awkward child and omitting
confidentiality. Were I procuring equipment, I'd much prefer
to buy robots that had the same transport layer security as
all the other parts of my overall manufacturing system and
not have to do the special additional analysis caused by odd
stacks that behave differently in subtle ways. In such a case
I'd really want a massive payoff for such subtle divergences,
before I'd even think about ok'ing 'em. For the draft in
question, I honestly cannot see even the possibility of such
a massive payoff.

All that said I am
absolutely not against encryption, and I think it should be used as
much as possible even in industrial/IoT cases. I would just like to
recognize that there are some situations where it isn't needed.

I consider that last assertion unproven and likely false. The
game here is defining (or not defining) ciphersuites that are
useful in many, many scenarios. My conclusion is that we all
lose if we do as this draft proposes and define ciphersuites
that can be dangerous in unexpected and hard to analyse ways,
should they get widely deployed. (And I hope we've all
learned that undeploying these things is a big pile of
painful work, to put it very nicely:-)

Cheers,
S.


Thanks,

--Jack

-----Original Message-----
From: Stephen Farrell <stephen.farr...@cs.tcd.ie>
Sent: Wednesday, February 10, 2021 6:46 PM
To: Jack Visoky <jmvis...@ra.rockwell.com>; John Mattsson <john.mattsson=40ericsson....@dmarc.ietf.org>; TLS@ietf.org
Subject: Re: [TLS] EXTERNAL: TLS 1.3 Authentication and Integrity only Cipher Suites


Hiya,

On 10/02/2021 22:56, Jack Visoky wrote:
Hi Stephen,

I believe the case of the robotic arm which paints cars would be a
reasonable example of a case where confidentiality of data is not
required. I fully admit that it's not generalizable to all robotic
arms painting all cars, but I do believe it holds for many of
these cases. At the risk of treading this ground again, could you
elaborate on your thoughts around this use case?

Knowing the positioning of an arm from afar, via snooping on the n/w,
could allow an attacker to fine-tune an attack aimed at e.g. reducing
paint quality, forcing a recall, via changes to a controller (i.e.
not one of the TLS endpoints). That's not very different to changing
the spin-rate of a centrifuge, an attack that we know was mounted,
some time ago.

It's not hard to come up with arguments against these use cases. I
don't claim that those are winning arguments, and I don't expect
those who desire "visibility" will be won over, but I do claim these
are valid arguments for wanting the "best" transport layer security
we can get turned on everywhere all the time, to the extent possible.
I for one believe experience so far has shown that to be the wisest
approach in general.

S.


Thanks,

--Jack

-----Original Message-----
From: TLS <tls-boun...@ietf.org> On Behalf Of Stephen Farrell
Sent: Wednesday, February 10, 2021 7:08 AM
To: John Mattsson <john.mattsson=40ericsson....@dmarc.ietf.org>; TLS@ietf.org
Subject: Re: [TLS] EXTERNAL: TLS 1.3 Authentication and Integrity only Cipher Suites


Hiya,

I realise it's not proposed as a wg document, but fwiw, I think
John is quite correct below. The only additional point I'd add is
that I've seen cases over the years where the fact that
confidentiality really *is* required only became clear when people
actually considered how to attack systems. I'd be happy to bet
beers that will be the case with some examples mentioned in the
draft, esp the wandering robotic arm.

This seems to me like a not-very-well-done write-up of a bad idea.

S.

On 10/02/2021 09:14, John Mattsson wrote:
Hi,

- The draft has a lot of claims regarding benefits:

"strong requirement for low latency." "minimize the
cryptographic algorithms are prioritized" "important for latency
to be very low." "pay more for a sensor with encryption
capability" "come with a small runtime memory footprint and
reduced processing power, the need to minimize the number of
cryptographic algorithms used is prioritized."

I don't think this draft should be published as long as it gives
the idea that sacrificing confidentiality has significant
benefits for latency, memory, processing power, and cost. This is
in general not the case.

The two cipher suites TLS_SHA256_SHA256 and TLS_SHA384_SHA384
defined by the draft cause much more message expansion (32- and
48-byte tags instead of 16 or 8 bytes) than the already
registered cipher suites for TLS 1.3. In many IoT radio systems
with small frames this will lead to significantly increased
latency. I think that needs to be mentioned.
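(A back-of-the-envelope comparison of that per-record expansion,
assuming a small 20-byte sensor payload; the payload size is invented
for illustration, while the tag sizes are those of the suites named
above:)

```python
# Bytes on the wire per TLS 1.3 record for a small payload:
# 5-byte record header + payload + 1-byte inner content type +
# authentication tag. Only the tag size differs between suites.
PAYLOAD = 20  # bytes of sensor data per record (assumed)

tag_sizes = {
    "TLS_AES_128_CCM_8_SHA256": 8,
    "TLS_AES_128_GCM_SHA256": 16,
    "TLS_SHA256_SHA256": 32,
    "TLS_SHA384_SHA384": 48,
}

for suite, tag in tag_sizes.items():
    record = 5 + PAYLOAD + 1 + tag
    print(f"{suite}: {record}-byte record ({tag}-byte tag)")
```

On those assumptions a record that costs 34 bytes with CCM_8 costs 58
bytes with TLS_SHA256_SHA256, so the integrity-only suites are larger
on the wire, not smaller.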


- The draft has a ridiculous number of sentences saying that
confidentiality is not strictly needed.

"do not require confidentiality" "privacy is not strictly
needed" "no strong requirement for confidentiality" "no
requirement to encrypt messages" "no need for confidentiality"
"reduced need for confidentiality" "confidentiality requirements
are relaxed" "do not require confidential communications" "does
not convey private information" "without requiring the
communication to/from the robotic arm to be encrypted" "doesn't
grant the attacker information that can be exploited" "no
confidentiality requirements"

It would be more honest if the draft simply stated that "there
are use cases that require visibility". If visibility is not a
requirement for the use cases, I think the IETF could help you to
standardize SHA-2-only cipher suites offering confidentiality.


- The draft mentions that the security considerations regarding
confidentiality and privacy do not hold. The draft does not
mention that it breaks one of the stated security properties of
TLS 1.3, namely "Protection of endpoint identities". This is
actually quite problematic. EAP-TLS 1.3 relied on this stated TLS
1.3 property to be true.

John

_______________________________________________
TLS mailing list
TLS@ietf.org
https://www.ietf.org/mailman/listinfo/tls


