This mode of use is powerful (TiVo for intel), but at least it limits use (in principle) to an actual incident, something determined (legally?) to be actionable.

I have worked on projects which DID NOT presume ubiquitous data like this, but which would become unnaturally powerful if the data coverage were near complete, even for just metadata.

I'm surprised when people believe that the "time-registered network topology" of all human communication (or just that of US citizens, or just those who have communicated with a non-US citizen, and those who have communicated with those... you get the picture) is a safe thing for any agency to collect and hold.
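To make "you get the picture" concrete, here is a toy Python sketch of how fast the transitive "has communicated with" set grows; the population size and contacts-per-person figures are made-up assumptions, chosen only to show the combinatorics:

# Hypothetical sketch: how fast "contacted someone who contacted someone..."
# grows. All numbers (population size, contacts per person) are assumptions
# chosen only to illustrate the combinatorics, not real figures.
import random

random.seed(0)
POPULATION = 100_000          # toy population (assumption)
CONTACTS_PER_PERSON = 30      # average distinct contacts (assumption)

# Build a random contact graph: person i knows ~CONTACTS_PER_PERSON others.
contacts = {
    i: set(random.sample(range(POPULATION), CONTACTS_PER_PERSON))
    for i in range(POPULATION)
}

# Breadth-first expansion from one "person of interest".
frontier = {0}
reached = {0}
for hop in range(1, 4):
    frontier = {c for p in frontier for c in contacts[p]} - reached
    reached |= frontier
    print(f"hop {hop}: {len(reached):,} people "
          f"({100 * len(reached) / POPULATION:.1f}% of population)")

With only ~30 contacts each, three hops already sweep in roughly a quarter of this toy population.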

Correlation does not imply causation... except in nearly every human intuition on the planet. Think of every person you have known (of) who was implicated in a crime (or terrorist attack)... do you want to be implicated by association? OK, maybe you feel that any investigation into your behaviour, triggered because a fellow FRIAM member got implicated (doing X, Y, or Z), will exonerate you once it is looked into further.

One of the worries I have, from the tools I have helped build and have seen along the way, is that at some point all evidence becomes circumstantial, and as the *probability* that you are guilty goes up and exceeds some threshold, it is too easy for many to conflate that probability with whether you actually did it. Or, more to the point, people become willing to "take the chance we are wrong" when the stakes go up. Around 9/11, and around horrific murders, serial cases, etc., people often speak as if the risk of a few false positives is acceptable.
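To see why "a few false positives" isn't what you actually get, here is a hedged back-of-the-envelope in Python; every number (population screened, count of actual bad actors, detector accuracy) is an invented assumption:

# Hypothetical sketch of the base-rate problem with "probability of guilt"
# thresholds. All numbers are invented for illustration.
population = 300_000_000       # people screened (assumption)
true_positives_in_pop = 1_000  # actual bad actors (assumption)
sensitivity = 0.99             # chance a bad actor is flagged (assumption)
false_positive_rate = 0.001    # chance an innocent is flagged (assumption)

flagged_guilty = true_positives_in_pop * sensitivity
flagged_innocent = (population - true_positives_in_pop) * false_positive_rate

# P(actually guilty | flagged) -- Bayes' rule in count form.
p_guilty_given_flag = flagged_guilty / (flagged_guilty + flagged_innocent)
print(f"innocent people flagged: {flagged_innocent:,.0f}")
print(f"P(guilty | flagged) = {p_guilty_given_flag:.4f}")  # ~0.0033

Even with an implausibly good detector, roughly three hundred innocent people get flagged for every guilty one. That is the number hiding behind the threshold people conflate with guilt.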

It *MAY BE*, but I claim that a statistical (especially network-oriented) approach to analysis is extra risky for many reasons. One is that I don't believe we (SFI included) have a sophisticated enough science of networks to analyze and judge honestly.

The last suite of projects I worked on in this domain (Multi Investment Decision Support Tool, Pre-Incident-Indicator-Analysis, and Faceted Ontologies) was very specifically interested in what could be *inferred* across a time-iterated series of events on a complex network. Among other things, we found that we needed to include uncertainty measures (how many tools or methodologies do you know that actually incorporate uncertainty measures and do it well?).
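As an illustration of what an uncertainty measure buys you, here is a minimal sketch of carrying one through a chain of inferred links; the events, the confidence values, and the naive independence assumption are all hypothetical:

# A minimal sketch of carrying an uncertainty measure through chained
# inferences on a network, rather than treating each inferred link as fact.
# The events, link confidences, and the independence assumption are all
# hypothetical -- real tools would need calibrated, correlated estimates.
from functools import reduce

# Confidence that each link in an inference chain is real (assumptions).
chain = [
    ("A phoned B",         0.95),
    ("B met C",            0.80),
    ("C funded the event", 0.70),
]

# Under (naive) independence, confidence in the whole chain is the product.
joint = reduce(lambda acc, link: acc * link[1], chain, 1.0)
print(f"confidence in full chain A -> event: {joint:.2f}")  # 0.53

Even three fairly strong links leave the end-to-end inference near a coin flip; that is the kind of number an analyst should be forced to look at, rather than a bare "linked".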

In answer to Owen's question "what can they do with it?", the answer is *plenty*. The corollary question, "what can they do with it that is well understood and honest?", is much harder to answer. My experience is that there is a LOT we can do with it that is *intuitively* compelling, but very little for which we have formal proofs that transform what would normally be considered "circumstantial" into "proof". In the meantime, gung-ho law enforcement (and the general public) will continue to charge forward, cheering abuses of power without realizing that is what they are doing.

- Steve

``I think they're taking the usual approach to large data sets, save it all
(or as much as you can) just in case you find an anomaly you want to
study.''

A short-term sample of all traffic could be used analogously to a UAV video
recording. Take any suspect or event and look for any and all signals
leading to them backward in time. Declare the sources of those signals
suspects, find the correlated physical sites, and compromise them. Recurse.
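For the curious, a sketch of that recursion over a toy log of timestamped messages (the event data and field names are invented for illustration):

# A sketch of the "recurse backward in time" pattern described above,
# over a toy log of timestamped communications.
from collections import namedtuple

Msg = namedtuple("Msg", "time src dst")

log = [
    Msg(1, "carol", "bob"),
    Msg(2, "dave",  "carol"),   # later than carol->bob: outside the trace
    Msg(3, "bob",   "alice"),
    Msg(4, "alice", "target"),
]

def trace_back(log, suspect, before, seen=None):
    """Recursively mark anyone whose earlier message reaches the suspect."""
    seen = seen if seen is not None else {suspect}
    for m in log:
        if m.dst == suspect and m.time < before and m.src not in seen:
            seen.add(m.src)
            trace_back(log, m.src, m.time, seen)   # recurse on new suspect
    return seen

print(trace_back(log, "target", before=5))
# members: target, alice, bob, carol -- dave falls outside the time window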

Marcus




============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
to unsubscribe http://redfish.com/mailman/listinfo/friam_redfish.com


