Thank you for highlighting this important issue. With recent New York Times
reports on image recognition software labeling US congressmen as monkeys and
criminal courts using "algorithms" for bail and sentencing decisions, I think
there is cause for serious concern.

All involved with software and computer systems should be aware of such things.

Given the general level of incompetence (see the Boeing 737 MAX, programmed
to fly into the ground if one external sensor is hit by a bird) and the coming
era of robocops firing "only non-lethal" weapons at targets identified "by
algorithm", I think there is reason to worry.


K.O.


On Sun, Apr 18, 2021 at 07:24:39AM -0400, LaToya Anderson wrote:
> Data does not remove bias. And one can and should both read the article and
> watch the movie.
> 
> STEM Academy Instructor
> 
> On Sun, Apr 18, 2021, 6:59 AM Nico Kadel-Garcia <nka...@gmail.com> wrote:
> 
> > On Sun, Apr 18, 2021 at 6:22 AM Andrew C Aitchison
> > <and...@aitchison.me.uk> wrote:
> > >
> > > On Sun, 18 Apr 2021, Nico Kadel-Garcia wrote:
> > >
> > > > The movie is very strong on "feels", very poor indeed on data.
> > > >
> > > > A much better article, with far less "feels"
> > >           ^^^^^^
> > > Is that a deliberate example of the bias in the video ?
> >
> > No, it's an evaluation of the video and a pointer to a much more
> > succinct, more specific article.
> >

-- 
Konstantin Olchanski
Data Acquisition Systems: The Bytes Must Flow!
Email: olchansk-at-triumf-dot-ca
Snail mail: 4004 Wesbrook Mall, TRIUMF, Vancouver, B.C., V6T 2A3, Canada