Re: [EXTERNAL] Re: Code bias video, watch it ASAP

2021-04-28 Thread Nico Kadel-Garcia
On Tue, Apr 27, 2021 at 12:49 PM Queen, Steven Z. (GSFC-5910)
<0eed424118a3-dmarc-requ...@listserv.fnal.gov> wrote:
>
> I don't think this list is an appropriate place for political discussions.  
> Hopefully an administrator will intervene.
> If this continues, I will unsubscribe.

Sorry about that, I'll shush.


Re: Code bias video, watch it ASAP

2021-04-27 Thread Yasha Karant

You state:

If you are just doing it for the money, please go into investing, not 
into engineering.


I fully agree.  However, in a monetized or equivalent socio-economic 
system (e.g., the neo-liberal USA, the nomenklaturist former 
USSR, or the neo-fascist economy of the current PRC), engineering has no 
control over how the monetizers behave.  Following a professional 
code of conduct and ethics, and assuming that there is no compulsion 
(imprisonment, execution, etc.), the only choice is to withhold work. I 
do not know the details of the system you engineered (and presumably 
produced) that you describe. However, if the corporation (as I presume 
you took that legal ploy to avoid unlimited personal liability) that has 
the intellectual property sells that IP to a profiteer investor, then 
your product will be lessened in quality, raised in price, and 
outsourced to the lowest cost of production (cost efficiency). If you 
personally own the IP and can afford IP infringement legal battles, then 
the copycat infringer with lower costs -- and lower quality -- perhaps 
can be stopped from obtaining market dominance. The argument that 
"computer software" cannot kill people, even monetary accounting 
software, is demonstrably false, as you point out below for real-time 
"control" systems.  Whilst I agree with making the "dancing bearware" a 
criminal offense on the part of the profiteer management that forced the 
production of such defective and dangerous products (all software has 
defects, but all defects need to have fail-safes and overrides, and 
there should be enough testing -- with edge and impossible conditions -- 
to detect "all" the dangerous defects), one must recall that most 
systems of government are the best that "money" can buy, historically 
ending in societal collapse.  To the best of my knowledge, there was no 
criminal prosecution in the USA of any senior management that made the 
Ford Pinto decisions.
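The fail-safe-and-override principle described above (all software has defects, so dangerous defects must fail safe) can be sketched in a few lines. This is illustrative only; the names `recognize` and `admit_decision` are hypothetical stand-ins, not any real product's API:

```python
def recognize(image):
    """Stand-in for a face-recognition call; returns (match, confidence).

    A real system would invoke a model here; this stub always abstains."""
    return None, 0.0

def admit_decision(image, threshold=0.99):
    """Admit only on a high-confidence match; otherwise defer to a human.

    The point is that the *default* on any defect or uncertainty is
    human review, never a locked door."""
    try:
        match, confidence = recognize(image)
    except Exception:
        return "human_review"   # any internal defect fails safe
    if match is not None and confidence >= threshold:
        return "admit"
    return "human_review"       # low confidence also fails safe
```

The design choice is that every failure path converges on the human override rather than on denial of entry.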



Re: Code bias video, watch it ASAP

2021-04-27 Thread Keith Lofstrom
On Tue, Apr 27, 2021 at 09:04:53AM -0400, Nico Kadel-Garcia wrote:
> The movie is, itself, profoundly biased. It didn't explore at all why
> a public housing project might benefit from cameras on the door of a
> densely populated building with numerous poor, old, or unhealthy
> tenants. 

On Sun, Apr 18, 2021 at 7:24 AM LaToya Anderson  
wrote:
> Data does not remove bias. And one can and should both read the article and 
> watch the movie.

I imagine the camera was helpful for young white men
entering the building.  But how does that help the old
non-white women who are locked out of their apartments
because the software fails 30% of the time for them?

An automated camera makes sense if software is perfect,
but the point of the film and the paper is that the
software is not perfect.  An automated camera should
work BETTER than an adept (and unbiased) human being paying
attention to a monitor, not just CHEAPER.
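The kind of test that would expose a 30% failure for one group is a disaggregated one: measure the false-rejection rate per demographic group rather than a single overall average. A minimal sketch, with hypothetical names and made-up trial data:

```python
from collections import defaultdict

def false_rejection_rates(trials):
    """trials: iterable of (group, accepted) pairs for genuine tenants.

    Returns {group: fraction of genuine attempts that were rejected}."""
    attempts = defaultdict(int)
    rejects = defaultdict(int)
    for group, accepted in trials:
        attempts[group] += 1
        if not accepted:
            rejects[group] += 1
    return {g: rejects[g] / attempts[g] for g in attempts}

# Hypothetical data: group A is rejected 3% of the time, group B 30%.
trials = ([("A", True)] * 97 + [("A", False)] * 3
          + [("B", True)] * 70 + [("B", False)] * 30)
rates = false_rejection_rates(trials)
# The pooled rate (16.5%) hides that group B fails nearly a third
# of the time; only the per-group breakdown reveals it.
```

An aggregate accuracy number can look acceptable while one group bears almost all of the failures.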

"Dancing bearware" that only pretends to do the job should
result in prosecution and hefty penalties for the software
designers and decision makers if their "cost saving"
replacement of trained human security guards results in 
a crime ... letting a criminal in or locking a tenant
out, to be robbed on the doorstep.

I too was bothered by the film's seeming "lefty bias",
but "my side" is *human achievement* ... engaging all
8 billion of us.  Leaving people out is economically
suboptimal, but most organizations are insensitive to
the costs they impose outside their organization.  We
create (often bad) laws to internalize those costs so the
organizations MUST pay attention.  Sadly, laws usually
just make organizations pay attention to loopholes.

If the automated cameras are redesigned to do their job
perfectly, I would love that.  If diligent security
guards are trained and employed for the task, I'm for
that as well. 

What I am not for is replacing quality human effort with
slapdash "cost cutting", which often means "job cutting".
In this case, putting human security guards on the dole,
or not hiring enough competent software designers to
properly design and properly TEST recognition software
that works for everyone, not just software designers
and ethnically similar product purchasers.

I imagine a room full of $10/hour Chinese programmers 
designing this software for Chinese customers.  I bet
their software would do a good job recognizing blacks
in the US if there were enough blacks in China to test
their software with.  Offshoring has its costs as well,
and the point of the film is that the costs are imposed
on those least able to pay them.  

I also imagine devolving software purchasing decisions
downwards to the people who are affected by them.  In
my ideal world, some of those tenants would be involved
in testing and selecting the software. 

Or training tenants to look at security cameras as a
part-time job; perhaps for a rent reduction.  Software
might be used to ensure that those "informal employees"
are doing the task they are paid for, but that could
have bias as well.  

Quality is hard work.  Nobody is perfect, and some folks
are quite imperfect - thugs in Armani suits.  Automating
imperfection is the opposite of quality, while designing
to compensate for imperfection is the path to continuous
quality improvement.

But hey, I'm a chip designer.  I invented a circuit that
is used to ultra-cheaply identify individual electronic
devices, WITHOUT identifying the individuals using them.
I wrote A LOT of Linux software to test and improve those
designs; we got the failure rate down below 30 parts per
million, and we designed fallbacks for the unhappy 30.
The customers would have accepted worse - but I would not.

At the end of the day, our professional satisfaction rests
on what we have accomplished, not on what we are paid to
do it.  If you are just doing it for the money, please go
into investing, not into engineering.

Keith

-- 
Keith Lofstrom  kei...@keithl.com


Re: Code bias video, watch it ASAP

2021-04-27 Thread Yasha Karant
Which forum is appropriate, if not one to which the engineers and 
technologists who develop the technology (often under requirement from 
profiteers) actually subscribe?  Most of the fora in which these matters are 
discussed are not read by the majority of engineers and technologists.


On 4/27/21 11:57 AM, LaToya Anderson wrote:
I pray that folks do continue this discussion on the appropriate forum. 
It's the only way the tech space is going to improve.


STEM Academy Instructor

On Tue, Apr 27, 2021, 2:26 PM Patrick Riehecky wrote:


Hello friends,

We typically like to let some level of off topic discussion run for a
little while.  It tends to fizzle out after a short while and allows us
to avoid strong moderation.  Calls for moderation on this thread have
prompted this response.

The Scientific Linux Users list is a place for help with or questions
about Scientific Linux.

Fermilab Statement of Diversity : https://diversity.fnal.gov/


We do not wish to minimize or dismiss conversations about social
change, but rather direct them to a place where folks are working on
those topics.



Re: Code bias video, watch it ASAP

2021-04-27 Thread LaToya Anderson
I pray that folks do continue this discussion on the appropriate forum.
It's the only way the tech space is going to improve.

STEM Academy Instructor

On Tue, Apr 27, 2021, 2:26 PM Patrick Riehecky  wrote:

> Hello friends,
>
> We typically like to let some level of off topic discussion run for a
> little while.  It tends to fizzle out after a short while and allows us
> to avoid strong moderation.  Calls for moderation on this thread have
> prompted this response.
>
> The Scientific Linux Users list is a place for help with or questions
> about Scientific Linux.
>
> Fermilab Statement of Diversity : https://diversity.fnal.gov/
>
> We do not wish to minimize or dismiss conversations about social
> change, but rather direct them to a place where folks are working on
> those topics.
>


RE: Code bias video, watch it ASAP

2021-04-27 Thread Patrick Riehecky
Hello friends,

We typically like to let some level of off topic discussion run for a
little while.  It tends to fizzle out after a short while and allows us
to avoid strong moderation.  Calls for moderation on this thread have
prompted this response.

The Scientific Linux Users list is a place for help with or questions
about Scientific Linux.

Fermilab Statement of Diversity : https://diversity.fnal.gov/

We do not wish to minimize or dismiss conversations about social
change, but rather direct them to a place where folks are working on
those topics.


Re: [EXTERNAL] Re: Code bias video, watch it ASAP

2021-04-27 Thread Yasha Karant

Fermilab is not a government facility, but it is so funded:

Fermi National Accelerator Laboratory
Managed by Fermi Research Alliance, LLC
for the U.S. Department of Energy Office of Science

and although this listserv is on a USA Federal government network, it 
is not a classified (clandestine, DoD, security clearance, etc.) 
network.  The Hatch Act applies if one discusses specific political 
persons or parties (e.g., a specific USA former president, a specific 
agenda such as voter suppression by a specific political party), but the 
Hatch Act does not apply to societal issues per se. Under those other 
governments that I mentioned, their "hatch acts" do apply -- to 
everything and by everybody at all times and places.


On 4/27/21 10:52 AM, Queen, Steven Z. (GSFC-5910) wrote:
The Hatch Act makes it illegal for federal workers to discuss politics.  
This is a government list.



*From:* owner-scientific-linux-us...@listserv.fnal.gov 
 on behalf of Yasha 
Karant 

*Sent:* Tuesday, April 27, 2021 1:18 PM
*To:* Mailing list for Scientific Linux users worldwide 


*Subject:* Re: [EXTERNAL] Re: Code bias video, watch it ASAP
For those of us on this list who are ACM members, I quote:

https://www.acm.org/code-of-ethics



A computing professional should...
1.1 Contribute to society and to human well-being, acknowledging that
all people are stakeholders in computing.

Similar statements exist in the ethics codes of other
computing/informatics professional societies.

Despite the fact that others have indicated that this, or other
societal issues, are inappropriate for this list, and have threatened to
unsubscribe, some discussion of this matter is appropriate and correct
within the ACM code (and most other codes).  This list does not conform
to the codes from the former Third Reich, former USSR, present PRC,
etc., for which any discussion of societal failings of the in-power
control group persons is prohibited and often punishable by the
government controlled by the relevant in-power group.

On 4/27/21 9:49 AM, Queen, Steven Z. (GSFC-5910) wrote:
I don't think this list is an appropriate place for political 
discussions.  Hopefully an administrator will intervene.

If this continues, I will unsubscribe.


*From:* owner-scientific-linux-us...@listserv.fnal.gov 
 on behalf of Nico 
Kadel-Garcia 

*Sent:* Tuesday, April 27, 2021 9:04 AM
*To:* LaToya Anderson 
*Cc:* Andrew C Aitchison ; Keith Lofstrom 
; Mailing list for Scientific Linux users worldwide 


*Subject:* [EXTERNAL] Re: Code bias video, watch it ASAP
On Sun, Apr 18, 2021 at 7:24 AM LaToya Anderson
 wrote:


Data does not remove bias. And one can and should both read the article and 
watch the movie.

STEM Academy Instructor

Re: [EXTERNAL] Re: Code bias video, watch it ASAP

2021-04-27 Thread Jon Pruente
No, it doesn't. It prohibits Federal employees from discussing partisan
politics while on the job or on Federal property, or with someone they have
an influence on at work. It does not bar them from participating in
political speech on mailing lists.

https://www.dla.mil/AboutDLA/News/NewsArticleView/Article/954090/a-little-less-conversation-the-hatch-act-and-discussions-with-coworkers/
 

On Tue, Apr 27, 2021 at 12:52 PM Queen, Steven Z. (GSFC-5910) <
0eed424118a3-dmarc-requ...@listserv.fnal.gov> wrote:

> The Hatch Act makes it illegal for federal workers to discuss politics.
> This is a government list.
>
> --
> *From:* owner-scientific-linux-us...@listserv.fnal.gov <
> owner-scientific-linux-us...@listserv.fnal.gov> on behalf of Yasha Karant
> 
> *Sent:* Tuesday, April 27, 2021 1:18 PM
> *To:* Mailing list for Scientific Linux users worldwide <
> scientific-linux-us...@listserv.fnal.gov>
> *Subject:* Re: [EXTERNAL] Re: Code bias video, watch it ASAP
>
> For those of us on this list who are ACM members, I quote:
>
>
> https://www.acm.org/code-of-ethics
>
> A computing professional should...
> 1.1 Contribute to society and to human well-being, acknowledging that
> all people are stakeholders in computing.
>
> Similar statements exist in the ethics codes of other
> computing/informatics professional societies.
>
> Despite the fact that others have indicated that this, or other
> societal issues, are inappropriate for this list, and have threatened to
> unsubscribe, some discussion of this matter is appropriate and correct
> within the ACM code (and most other codes).  This list does not conform
> to the codes from the former Third Reich, former USSR, present PRC,
> etc., for which any discussion of societal failings of the in-power
> control group persons is prohibited and often punishable by the
> government controlled by the relevant in-power group.
>
> On 4/27/21 9:49 AM, Queen, Steven Z. (GSFC-5910) wrote:
> > I don't think this list is an appropriate place for political
> > discussions.  Hopefully an administrator will intervene.
> > If this continues, I will unsubscribe.
> >
> > 
> > *From:* owner-scientific-linux-us...@listserv.fnal.gov
> >  on behalf of Nico
> > Kadel-Garcia 
> > *Sent:* Tuesday, April 27, 2021 9:04 AM
> > *To:* LaToya Anderson 
> > 

Re: [EXTERNAL] Re: Code bias video, watch it ASAP

2021-04-27 Thread Konstantin Olchanski
On Tue, Apr 27, 2021 at 04:49:00PM +, Queen, Steven Z. (GSFC-5910) wrote:
>
> I don't think this list is an appropriate place for political discussions.
>

I agree, I think the OP should not have brought this subject on this list.

>
> Hopefully an administrator will intervene.
> If this continues, I will unsubscribe.
> 

Unfortunately, technical, policy and political issues mix together and
there is no avoiding that. Ignoring them is dangerous, they have a way
of catching up to you in whatever cave or monastery or retreat you choose
to hide in.

K.O.


> 
> From: owner-scientific-linux-us...@listserv.fnal.gov 
>  on behalf of Nico 
> Kadel-Garcia 
> Sent: Tuesday, April 27, 2021 9:04 AM
> To: LaToya Anderson 
> Cc: Andrew C Aitchison ; Keith Lofstrom 
> ; Mailing list for Scientific Linux users worldwide 
> 
> Subject: [EXTERNAL] Re: Code bias video, watch it ASAP
> 
> On Sun, Apr 18, 2021 at 7:24 AM LaToya Anderson
>  wrote:
> >
> > Data does not remove bias. And one can and should both read the article and 
> > watch the movie.
> >
> > STEM Academy Instructor
> 
> Data rather than mere exposition helps prevent bias. How do you refute
> or counter unfair bias except with data?
> 
> The movie is, itself, profoundly biased. It didn't explore at all why
> a public housing project might benefit from cameras on the door of a
> densely populated building with numerous poor, old, or unhealthy
> tenants. The movie was an icon of "Critical Theory", portraying the
> attempt to use science and engineering for social problems as a plot
> against the oppressed.
> 
> I've lived in scary neighborhoods of London. London accepts and
> expects a degree of CCTV monitoring that is outrageous to Americans.
> Sadly, citizens can't *get* the videos when a crime occurs, and
> photographic evidence can be misused against the innocent. Been there,
> done that, watched a London parking cop frame the photos they took to
> document a parking ticket, really ticked him off when I very obviously
> took photos at angles that showed the car was, in fact, parked near a
> sign that gave permission and curb markings that matched.

-- 
Konstantin Olchanski
Data Acquisition Systems: The Bytes Must Flow!
Email: olchansk-at-triumf-dot-ca
Snail mail: 4004 Wesbrook Mall, TRIUMF, Vancouver, B.C., V6T 2A3, Canada


Re: [EXTERNAL] Re: Code bias video, watch it ASAP

2021-04-27 Thread Queen, Steven Z. (GSFC-5910)
The Hatch Act makes it illegal for federal workers to discuss politics.  This 
is a government list.


From: owner-scientific-linux-us...@listserv.fnal.gov 
 on behalf of Yasha Karant 

Sent: Tuesday, April 27, 2021 1:18 PM
To: Mailing list for Scientific Linux users worldwide 

Subject: Re: [EXTERNAL] Re: Code bias video, watch it ASAP

For those of us on this list who are ACM members, I quote:

https://www.acm.org/code-of-ethics

A computing professional should...
1.1 Contribute to society and to human well-being, acknowledging that
all people are stakeholders in computing.

Similar statements exist in the ethics codes of other
computing/informatics professional societies.

Despite the fact that others have indicated that this, or other
societal issues, are inappropriate for this list, and have threatened to
unsubscribe, some discussion of this matter is appropriate and correct
within the ACM code (and most other codes).  This list does not conform
to the codes from the former Third Reich, former USSR, present PRC,
etc., for which any discussion of societal failings of the in-power
control group persons is prohibited and often punishable by the
government controlled by the relevant in-power group.

On 4/27/21 9:49 AM, Queen, Steven Z. (GSFC-5910) wrote:
> I don't think this list is an appropriate place for political
> discussions.  Hopefully an administrator will intervene.
> If this continues, I will unsubscribe.
>
> 
> *From:* owner-scientific-linux-us...@listserv.fnal.gov
>  on behalf of Nico
> Kadel-Garcia 
> *Sent:* Tuesday, April 27, 2021 9:04 AM
> *To:* LaToya Anderson 
> *Cc:* Andrew C Aitchison ; Keith Lofstrom
> ; Mailing list for Scientific Linux users worldwide
> 
> *Subject:* [EXTERNAL] Re: Code bias video, watch it ASAP
> On Sun, Apr 18, 2021 at 7:24 AM LaToya Anderson
>  wrote:
>>
>> Data does not remove bias. And one can and should both read the article and 
>> watch the movie.
>>
>> STEM Academy Instructor
>
> Data rather than mere exposition helps prevent bias. How do you refute
> or counter unfair bias except with data?
>
> The movie is, itself, profoundly biased. It didn't explore at all why
> a public housing project might benefit from cameras on the door of a
> densely populated building with numerous poor, old, or unhealthy
> tenants. The movie was an icon of "Critical Theory", portraying the
> attempt to use science and engineering for social problems as a plot
> against the oppressed.
>
> I've lived in scary neighborhoods of London. London accepts and
> expects a degree of CCTV monitoring that is outrageous to Americans.
> Sadly, citizens can't *get* the videos when a crime occurs, and
> photographic evidence can be misused against the innocent. Been there,
> done that, watched a London parking cop frame the photos they took to
> document a parking ticket, really ticked him off when I very obviously
> took photos at angles that showed the car was, in fact, parked near a
> sign that gave permission and curb markings that matched.


Re: [EXTERNAL] Re: Code bias video, watch it ASAP

2021-04-27 Thread Yasha Karant

For those of us one this list who are ACM members, I quote:

https://www.acm.org/code-of-ethics


A computing professional should...
1.1 Contribute to society and to human well-being, acknowledging that 
all people are stakeholders in computing.


Similar statements exist in the ethics codes of other 
computing/informatics professional societies.


Despite the fact that others have indicated that this, or other
societal issues, are inappropriate for this list, and have threatened to 
unsubscribe, some discussion of this matter is appropriate and correct 
within the ACM code (and most other codes).  This list does not conform 
to the codes from the former Third Reich, former USSR, present PRC, 
etc., for which any discussion of societal failings of the in-power 
control group persons is prohibited and often punishable by the 
government controlled by the relevant in-power group.


On 4/27/21 9:49 AM, Queen, Steven Z. (GSFC-5910) wrote:
I don't think this list is an appropriate place for political 
discussions.  Hopefully an administrator will intervene.

If this continues, I will unsubscribe.


*From:* owner-scientific-linux-us...@listserv.fnal.gov 
 on behalf of Nico 
Kadel-Garcia 

*Sent:* Tuesday, April 27, 2021 9:04 AM
*To:* LaToya Anderson 
*Cc:* Andrew C Aitchison ; Keith Lofstrom 
; Mailing list for Scientific Linux users worldwide 


*Subject:* [EXTERNAL] Re: Code bias video, watch it ASAP
On Sun, Apr 18, 2021 at 7:24 AM LaToya Anderson
 wrote:


Data does not remove bias. And one can and should both read the article and 
watch the movie.

STEM Academy Instructor


Data rather than mere exposition helps prevent bias. How do you refute
or counter unfair bias except with data?

The movie is, itself, profoundly biased. It didn't explore at all why
a public housing project might benefit from cameras on the door of a
densely populated building with numerous poor, old, or unhealthy
tenants. The movie was an icon of "Critical Theory", portraying the
attempt to use science and engineering for social problems as a plot
against the oppressed.

I've lived in scary neighborhoods of London. London accepts and
expects a degree of CCTV monitoring that is outrageous to Americans.
Sadly, citizens can't *get* the videos when a crime occurs, and
photographic evidence can be misused against the innocent. Been there,
done that, watched a London parking cop frame the photos they took to
document a parking ticket, really ticked him off when I very obviously
took photos at angles that showed the car was, in fact, parked near a
sign that gave permission and curb markings that matched.


Re: [EXTERNAL] Re: Code bias video, watch it ASAP

2021-04-27 Thread Queen, Steven Z. (GSFC-5910)
I don't think this list is an appropriate place for political discussions.  
Hopefully an administrator will intervene.
If this continues, I will unsubscribe.


From: owner-scientific-linux-us...@listserv.fnal.gov 
 on behalf of Nico Kadel-Garcia 

Sent: Tuesday, April 27, 2021 9:04 AM
To: LaToya Anderson 
Cc: Andrew C Aitchison ; Keith Lofstrom 
; Mailing list for Scientific Linux users worldwide 

Subject: [EXTERNAL] Re: Code bias video, watch it ASAP

On Sun, Apr 18, 2021 at 7:24 AM LaToya Anderson
 wrote:
>
> Data does not remove bias. And one can and should both read the article and 
> watch the movie.
>
> STEM Academy Instructor

Data rather than mere exposition helps prevent bias. How do you refute
or counter unfair bias except with data?

The movie is, itself, profoundly biased. It didn't explore at all why
a public housing project might benefit from cameras on the door of a
densely populated building with numerous poor, old, or unhealthy
tenants. The movie was an icon of "Critical Theory", portraying the
attempt to use science and engineering for social problems as a plot
against the oppressed.

I've lived in scary neighborhoods of London. London accepts and
expects a degree of CCTV monitoring that is outrageous to Americans.
Sadly, citizens can't *get* the videos when a crime occurs, and
photographic evidence can be misused against the innocent. Been there,
done that, watched a London parking cop frame the photos they took to
document a parking ticket, really ticked him off when I very obviously
took photos at angles that showed the car was, in fact parked near a
sign that gave permission and curb markings that matched.



Re: Code bias video, watch it ASAP

2021-04-27 Thread ~Stack~

On 4/27/21 8:41 AM, LaToya Anderson wrote:
[snip]
Tell me, how many Black people are in this group? And what practices 
have been put into place to ensure you retain Black people? In other 
words, what has been done within this group to check bias to ensure that 
you have a diverse group of people working together to improve this OS?


As someone with minority heritage myself, I say it doesn't matter. This 
isn't the place to re-imagine history and play political victim games. 
Take that to social media. I don't subscribe to this list for political bs.


I *do* subscribe to this list as a source of technical information 
around an OS that I use, where I can be part of a community of people 
from all over the world who help others regardless of their beliefs or 
political views. The only colors on this list I see are the black text 
on a white background with blue links in my email.


There's a right time and right place for your conversation, and it isn't 
in this list.


As I've heard the saying: "the best way to lose friends is to talk 
politics". How about we just stay on list-topic about how much we 
appreciate Scientific Linux?


~S~


Re: Code bias video, watch it ASAP

2021-04-27 Thread LaToya Anderson
No it doesn't.

Until we all recognize that we all have bias, it will continue to be
embedded into the algorithms used to track humans, usually the poor and
disenfranchised who don't have the monetary or political power of those
who collectively decide what they will or will not allow in their
community.

It is abundantly clear that your bias against the poor has influenced your
worldview of this documentary. I challenge you to not only read her actual
paper, but to look at it from the historical context of how Black people
have been tested since we were kidnapped and forced into the US for chattel
slavery, how the police force began as a means to catch escaped slaves, how
Jim Crow was used to steal wealth while instilling fear in the lives of
Black people, and how, to this day, Black people still live within a
systemically racist system which allows for this ridiculous gaslighting
conversation because how dare a Black woman question the ethics of the
"great" and very very white computer scientists.

Tell me, how many Black people are in this group? And what practices have
been put into place to ensure you retain Black people? In other words, what
has been done within this group to check bias to ensure that you have a
diverse group of people working together to improve this OS?

I'll wait.

STEM Academy Instructor

On Tue, Apr 27, 2021, 9:05 AM Nico Kadel-Garcia  wrote:

> On Sun, Apr 18, 2021 at 7:24 AM LaToya Anderson
>  wrote:
> >
> > Data does not remove bias. And one can and should both read the article
> and watch the movie.
> >
> > STEM Academy Instructor
>
> Data rather than mere exposition helps prevent bias. How do you refute
> or counter unfair bias except with data?
>
> The movie is, itself, profoundly biased. It didn't explore at all why
> a public housing project might benefit from cameras on the door of a
> densely populated building with numerous poor, old, or unhealthy
> tenants. The movie was an icon of "Critical Theory", portraying the
> attempt to use science and engineering for social problems as a plot
> against the oppressed.
>
> I've lived in scary neighborhoods of London. London accepts and
> expects a degree of CCTV monitoring that is outrageous to Americans.
> Sadly, citizens can't *get* the videos when a crime occurs, and
> photographic evidence can be misused against the innocent. Been there,
> done that, watched a London parking cop frame the photos they took to
> document a parking ticket, really ticked him off when I very obviously
> took photos at angles that showed the car was, in fact parked near a
> sign that gave permission and curb markings that matched.
>


Re: Code bias video, watch it ASAP

2021-04-27 Thread Nico Kadel-Garcia
On Sun, Apr 18, 2021 at 7:24 AM LaToya Anderson
 wrote:
>
> Data does not remove bias. And one can and should both read the article and 
> watch the movie.
>
> STEM Academy Instructor

Data rather than mere exposition helps prevent bias. How do you refute
or counter unfair bias except with data?

The movie is, itself, profoundly biased. It didn't explore at all why
a public housing project might benefit from cameras on the door of a
densely populated building with numerous poor, old, or unhealthy
tenants. The movie was an icon of "Critical Theory", portraying the
attempt to use science and engineering for social problems as a plot
against the oppressed.

I've lived in scary neighborhoods of London. London accepts and
expects a degree of CCTV monitoring that is outrageous to Americans.
Sadly, citizens can't *get* the videos when a crime occurs, and
photographic evidence can be misused against the innocent. Been there,
done that, watched a London parking cop frame the photos they took to
document a parking ticket, really ticked him off when I very obviously
took photos at angles that showed the car was, in fact parked near a
sign that gave permission and curb markings that matched.


Re: Code bias video, watch it ASAP

2021-04-26 Thread Konstantin Olchanski
Thank you for highlighting this important issue. With recent New York Times
reports on image recognition software labeling US congressmen as monkeys and
criminal courts using "algorithms" for bail and sentencing decisions, I think
there is cause for severe concern.

All involved with software and computer systems should be aware of such things.

Given the general level of incompetence (see the Boeing 737-MAX airplane,
programmed to fly into the ground if one external sensor is hit by a bird)
and the coming era of robocops firing "only non-lethal" weapons on targets
identified "by algorithm", I think there is reason to worry.
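[Editor's note] The single-sensor point above can be sketched in a few lines. This is a hypothetical 2-out-of-3 voter, not Boeing's actual MCAS logic: taking the median of three redundant readings means one faulty sensor cannot dominate the control decision. The `FAULT_SPREAD` threshold is an assumed illustrative value.

```python
# Hypothetical 2-out-of-3 sensor voter (illustrative only; not any
# real avionics design). The median of three redundant readings is
# tolerant to a single wildly wrong sensor.
from statistics import median

FAULT_SPREAD = 5.0  # degrees; disagreement beyond this flags a fault (assumed)

def vote(readings):
    """Return (value, healthy) for three redundant sensor readings."""
    if len(readings) != 3:
        raise ValueError("expected three redundant readings")
    value = median(readings)  # outlier-tolerant consensus
    healthy = max(readings) - min(readings) <= FAULT_SPREAD
    return value, healthy

# One bad sensor (e.g. damaged by a bird strike) is outvoted:
value, healthy = vote([2.1, 2.3, 74.0])
# value == 2.3, healthy == False -> use the voted value, annunciate the fault
```

The design point is that a decision with safety consequences should never hinge on a single input; the voter both produces a usable value and flags that the redundancy has degraded.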


K.O.


On Sun, Apr 18, 2021 at 07:24:39AM -0400, LaToya Anderson wrote:
> Data does not remove bias. And one can and should both read the article and
> watch the movie.
> 
> STEM Academy Instructor
> 
> On Sun, Apr 18, 2021, 6:59 AM Nico Kadel-Garcia  wrote:
> 
> > On Sun, Apr 18, 2021 at 6:22 AM Andrew C Aitchison
> >  wrote:
> > >
> > > On Sun, 18 Apr 2021, Nico Kadel-Garcia wrote:
> > >
> > > > The movie is very strong on "feels", very poor indeed on data.
> > > >
> > > > A much better article, with far less "feels"
> > >   ^^
> > > Is that a deliberate example of the bias in the video ?
> >
> > No, it's an evaluation of the video and a pointer to a much more
> > succinct, more specific article.
> >

-- 
Konstantin Olchanski
Data Acquisition Systems: The Bytes Must Flow!
Email: olchansk-at-triumf-dot-ca
Snail mail: 4004 Wesbrook Mall, TRIUMF, Vancouver, B.C., V6T 2A3, Canada


Re: Code bias video, watch it ASAP

2021-04-18 Thread LaToya Anderson
Data does not remove bias. And one can and should both read the article and
watch the movie.

STEM Academy Instructor

On Sun, Apr 18, 2021, 6:59 AM Nico Kadel-Garcia  wrote:

> On Sun, Apr 18, 2021 at 6:22 AM Andrew C Aitchison
>  wrote:
> >
> > On Sun, 18 Apr 2021, Nico Kadel-Garcia wrote:
> >
> > > The movie is very strong on "feels", very poor indeed on data.
> > >
> > > A much better article, with far less "feels"
> >   ^^
> > Is that a deliberate example of the bias in the video ?
>
> No, it's an evaluation of the video and a pointer to a much more
> succinct, more specific article.
>


Re: Code bias video, watch it ASAP

2021-04-18 Thread Nico Kadel-Garcia
On Sun, Apr 18, 2021 at 6:22 AM Andrew C Aitchison
 wrote:
>
> On Sun, 18 Apr 2021, Nico Kadel-Garcia wrote:
>
> > The movie is very strong on "feels", very poor indeed on data.
> >
> > A much better article, with far less "feels"
>   ^^
> Is that a deliberate example of the bias in the video ?

No, it's an evaluation of the video and a pointer to a much more
succinct, more specific article.


Re: Code bias video, watch it ASAP

2021-04-18 Thread LaToya Anderson
Of course it is. That's usually how the conversation goes when bias in tech
is discussed rather than how to develop a solution.

STEM Academy Instructor

On Sun, Apr 18, 2021, 6:22 AM Andrew C Aitchison 
wrote:

> On Sun, 18 Apr 2021, Nico Kadel-Garcia wrote:
>
> > The movie is very strong on "feels", very poor indeed on data.
> >
> > A much better article, with far less "feels"
>   ^^
> Is that a deliberate example of the bias in the video ?
>
> --
> Andrew C. Aitchison Kendal, UK
> and...@aitchison.me.uk
>


Re: Code bias video, watch it ASAP

2021-04-18 Thread Andrew C Aitchison

On Sun, 18 Apr 2021, Nico Kadel-Garcia wrote:


The movie is very strong on "feels", very poor indeed on data.

A much better article, with far less "feels"

 ^^
Is that a deliberate example of the bias in the video ?

--
Andrew C. Aitchison Kendal, UK
and...@aitchison.me.uk


Re: Code bias video, watch it ASAP

2021-04-18 Thread Nico Kadel-Garcia
On Sun, Apr 18, 2021 at 4:32 AM Keith Lofstrom  wrote:
>
> This may be off topic, but it concerns the code some of us
> run on servers we build.

It's off topic. The URLs are hidden by "urldefense.proofpoint.com"
services. The title of the web page is "Coded Bias", from an
organization called "Women Making Movies". The movie is very strong on
"feels", very poor indeed on data.

A much better article, with far less "feels" is at
https://urldefense.proofpoint.com/v2/url?u=https-3A__www.csis.org_blogs_technology-2Dpolicy-2Dblog_problem-2Dbias-2Dfacial-2Drecognition&d=DwIBaQ&c=gRgGjJ3BkIsb5y6s49QqsA&r=gd8BzeSQcySVxr0gDWSEbN-P-pgDXkdyCtaMqdCgPPdW1cyL5RIpaIYrCn8C5x2A&m=7j9OwsECMBIooxDxXknoTsda1-2RW_ifSa4nvy0-Y3w&s=A1gWtnpStPhE_6h3bZRGgBB6PYL4vUkLZLK9k8Jb4SI&e=
 .


Code bias video, watch it ASAP

2021-04-18 Thread Keith Lofstrom
This may be off topic, but it concerns the code some of us
run on servers we build.

https://urldefense.proofpoint.com/v2/url?u=https-3A__www.wmm.com_virtual-2Dscreening-2Droom_coded-2Dbias-2Dwatch-2Dpage-2Dcomputer-2Dhistory-2Dmuseum_&d=DwIBAg&c=gRgGjJ3BkIsb5y6s49QqsA&r=gd8BzeSQcySVxr0gDWSEbN-P-pgDXkdyCtaMqdCgPPdW1cyL5RIpaIYrCn8C5x2A&m=g-Nr4yHvSMl4fP8CtBXZZoR7Zlih1hmQr25D0pYlBg4&s=gcXXaLIMSnWV3DPPjiPDEJ6KLhqwBYAguhm57REn5zg&e=
 
shortened:  
https://urldefense.proofpoint.com/v2/url?u=https-3A__tinyurl.com_CCOBIAS322&d=DwIBAg&c=gRgGjJ3BkIsb5y6s49QqsA&r=gd8BzeSQcySVxr0gDWSEbN-P-pgDXkdyCtaMqdCgPPdW1cyL5RIpaIYrCn8C5x2A&m=g-Nr4yHvSMl4fP8CtBXZZoR7Zlih1hmQr25D0pYlBg4&s=YD4u7FcFY93gehi1SpeZ4pxnIjCmH64ZXwYnQr7h_Ro&e=
 
password:   CCOBIAS322

White men design pattern recognition software for white men.

Others suffer - black men are harassed, incarcerated, and
killed because the face recognition software used by police
confuses them with black criminals.  Women are locked out
of their apartment buildings, or fail to get jobs.

This 85 minute video describes these and similar problems.

Much of the video focuses on statist "solutions", but we
might avoid that by demanding competence, transparency,
and COMPLETE testing, from our colleagues ... and NEVER
EVER using biased pattern recognition training sets.
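[Editor's note] The "COMPLETE testing" Keith calls for can be made concrete: report a classifier's error rate per demographic group rather than one aggregate number. This is a minimal sketch on made-up data; the group labels and results are hypothetical.

```python
# Minimal per-group error audit (illustrative; data is fabricated).
# An aggregate accuracy figure can hide that one group bears all the
# errors, which is exactly the failure mode biased training sets cause.
from collections import defaultdict

def per_group_error(samples):
    """samples: iterable of (group, predicted, actual) tuples."""
    errors, totals = defaultdict(int), defaultdict(int)
    for group, predicted, actual in samples:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

results = [
    ("group_a", 1, 1), ("group_a", 0, 0), ("group_a", 1, 1), ("group_a", 0, 0),
    ("group_b", 1, 0), ("group_b", 0, 0), ("group_b", 1, 0), ("group_b", 0, 0),
]
rates = per_group_error(results)
# rates == {"group_a": 0.0, "group_b": 0.5}: the aggregate 25% error
# rate conceals that group_b suffers all of the misclassifications.
```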

The video is only free until Monday April 19, so watch
it (and share it) ASAP.

Keith

-- 
Keith Lofstrom  kei...@keithl.com