Bias in a classifier AI functions somewhat like a glitch in a media encoding
system: until someone hits the configuration that reveals it, we don't see
it. The bias in reading emotional states is compelling, but what also gets
my attention in these images is the determination of gender. Just what is
it that flags a face as male or female, for an AI, and by extension, for a
culture of images? And beyond that, what are the cultural determinants
that require individuals to differentiate their appearance, or face
consequences for not doing so?

Provocative work.

// Paul


On Fri, May 5, 2023 at 2:18 AM Amy Alexander <stu...@plagiarist.org> wrote:

> Tickled to announce my still image work, Deep Hysteria!
> "AI" meets human gender perceptions.
> https://amy-alexander.com/projects/deep-hysteria/
> (More at: https://medium.com/@amyjalexander/deep-hysteria-46ff1650f502  )
> Currently exhibiting online, soon to be available in print form for
> museums, galleries, and other repositories of vertical surfaces!
>
> Also the B-Side: "I asked AI to fix the world"
> https://medium.com/@amyjalexander/i-asked-an-ai-for-c24e3cab856a
>
>


-- 
-----   |(*,+,#,=)(#,=,*,+)(=,#,+,*)(+,*,=,#)|   ---
http://paulhertz.net/
_______________________________________________
NetBehaviour mailing list
NetBehaviour@lists.netbehaviour.org
https://lists.netbehaviour.org/mailman/listinfo/netbehaviour
