*AIs should have the same ethical protections as animals*

*John Basl is assistant professor of philosophy at Northeastern University 
in Boston*

https://aeon.co/ideas/ais-should-have-the-same-ethical-protections-as-animals

...

A puzzle and a difficulty arise here because the scientific study of 
consciousness has not reached a consensus about what consciousness is, and 
how we can tell whether or not it is present. On some views – ‘liberal’ 
views – consciousness requires nothing but a certain type of 
well-organised information-processing, such as a flexible informational 
model of the system in relation to objects in its environment, with guided 
attentional capacities and long-term action-planning. We might be on the 
verge of creating such systems already. On other views – ‘conservative’ 
views – consciousness might require very specific biological features, such 
as a brain very much like a mammal brain in its low-level structural 
details: in which case we are nowhere near creating artificial 
consciousness.

It is unclear which type of view is correct, or whether some other 
explanation will prevail in the end. However, if a liberal view is correct, 
we might soon be creating many subhuman AIs who will deserve ethical 
protection. Therein lies the moral risk.

Discussions of ‘AI risk’ normally focus on the risks that new AI 
technologies might pose to us humans, such as taking over the world and 
destroying us, or at least gumming up our banking system. Much less 
discussed is the ethical risk we pose to the AIs, through our possible 
mistreatment of them.

...

 

My 'conservative' view: information processing alone does not achieve 
experience (consciousness) processing.

https://codicalist.wordpress.com/2018/10/14/experience-processing/


-
@philipthrift <https://twitter.com/philipthrift>
