Re: [NetBehaviour] Free speech

2021-01-10 Thread Max Herman via NetBehaviour

Hi Ruth,

Great points.  The algorithms, I think, are not so much designed to do 
particular things as selected based on their results.  Which algorithms keep 
people looking, keep them "in session"?  There is a supposition that such 
algorithms are "finding the user what they want," but what if the user really 
wants to take a break and go for a walk, yet cannot because an algorithm is 
manipulating them?
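
A toy sketch of what "selected based on their results" can mean in practice. 
Every name and number here is invented for illustration; this is not any 
platform's actual code:

```python
import random

# Hypothetical illustration: the platform doesn't hand-design a manipulative
# algorithm; it A/B-tests variants and keeps whichever scores best on an
# engagement proxy such as session time. All variant names and numbers
# are made up.

def simulate_session_minutes(variant):
    """Stand-in for observing one real user session under a given variant."""
    base = {"chronological": 12.0, "similar_items": 18.0, "more_extreme": 25.0}
    return base[variant] + random.gauss(0, 2)

def select_variant(variants, trials=200):
    """Naive A/B selection: keep the variant with the highest mean session time."""
    means = {
        v: sum(simulate_session_minutes(v) for _ in range(trials)) / trials
        for v in variants
    }
    return max(means, key=means.get)

winner = select_variant(["chronological", "similar_items", "more_extreme"])
# Nothing in this loop asks whether the user *wanted* the extra session time;
# the "more extreme" variant wins simply because it scores best on the proxy.
```

The point of the sketch is that nobody has to intend manipulation: optimizing 
a proxy metric selects for whatever produces it.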

Health and wellness are relevant here.  Peter Sterling makes a great argument 
that humans (and most life) evolved to forage and socialize based on small, 
unexpected dopamine rewards.  We walk through meadows with our friends until 
someone finds some berries and calls out, "Hey, I found some berries," and we 
all settle in for a lovely berry lunch.  The next day we repeat, or maybe look 
for apples or try to catch a fish.  We tell jokes and stories.

Yet with industrialization, and now digital technology, our dopamine "cycle" is 
all out of whack.  Lacking time in nature, with less social safety and more 
environmental stress, we look to "large dose" sources of dopamine: instead of 
a walk in the woods ending in a fine berry lunch, we obtain large amounts of 
processed food.  And this goes for behaviors other than eating too, shaping 
all of our behavior paths - visual stimulation, social contact, verbal 
exchange, chemicals, you name it.

Sterling is a student of medicine and a neuroscientist, and he argues that for 
health (at all levels) the balance needs to be restored.  In other words, he 
says that surgery and pharma are not the cure.  He calls for preventive 
wellness like diet and exercise, but also for cultural practices, which he 
says are essential.  He doesn't claim to know what those practices are, 
though, and says that artists need to help.

As to big data, I think we will all need to come to terms with the deleterious 
health effects which disordered tech and media can cause.  One phrase I think 
of for this is "social is the new smoking."  Can tech people design 
financially viable algorithms that respect the health of the user?  I would 
imagine yes, and they will do so faster once consumers catch on and demand it.

With great problems come great opportunities, I guess!

Very best,

Max

https://elifesciences.org/articles/36133

"Predictive regulation and human design"



From: NetBehaviour on behalf of Ruth Catlow via NetBehaviour
Sent: Sunday, January 10, 2021 12:15 PM
To: NetBehaviour for networked distributed creativity
Cc: Ruth Catlow
Subject: Re: [NetBehaviour] Free speech

Hmmm,

In addition to the obvious dangers of building global communication systems for 
the profit of platform owners, (whatever good design is - it must prioritise 
delivering profit to shareholders) the problem seems to be that networked 
algorithms have emergent properties.

I saw Tristan Harris, ex-Google designer and now heading up the anti-Google 
"designer for humanity" race (yes, I have reservations), showing research about 
how social media algorithms will always push people to look next at the more 
extreme version of the thing they just saw... which results, for example, in 
directing depressed teenagers from legitimate mental health support communities 
to suicide cults.

I am quite relieved that Twitter's terms of service mean that Donald Trump can 
be silenced. But it doesn't say much for the state of American democracy that 
its political institutions are unable to deal with such obvious danger.

Akkk!


On Fri, Jan 8, 2021 at 8:09 PM Edward Picot via NetBehaviour 
<netbehaviour@lists.netbehaviour.org> wrote:
I'm genuinely conflicted about it.

It occurs to me to wonder how the algorithms work - if I look at a video about 
conspiracy theories on YouTube, for example, am I then presented with a lot 
more videos about conspiracy theories next time I visit? I think the answer to 
this is probably yes, because I looked at a video of Trump doing his YMCA dance 
(which apparently he does quite frequently at the end of his rallies), thinking 
about re-using it for satirical purposes, and now every time I go to YouTube it 
wants me to look at more videos of Trump dancing.

I think the algorithms are one of the most insidious and damaging aspects of 
Web 2 - instead of genuinely exploring the web and coming across new things, 
which I seem to remember we used to do in the early 2000s, we now find 
ourselves in a commercialised feedback-loop which presents us over and over 
again with amplified (and monetized) versions of whatever beliefs and ideas and 
interests we had in the first place. Perhaps there's some mileage in 
legislating against the algorithms.
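
One crude way to model the feedback loop just described - a toy sketch, not 
any real recommender; the topics and counts are invented:

```python
from collections import Counter

# Crude model of the commercialised feedback loop: the feed always serves
# whatever topic the user has clicked most so far, so a small early lead
# (even a one-off satirical viewing) compounds round after round.

def recommend(history):
    """Serve the single most-clicked topic so far."""
    return history.most_common(1)[0][0]

# A one-off curiosity click puts "trump_dance" slightly ahead.
history = Counter({"gardening": 1, "trump_dance": 2})

for _ in range(10):
    topic = recommend(history)
    history[topic] += 1  # the user clicks what is put in front of them

# After ten rounds: trump_dance = 12, gardening still 1 - the early lead
# has been amplified into the whole feed.
```

The loop never revisits whether the initial signal reflected a real interest; 
it only amplifies it.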

Edward


On 08/01/2021 19:16, Alan Sondheim via NetBehaviour wrote:
I think some safeguards need to be put into place; if you look at the 
propaganda-machine-work in Nazi Germany, it can do terrible harm. But in the 
U.S. under Reagan, the fairness doctrine was scrapped, which meant local news 
outlets of all sorts could be grabbed up by opinionated multinationals, and 
you get people like Rush Limbaugh spreading hatred unchallenged in rural 
areas - probably the biggest swath of territory in the country. That's where 
"these people" get their news, unchallenged. It's far-right-wing money. I also 
think hate speech might be covered more directly - one of the t-shirts at the 
riot said, in abbreviated form, "6 million is not enough." What do you do 
with that?

Best, Alan (mind you, I've been censored on YouTube and elsewhere myself, I 
think unfairly, so you might make a counter-argument that it's all in the 
eye/ear of the beholder. It's an aporia.)
Re: [NetBehaviour] Free speech

2021-01-08 Thread Alan Sondheim via NetBehaviour
It's also related to "filter bubbles" - trying to push content that we
appear to be interested in, and not allowing problematic or contradictory
materials in. And we can't turn this off at all. It's a push phenomenology,
not a pull one; it resides with the corporate and god knows what else, not
with us. There are times I think it's dangerous to click on X, because it
will suddenly "blossom" in the feed; the only way around this would be to
have multiple machines and multiple accounts...

All of us need to deal with this!

Best, Alan

-- 
=
directory http://www.alansondheim.org   tel 718-813-3285
email sondheim at panix.com, sondheim at gmail.com
=
___
NetBehaviour mailing list
NetBehaviour@lists.netbehaviour.org
https://lists.netbehaviour.org/mailman/listinfo/netbehaviour

