I'm genuinely conflicted about it.

It occurs to me to wonder how the algorithms actually work - if I watch a video about conspiracy theories on YouTube, for example, am I then presented with a lot more conspiracy videos next time I visit? I suspect the answer is yes: I watched a video of Trump doing his YMCA dance (which apparently he does quite frequently at the end of his rallies), thinking of re-using it for satirical purposes, and now every time I go to YouTube it wants me to look at more videos of Trump dancing.

I think the algorithms are one of the most insidious and damaging aspects of Web 2.0 - instead of genuinely exploring the web and stumbling across new things, which I seem to remember we used to do in the early 2000s, we now find ourselves in a commercialised feedback loop which presents us, over and over again, with amplified (and monetised) versions of whatever beliefs and ideas and interests we brought to it in the first place. Perhaps there's some mileage in legislating against the algorithms.
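Just to make the feedback-loop point concrete, here's a toy sketch in Python of how a naive engagement-weighted recommender amplifies a single click - the topic names and weightings are entirely made up, and I'm not claiming this is how YouTube's actual system works:

import random
from collections import Counter

CATALOGUE = ["trump-dance", "conspiracy", "cooking", "music", "history"]

def recommend(watch_history, n=5):
    # Weight each topic by how often it's been watched, plus a small
    # base weight so unseen topics can still surface occasionally.
    counts = Counter(watch_history)
    weights = [1 + 10 * counts[topic] for topic in CATALOGUE]
    return random.choices(CATALOGUE, weights=weights, k=n)

history = ["trump-dance"]   # one click on a novelty video...
for _ in range(20):
    # ...and suppose the viewer tends to click the top recommendation
    history.append(recommend(history)[0])

print(Counter(history))     # that single click now dominates the feed

Run it a few times and "trump-dance" almost always swamps everything else - the loop feeds on itself precisely because yesterday's clicks set today's weights.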

Edward


On 08/01/2021 19:16, Alan Sondheim via NetBehaviour wrote:
I think some safeguards need to be put into place; if you look at the propaganda-machine work in Nazi Germany, it can do terrible harm. But in the U.S. under Reagan, the Fairness Doctrine was scrapped, which meant local news outlets of all sorts could be grabbed up by opinionated multinationals, and you get people like Rush Limbaugh spreading hatred unchallenged in rural areas - probably the biggest swath of territory in the country. That's where "these people" get their news, unchallenged. It's far-right-wing money. I also think hate speech might be covered more directly - one of the T-shirts at the riot said, in abbreviated form, "6 million is not enough." What do you do with that?

Best, Alan (mind you I've been censored on YouTube and elsewhere myself, I think unfairly, so you might make a counter-argument that it's all in the eye/ear of the beholder. It's an aporia.)

On Fri, Jan 8, 2021 at 2:07 PM Edward Picot via NetBehaviour <netbehaviour@lists.netbehaviour.org> wrote:

    What do people think - have we reached the point at which social media
    companies should be prosecuted for allowing hate-speech, incitements to
    violence, demonstrable untruths and conspiracy theories to be uploaded
    onto their sites?

    Should they be regarded as publishers, and therefore legally responsible
    for their content?

    I'm genuinely torn, but I think maybe we've now reached that point. I'd
    be very interested to hear what others think.

    Edward




--
=====================================================
directory http://www.alansondheim.org tel 718-813-3285
email sondheim ut panix.com, sondheim ut gmail.com
=====================================================



_______________________________________________
NetBehaviour mailing list
NetBehaviour@lists.netbehaviour.org
https://lists.netbehaviour.org/mailman/listinfo/netbehaviour
