A premise of the democratic process, the political vote, is that the People can choose the leaders who will best advance the nation's interests.
That notion rests on second-millennium assumptions that no longer apply.
For example, the assumption that, while the electorate's second-millennium information feed may have demonstrated partisan political bias, media spin was at least reality-based.
Pioneering digital forensics expert Hany Farid of the University of California, Berkeley provides an update in a discussion with PBS's Amna Nawaz:
how and why disinformation spreads online, and how to find reliable sources of information ...
HANY FARID:
The thing you have to understand about social media is, not only is it -- doesn't care about real, fake, true, lies.
In fact, it actually prefers algorithmically the spread of mis- and dis-information because that's what leads to user engagement.
So the algorithms have learned how to spread the most salacious, outrageous conspiratorial content because that's what the billions of people online click on.
... the lies spread much, much faster than the truth, which of course adds a whole 'nother complexity to the speed with which we have to respond and also the consequences for getting it wrong.
AMNA NAWAZ:
It's not just that people are spreading these things, because they do.
It's not a coincidence rage bait was the word of the year last year, right?
But it's that the algorithms actually prefer them over real information or real images.
Is that right?
HANY FARID:
That's 100 percent right.
And the reason, of course, is because the business model of social media, think X, Facebook, Instagram, TikTok, et cetera, is user engagement.
The more you click, the more ads we deliver, the more money they make.
And so the algorithms, they didn't set out to burn the place to the ground.
They didn't set out to do that.
It was learned.
And you could blame the social media giants for this, and I think we should.
But, at the end of the day, we're the ones clicking on those posts.
We are the ones teaching the machines that this is what we will engage with.
And so, yes, it's learned that, well, when the user clicks on this, give them more of this.
And rage bait works.
Clickbait works.
We click on it.
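The feedback loop Farid describes, click on something and the system serves you more of it, can be sketched as a toy engagement-trained ranker. This is purely illustrative: the class, topics, and scoring here are invented for the sketch and do not represent any platform's actual algorithm.

```python
# Toy sketch of an engagement-driven feed ranker (illustrative only;
# not any real platform's algorithm).
from collections import defaultdict

class ToyFeedRanker:
    def __init__(self):
        # learned engagement score per content topic; every topic starts equal
        self.scores = defaultdict(lambda: 1.0)

    def rank(self, posts):
        # surface the topics the user has historically clicked most, first
        return sorted(posts, key=lambda p: self.scores[p["topic"]], reverse=True)

    def record_click(self, post):
        # every click teaches the ranker to show more of the same topic
        self.scores[post["topic"]] += 1.0

feed = ToyFeedRanker()
posts = [{"topic": "local-news"}, {"topic": "rage-bait"}]

# the user clicks rage-bait three times...
for _ in range(3):
    feed.record_click({"topic": "rage-bait"})

# ...and the ranker now puts rage-bait first, regardless of truth value
ranked = feed.rank(posts)
```

Note that nothing in the loop checks whether content is true or false; the only signal is the click, which is the point Farid is making.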
And so we have to return to our trusted sources.
We have to understand that people like you [journalists] are really -- work really hard to figure out what is going on in the world.
They talk to people like me to understand it and to bring that information to you. ...
... people have to understand that social media is not designed as a reliable source of information.
It's not.
It never has, and it never will.
AMNA NAWAZ:
The majority of Americans do get their news and information from social media.
That's where we are right now.
What you're calling for is an enormous cultural shift, the likes of which we are definitely not trending towards, right?
So, just culturally, short of even policy guidelines or companies and CEOs completely changing how they do their work, how does that kind of thing start?
HANY FARID:
Yes.
So, first, I'm not naive about this is a massive cultural -- not just here in the United States, but globally. ...
... Now, the good news is, I think there's some -- at least a glimmer of hope on the horizon.
So, if you look, for example, today, there are massive litigations happening around social media and addictive properties and the impact on children in a way that I think 10 years ago I didn't think we would see these cases.
And so there is movement.
Australia has banned social media for kids under the age of 16.
The E.U. and the U.K. and other parts of the world are considering similar legislation.
I think there is an awakening that, while there are positive aspects to these technologies, to social media, it is clear the harms are unambiguous.
It will take a lot of conversations.
It will take a lot of serious people thinking about this in a serious way.
And it will take fighting back against massive, massive global corporate interests.
But I don't know what the other option is.
AMNA NAWAZ:
And you can watch that full conversation and all the episodes of "Settle In" on our YouTube channel or wherever you get your podcasts.
On "Settle In," Hany Farid and Amna Nawaz discuss spotting manipulated images (www.pbs.org).
Farid warns us that smartphones may not be intrinsically evil.
But the "free" cyber world is financed substantially by marketers competing for your attention (hence "clickbait") and your $money.
Farid acknowledges that the endless scroll may not have originated with predatory intent.
But smartphone users and social media users have taught these systems what works best FOR THESE SOCIAL MEDIA SPONSORS.
Mark Zuckerberg is a $Billionaire.
So what is your favorite clickbait? In what ways do you reinforce, reward this status quo? Marshall McLuhan?