Social networks, like those developed by Facebook and Google, have tremendous power to allow people around the world to connect and share. I’ve been talking about this for way more than a decade, at times acting as an enthusiastic cheerleader for the positive effects on business and society. However, the increasing reliance of social networking companies on algorithms to determine what we see in our feeds has become a tremendous problem.
I’m increasingly pessimistic about the social networks’ power over our lives.
In particular, I believe the Facebook algorithm has gone way beyond simply showing us what we want to see to become a polarizing, negative force in the world. Sometimes I even find myself using the word “evil” when I think about the Facebook algorithm.
Going into 2021, we all need to educate ourselves on what’s happening and take a stand on what we feel about this important issue.
How AI Algorithms Power Your Netflix Feed
Many people push back on the idea that the algorithms from companies like Facebook can be evil. “What’s so bad, David?” they ask.
Artificial intelligence algorithms are complex math formulas applied to billions of anonymized data points from hundreds of millions of people. Sounds innocuous, right?
However, when the goal of a social network such as Facebook is to sell ads and make money, algorithms aren’t tuned to show you what you want; they are tuned to make money for shareholders.
It was with Netflix that I first got wise to, and subsequently annoyed by, algorithms.
At first, I thought the Netflix user interface was magical, showing me titles that sounded great which I promptly watched. I remember this initially being a positive for me as I binged on rock documentaries when I first subscribed.
Here’s what Netflix says about how they do this: “Our business is a subscription service model that offers personalized recommendations, to help you find shows and movies of interest to you. To do this we have created a proprietary, complex recommendations system. Whenever you access the Netflix service, our recommendations system strives to help you find a show or movie to enjoy with minimal effort.”
However, in the years since I first subscribed, the Netflix recommendation engine has become a negative for me. And it cannot be turned off.
Now, I find it super hard to browse titles without the recommendation engine getting in the way. I have eclectic tastes! Yes, I like rock docs, but not that much! I enjoy “The Crown” but that doesn’t mean I want to watch every British historical drama that exists!
These days, I find it too difficult to make the Netflix user interface show me movies and series outside of the types I have already watched.
With a service like Netflix, the AI algorithm is frustrating.
However, when applied to newsfeeds like Facebook’s, AI algorithms become way more problematic.
Facebook AI Algorithms And News
For centuries, people have been getting their news via human curation. Humans choose what news goes onto the front page of a printed newspaper or the pages of a print magazine. Humans figure out what news belongs on the radio and on television.
The same is true for online news. I choose what to post on this blog, and human editors choose what to put on the homepage of a news site.
Early in my career, I worked for six years at Knight-Ridder, at the time the second largest newspaper company in the U.S. I was Asia Marketing Director, in charge of developing financial markets news and data products and services to be sold worldwide.
During my time working at Knight-Ridder, I saw the care that went into collecting and distributing news and the importance of the dedicated reporters and editors. There were experts in their field making decisions about what customers would see.
I remember conversations with Lewis M. Simons who sat near me in the Knight-Ridder Tokyo office. Prior to my joining the company, he received the 1986 Pulitzer Prize in International Reporting for exposing the Marcos billions looted from the Philippines. Powerful work. Could an algorithm have uncovered this story? Of course not.
The way stories are selected to appear for individual news consumers changed dramatically with the AI-powered news feed from platforms like Facebook and with similar offerings from other companies, such as Google’s YouTube and Google News.
Getting news from social media sites is an increasingly common experience and the Facebook news feed algorithm helps determine what more than two billion people see every day.
According to a report from The Pew Research Center, more than half of U.S. adults (55%) get news from social media often or sometimes.
Facebook is far and away the social media site that Americans use most commonly for news. About half (52%) of all U.S. adults get news there. The next most popular social media site for news is YouTube (28% of adults get news there), followed by Twitter (17%) and Instagram (14%). A number of other social media platforms (including LinkedIn, Reddit and Snapchat) have smaller news audiences.
The problem is that the Facebook AI algorithms work in much the same way as the frustrating Netflix one I described above.
If somebody clicks on a sensational headline once because they are curious, the feed learns a little bit from that interaction and serves up similar stories. If that new content is also viewed, then the system serves up more and more similar content.
The algorithm may then begin to serve up false stories and conspiracy theories based on those clicks. Just like the way that Netflix amplifies the kinds of movies and shows we watch, Facebook shows us the same kind of news we click on.
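To make this feedback loop concrete, here is a deliberately simplified sketch in Python. It is an illustration of the ranking dynamic described above, not Facebook’s actual system; the topics, story inventory, and scoring scheme are all invented for the example.

```python
# Toy model of an engagement-optimized feed: every click on a topic
# raises that topic's score, and the feed ranks stories by score.

def build_feed(inventory, scores, size=5):
    """Rank candidate stories by the score of their topic."""
    ranked = sorted(inventory, key=lambda story: scores[story["topic"]],
                    reverse=True)
    return ranked[:size]

def record_click(scores, story):
    """Each click reinforces the clicked story's topic."""
    scores[story["topic"]] += 1

# A hypothetical pool of stories, five per topic.
inventory = (
    [{"topic": "local news", "headline": f"Local story {i}"} for i in range(5)]
    + [{"topic": "science", "headline": f"Science story {i}"} for i in range(5)]
    + [{"topic": "conspiracy", "headline": f"Sensational claim {i}"} for i in range(5)]
)

# All topics start out equally weighted.
scores = {"local news": 1, "science": 1, "conspiracy": 1}

# One curious click on a sensational headline...
record_click(scores, {"topic": "conspiracy"})

# ...and the next feed is already dominated by that topic.
feed = build_feed(inventory, scores)
print([story["topic"] for story in feed])  # all five slots go to "conspiracy"
```

A real system uses far more signals and far more sophisticated models, but the core incentive is the same: whatever gets clicked gets amplified, and there is no counterweight asking whether the amplified content is true.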
I notice this myself. When I am logged into Google while using Google News, the algorithms show me many more stories similar to ones I have clicked on in the past. As well, the Google News app tends to show me stories from sources I have viewed before rather than showing me the most popular story overall.
These “features” tend to mean that people get exposed to the news and the media companies that they have already used.
It’s a huge problem when everybody gets their news this way.
I understand what’s happening because I deeply understand how news content is created, but the average news consumer does not. I’ve solved this by simply using Google News anonymously.
AI algorithms from Facebook, Google and the like amplify false and misleading information, polarizing content, and conspiracy theories because it’s just the computer systems doing what they’ve been trained to do.
It’s in the interests of social networks like Facebook to get us to stay on the site as long as possible so we can be fed advertising, which is how they make money.
I find it interesting that Facebook employees, among the most well-paid workers on the planet, are increasingly worried about this trend too and are beginning to speak out.
According to BuzzFeed News, Facebook recently released the results of an internal survey that revealed a stark decline in employee confidence over the past six months. Its semi-annual “Pulse Survey,” taken by more than 49,000 Facebook employees over two weeks in October 2020, showed that only 51% of respondents believed Facebook was having a positive impact on the world, down from 74% in the company's last survey in May 2020. Wow!
The Social Dilemma
I’m certainly not the only one talking about these ideas. Earlier this year, a fabulous film called The Social Dilemma was released.
The Social Dilemma features the voices of technologists, researchers and activists working to align technology with the interests of humanity.
“The world has long recognized the positive applications of social media, from its role in empowering protesters to speak out against oppression during the Arab Spring uprisings almost a decade ago, to serving an instrumental role in fighting for equity and justice today. And in 2020, during an astonishing global pandemic, social media has become our lifeline to stay in touch with loved ones, as well as proving to be an asset for mobilizing civil rights protests. However, the system that connects us also invisibly controls us. The collective lack of understanding about how these platforms actually operate has led to hidden and often harmful consequences to society—consequences that are becoming more and more evident over time, and consequences that, the subjects in The Social Dilemma suggest, are an existential threat to humanity.”
If you haven’t seen it yet, check out the trailer and make time to watch the entire film.
As I was watching the film (yep, on Netflix), they mentioned the old trope “If you aren’t paying for a product like Facebook, then you are the product.” I had been thinking the same thing.
Then, as if on cue, Jaron Lanier said on screen what is my favorite quote in the film, a slight but important tweak on the old trope: “It’s the gradual, slight, imperceptible change in your own behavior and perception that is the product.”
Hang On! Does Facebook Change Our Behavior?
Yes. I and many other people believe Facebook does change our behavior.
Many people go deep down rat holes of news on Facebook and emerge as passionate believers in conspiracy theories.
Facebook commented on the content of the film in a post titled What ‘The Social Dilemma’ Gets Wrong.
“Rather than offer a nuanced look at technology, it gives a distorted view of how social media platforms work to create a convenient scapegoat for what are difficult and complex societal problems. The film’s creators do not include insights from those currently working at the companies or any experts that take a different view to the narrative put forward by the film. They also don’t acknowledge—critically or otherwise—the efforts already taken by companies to address many of the issues they raise. Instead, they rely on commentary from those who haven’t been on the inside for many years.”
The post goes on to detail a seven-point rebuttal to the ideas in the film.
Racial And Gender Bias In Marketing Introduced By AI Algorithms
It’s not just news that is affected by AI algorithms. These computer programs also have the potential to magnify the biases that marketers introduce or that already exist in the applications that we marketers use. In particular, racial and gender bias are problems, a topic I covered earlier this year.
For example, a University of Washington study looked at the images surfaced during searches in online image catalogs. The study found that only 11 percent of the top image results for “CEO” showed women, whereas women were 27 percent of U.S. CEOs at the time. Similarly, 25 percent of the people depicted in image search results for “authors” were women, compared with 56 percent of actual U.S. authors.
“In short, we are creating the perfect storm against persons of color and other underrepresented populations,” says Miriam Vogel, President and CEO of EqualAI, a nonprofit organization and movement focused on illuminating and reducing unconscious bias in the development and use of artificial intelligence.
My 2021 Digital Marketing Prediction: A Coming Backlash Against Social Media Algorithms
The problems I outline here are massive, and I am just scratching the surface in this post. As the AI programs learn more and more about each of the more than two billion of us who use them, the problem will only get worse.
The potential for backlash could be harmful to all of us who use social networks and who market using these services.
Yes, Facebook and the others are public companies chartered to make money for their investors. I get that. I’m certainly not advocating that they be shut down.
But some people are talking about that. A danger is that the government could try to sort things out in a way that makes things worse, such as banning social networks, as President Trump tried to do with TikTok.
Companies like Facebook and the nearly 50,000 people who work there are also a part of our society and should be considering the tradeoff between billions in profit vs. the negative effects their products have on all of us.
In 2021, it will take a coordinated effort by all of us, including marketers, corporate executives, government officials, and members of the media to figure out how to handle the situation.
Here are a few things you can do right now:
- Educate yourself on how the algorithms work. One of the easiest ways to do this is to log into Google News and use it several times a day for a few weeks. Each time you log in, click on stories that interest you. Then after a few weeks, see what the “Recommendations based on your interests” section looks like. Then, open a different browser with no personalization such as Firefox using a “Private Window” and compare the Google News feed without personalization to what you are seeing when logged in.
- Talk about these ideas with your friends, colleagues, and family members. If you’ve read this far, it means you’ve found this blog post and you are clearly interested in the topic. However, most people are unaware of how these systems work. Talk it over. Watch The Social Dilemma with your family, or have your colleagues all watch it and then hold a Zoom meeting to discuss the ideas.
- Be aware of racial, gender, and sexual orientation bias in your organization's marketing. Check out the EqualAI organization and consider joining or donating.
- Speak out when you see something that’s out of place.
- As marketers spending money on advertising with the social networks, we are in a unique position because we are the ones generating their revenue. While it wouldn't be easy to do, a boycott or something similar by marketers could be a dramatic way to get the world's attention. If public companies, for example, were to insert statements of protest into their annual reports describing how they have cut back on Facebook advertising, that would be powerful indeed.
We all have a role to play.