Facebook AI Algorithm: One Of The Most Destructive Technologies Ever Invented

I write about strategies to turn fans into customers and customers into fans. I also share ways to use real-time strategies to spread ideas, influence minds, and build business.


The Facebook artificial intelligence-powered algorithm is designed to pull users into the content that interests them the most. The technology is tuned to serve up more and more of whatever you click on, be that yoga, camping, Manchester United, or K-pop. That sounds great, right?

However, the Facebook algorithm also leads tens of millions of its 2.7 billion global users into an abyss of misinformation, a quagmire of lies, and a quicksand of conspiracy theories.

In early December, I wrote a post titled 2021 Digital Marketing Prediction: Backlash Against Social Media Algorithms. In it, I outlined the increasing reliance of social networking companies on algorithms - complex math formulas applied to billions of anonymized data points from hundreds of millions of people - to determine what we see in our feeds. I said then that I’m increasingly pessimistic about the social networks’ power over our lives.

The post generated a strong reaction from readers, and I’ve been giving the ideas I wrote about a lot of additional thought.

Then, in the wake of the events of January 6, I decided to write this follow-up post.


Facebook has become the primary news source in the United States

Facebook started out as a happy place to connect with your friends and contacts from school and work. Over the past decade, as Facebook became a public company with investors, executives have been motivated by the profit that goes with selling advertising services to companies that want to reach their users.

Since then, Facebook has morphed into a company relying on one of the most destructive technologies humans have ever created, directly responsible for wrecked lives and our disrupted political system.

While other social networks are also powered by algorithms, Facebook is by far the biggest problem because of its sheer size, with nearly half the planet’s adult population on the platform.

According to a report from The Pew Research Center, more than half (52%) of all U.S. adults get news on Facebook.

Think about that for a moment!

US census data pegs the total population at 328 million in 2019, with 22 percent under the age of 18. That means that of the roughly 256 million American adults, about 133 million get news from Facebook. That is far more than the equivalent figure for any other social network, such as Twitter, which is why I am so focused on the destructive nature of Facebook.
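The arithmetic behind those figures can be checked with a quick back-of-the-envelope calculation (the variable names are mine; the inputs are the census and Pew numbers cited above):

```python
# Back-of-the-envelope check of the figures above.
# Inputs: 2019 US census estimate and the Pew Research Center share.
total_population_m = 328       # US population, in millions (2019)
share_under_18 = 0.22          # census share of the population under age 18
facebook_news_share = 0.52     # Pew: share of US adults who get news on Facebook

adults_m = total_population_m * (1 - share_under_18)
facebook_news_m = adults_m * facebook_news_share

print(round(adults_m))         # ~256 million US adults
print(round(facebook_news_m))  # ~133 million adults getting news on Facebook
```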

Put another way, 133 million people get their daily news served up by Facebook based on whatever subjects (often including misinformation and conspiracy theories) each user begins to show interest in.

Contrast the number of people who get news from Facebook with those who get news from mainstream media. Fox News, which finished 2020 as the most-watched cable news channel in history, averaged only 3.6 million daily viewers, a tiny fraction of the number who get news on Facebook. The top newspaper in the United States by circulation, USA Today, had just 1.6 million average weekly paid readers.


Facebook is built to get you to stay on the site longer

As a publicly traded company that makes money by selling ads, it is in the best interest of Facebook’s shareholders for the company’s technology to be tuned to keep you on the site as long as possible. When you are on Facebook, Facebook makes money.

Whatever interests you, based on what you search for and click on, is what the Facebook network learns you want to see more of. And, like a drug, the technology will keep giving you more and more of it.

Are you a bird watcher who joins a few birder groups, clicks “like” on the cute short-eared owl photo from your friend, searches on nearby bird sanctuaries, and shares a video of a majestic egret in flight? Bingo! Facebook will show you more and more bird content.

Just like the way that Netflix amplifies the kinds of movies and shows we watch, prompting us to watch more of the same, Facebook shows us the same kind of information in our news feed that we click on, encouraging us to view more.

If somebody clicks on a sensational headline once because they are curious, the feed learns a little bit from that interaction and serves up similar stories. If that new content is also viewed, then the system serves up more and more similar content.

That’s why it is so easy for AI algorithms from Facebook to amplify false and misleading information, polarizing content, and conspiracy theories - it’s what the systems have been trained to do!
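The self-reinforcing loop described above can be illustrated with a toy model (hypothetical code of my own, not Facebook's actual system), in which every click on a topic increases that topic's weight in future feed selections:

```python
import random

# Toy model of an engagement-driven feed (hypothetical; not Facebook's code).
# Each click on a topic raises that topic's score, so the feed keeps
# serving more of whatever the user has already engaged with.

TOPICS = ["birding", "sports", "conspiracy"]
scores = {topic: 1.0 for topic in TOPICS}  # start with no preference

def pick_story(scores):
    """Sample the next story in proportion to each topic's engagement score."""
    topics, weights = zip(*scores.items())
    return random.choices(topics, weights=weights)[0]

def register_click(scores, topic):
    """A click teaches the system to show more of that topic."""
    scores[topic] *= 1.5

# One curious click on a sensational story...
register_click(scores, "conspiracy")

# ...and from then on, each similar story shown and clicked
# makes the feed even more likely to serve that topic again.
for _ in range(10):
    if pick_story(scores) == "conspiracy":
        register_click(scores, "conspiracy")

# The clicked topic now dominates the feed's weighting;
# the topics that were never clicked stay at their starting score.
print(scores)
```

The model is deliberately crude, but it captures the core dynamic: engagement feeds ranking, and ranking feeds engagement, with no term in the loop that asks whether the content is true.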


Facebook serves up news based on interest, not choice

Many people talk about partisan news outlets as a major problem driving Americans apart.

Yes, the ideological focus of MSNBC is different from that of Fox News, and The New York Times leans in a different direction from The Wall Street Journal.

However, there is a massive difference between choosing to get your news from a cable TV network or a newspaper vs. having the news appear on your Facebook feed. The element of choice is removed with Facebook because the news comes to you based on what you clicked on in the past rather than what you choose to click on now.

This becomes a huge problem for those who are curious enough to dip their toes into a conspiracy theory.

Facebook traps people.

Imagine you “like” a story sent by a dear friend about how the Apollo lunar landings were faked, perhaps not even reading the story. Later, you get another link from Facebook on the theme of faked moon landings and it makes you wonder… Maybe you then search Facebook for more stories about the Apollo missions being faked because you are curious. You’re in! Facebook has now trapped you. You will get more and more content based on what you are beginning to (falsely) believe to be true.

As soon as people start seeing lots and lots of similar false stories appear in their Facebook newsfeed, they begin to “know this is true because everybody says so.”

An aside. I’ve spoken with several of the brave men who walked on the moon about the idea that what they accomplished was faked and wanted to share a few of the more memorable ways they push back on conspiracy theorists.

I co-wrote a book called Marketing the Moon with my friend Rich Jurek, and Gene Cernan, the last man to walk on the moon, wrote the foreword. Gene told us: “The Russians knew we did it. If we had faked it, they would have called us out.”

And I clearly remember Charlie Duke, the tenth man to walk on the moon, saying to me: “If we faked the moon landings, why would we do it six times?”


Facebook banning particular types of content and certain people helps but is not the best solution

In the past few weeks, Facebook has made a few good decisions about the users and the content on its site. However, these moves are designed to make it look like Facebook is “doing something about the problem.” I believe this is an attempt by Facebook to deflect attention from the real problem: the algorithms.

For example, on January 11, in a news release titled Our Preparations Ahead of Inauguration Day, Facebook announced they “are banning ads that promote weapon accessories and protective equipment in the US at least through January 22, out of an abundance of caution. We already prohibit ads for weapons, ammunition, and weapon enhancements like silencers. But we will now also prohibit ads for accessories such as gun safes, vests, and gun holsters in the US.”

Facebook also banned tens of thousands of individual users and groups from the platform because of violations of the Facebook terms of service.

The whack-a-mole process of simply looking for the worst offenders certainly helps, but it doesn’t address the way that the Facebook systems actually work and what billions of users see each day.


With awesome power comes great responsibility

Because of the nature of Facebook’s product, the answers that have been floated by politicians are not going to work.

Many are advocating breaking up the company by spinning off Facebook-owned services like Instagram and WhatsApp into separate companies, in the same way that US telecom giant AT&T was broken up in 1984. However, breaking up Facebook would do little to tame the problematic Facebook AI algorithm.

Facebook needs to dig into its computer technology and alter the math so false and misleading stories are not served up on people’s newsfeeds in the way that they are now. This will result in Facebook making less money on ads but this is what is required.

I doubt they will take this action unless forced by the government, their advertisers, or their employees.

For now, watch Facebook continue to focus on the small things like banning ads for a few niche products or banning a very tiny percentage of their users.

If you want to dig deeper into this topic, please tune into my podcast episode on The Marketing AI Show. Paul Roetzer, founder of Marketing Artificial Intelligence Institute and creator of the Marketing AI Conference (MAICON), and I discussed “Is Facebook Evil?” for 40 minutes!
