
If you scroll Facebook for very long, you come across bland Christian memes. Maybe your uncle or your sister or a friend from the church where you grew up shared one: a page with a name like “Jesus is Lord” or “God is Good,” posting an equally bland image, a field of flowers overlaid with a verse about God’s goodness or trustworthiness.

I don’t know about you, but I never pay much attention to these posts, let alone the pages that post them. So I was shocked when journalists published a leaked internal Facebook report showing that, in the lead-up to the 2020 election, 19 of the top 20 Christian pages on the platform had been created by Eastern European troll farms as a way to influence American voters.

That’s right: the pages were created specifically to get Christians to like them and reshare their content. Then the troll farms would sprinkle in carefully crafted propaganda designed to exert unconscious influence on everyone who’d liked the page.

The report was damning. Its author observed that the troll farms succeeded precisely because they exploited Facebook’s algorithms: far from protecting users from exploitation, the way Facebook decides what shows up in our feeds actually made us more vulnerable to it.

We received another shock when the Cambridge Analytica scandal revealed that Facebook had allowed third-party companies (like political consulting firms) to harvest users’ data without their consent. Tens of millions of users’ data was exposed, and when the ensuing lawsuit finally resolved, Facebook settled for $725 million without admitting any wrongdoing.

All of that is infuriating at the macro level. And when we remember the 2020 election and the role misinformation played, and continues to play, in efforts to undermine our democracy, $725 million doesn’t feel like enough.

But the Facebook fiasco raises a more existential question about social media: most of us treat our feeds as morally neutral. They’re not. Our feeds are curated for us, controlled by invisible algorithms we never see or choose. And that means what we see, what we watch, is being chosen for us by someone else.

What does a faithful engagement with that reality look like? How can we be wise about what we’re consuming, and who’s choosing that for us?

Join us Sunday as we explore what it looks like to take responsibility for what’s forming us!
