How the Facebook algorithm decides what you see

Published: October 7, 2021 5:29 PM EDT
Updated: October 7, 2021 5:48 PM EDT

Facebook has come under criticism for its algorithm, facing accusations that it helps promote harmful content. But algorithms are complicated, filled with letters, numbers and variables that control so much of what we see online. So how do they work?

WINK News spoke with Florida Gulf Coast University mathematician Senthil Girimurugan and asked him to simplify it. “The more you give into Facebook, the more it’s going to learn about you.”

The pictures we post, the words we write, the things we like or dislike — all of it gives Facebook an idea of who we are, what we like and what we’re most likely to interact with for the longest period of time.

Girimurugan said the more you interact, “the better the machine is going to get at predicting who you are and how you would react to a given circumstance.”
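To see what “learning about you” can mean in the simplest terms, here is a toy sketch — not Facebook’s actual system, and every name and number in it is invented for illustration. It tracks a user’s interactions by topic, and its predictions sharpen as more interaction data arrives, which is the dynamic Girimurugan describes.

```python
# Toy sketch (illustrative only, not Facebook's real model): estimate a
# user's topic preferences from their interaction history.
from collections import Counter

class ToyPreferenceModel:
    def __init__(self):
        self.likes = Counter()  # interactions recorded per topic
        self.total = 0          # total interactions seen so far

    def record_interaction(self, topic):
        """Each like, comment or share updates the model of the user."""
        self.likes[topic] += 1
        self.total += 1

    def predicted_interest(self, topic):
        """Estimated share of the user's engagement that goes to this topic."""
        if self.total == 0:
            return 0.0  # with no data, the model knows nothing about you
        return self.likes[topic] / self.total

model = ToyPreferenceModel()
for topic in ["sports", "sports", "politics", "sports"]:
    model.record_interaction(topic)

print(model.predicted_interest("sports"))    # 0.75
print(model.predicted_interest("politics"))  # 0.25
```

With only four interactions the estimates are crude; with thousands, a model like this — and the far richer ones real platforms use — gets steadily better at predicting what you will engage with.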

Facebook’s algorithms are under fire thanks to a whistleblower who went on 60 Minutes and testified before Congress that the social media giant amplifies hate, misinformation and unrest.

Girimurugan explains, “it works very similarly to how your brain neurons work. One neuron triggers another, that neuron triggers another and so forth.” He said, “you could have your profile connected to advertisements, your profile connected to political stuff or how you react on a specific opinion.”
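The “one neuron triggers another” idea can be sketched in a few lines. This is a hand-built illustration of signal propagation, not Facebook’s architecture: the inputs, weights and thresholds below are all made up, and real systems use far larger learned networks.

```python
# Toy sketch of neuron-style propagation: profile signals feed through
# weighted connections, and a downstream unit "fires" when enough
# weighted input reaches it. All names and weights are invented.
def step(inputs, weights, threshold=0.5):
    """A unit fires (returns 1.0) when its weighted input crosses a threshold."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1.0 if total >= threshold else 0.0

# Profile signals: [liked_sports, liked_politics]
profile = [1.0, 0.0]

# Each downstream unit (an ad category, a type of political content)
# has its own connection weights to the profile signals.
sports_ads = step(profile, [0.9, 0.1])       # 0.9 >= 0.5, so it fires
political_posts = step(profile, [0.1, 0.9])  # 0.1 < 0.5, so it stays quiet

print(sports_ads, political_posts)  # 1.0 0.0
```

One signal triggering the next is how a profile ends up “connected to advertisements” or political content, as Girimurugan puts it — activity flows along whichever connections are strongest.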

The Facebook whistleblower said the company almost always chooses to do what’s good for Facebook, not what’s good for its users. And if those choices are harmful, Girimurugan says that’s where the negative effects could come in. “The more information I put out there, the more control I provide to this particular machine that sort of models me. I wouldn’t say control, but it models me. If 100 people do the exact same thing, those hundred models provide information that the parent model can use to influence.”

Dr. Girimurugan told WINK News that if there’s one thing he’d change about Facebook’s algorithm, it would be taking away the engagement-based ranking system for negative content, so that harmful content won’t make it onto your feed.
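Engagement-based ranking, and the change he describes, can be sketched as follows. This is an assumed, simplified scoring scheme — the post scores and the `flagged_harmful` label are invented for illustration, not drawn from Facebook’s systems.

```python
# Toy sketch of engagement-based ranking (illustrative only): posts are
# ordered purely by predicted engagement, so provocative content that
# draws strong reactions can outrank calmer posts.
posts = [
    {"id": 1, "predicted_engagement": 0.9, "flagged_harmful": True},
    {"id": 2, "predicted_engagement": 0.6, "flagged_harmful": False},
    {"id": 3, "predicted_engagement": 0.4, "flagged_harmful": False},
]

def rank_by_engagement(posts):
    """Pure engagement ranking: a harmful but engaging post rises to the top."""
    return sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

def rank_without_harmful_boost(posts):
    """The change Girimurugan suggests, roughly: flagged negative content
    no longer competes on engagement, so it can't ride reactions onto the feed."""
    safe = [p for p in posts if not p["flagged_harmful"]]
    return sorted(safe, key=lambda p: p["predicted_engagement"], reverse=True)

print([p["id"] for p in rank_by_engagement(posts)])          # [1, 2, 3]
print([p["id"] for p in rank_without_harmful_boost(posts)])  # [2, 3]
```

In the first ordering the flagged post wins precisely because it is the most engaging; in the second it never reaches the feed at all.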