To succeed on social media, you must hustle. Journalists and advertisers cleave to search-engine-optimized keywords and make byzantine calculations to determine when, exactly, their articles or clients are most likely to break through the clutter and win the algorithmic lottery. Online creators and influencers, whether they’re on Facebook or Instagram or YouTube or Twitter, are no different. In fact, they may be more algorithm-obsessed. Entreaties for likes, retweets, and follow-backs, and pleas to “smash that subscribe button,” are now so universal and so frequent that they’re beginning to sound like tics. The reason for influencers’ begging, if they give one, is always the same: the algorithm.
Despite the widespread strategizing, platforms frown on some of this behavior, calling it “gaming the algorithm” or “manipulation.” But often what platforms consider “gaming” can be anything from deploying bot networks to joining “engagement pods,” in which influencers agree to like and comment on one another’s posts. Yet the labels, and the judgments, placed on these activities persist, having been adopted by regular users, journalists (including myself), and academics alike. Trying to use the algorithm to one’s advantage is seen as bad because the platforms say it is. But maybe it shouldn’t be.
According to a new study, Google and Facebook (and hence, YouTube and Instagram) often talk about users vying for visibility in ways that are confusing and, at times, even hypocritical. These platforms set up systems to optimize use, then chastise people for using them too well. “The line between legitimate strategic action to boost visibility and illegitimate is nebulous and shifts a lot,” says Caitlin Petre, one of the study’s co-authors and a researcher at Rutgers University who looks at the impact of algorithms on society. In the view of the study’s authors, moving the goalposts and then condemning the players is a way to make sure Facebook and Google remain the game’s all-powerful referees.
The language that platforms and the media use to talk about signal-boosting strategies falls into three major categories: chemical, criminal, and sports. Petre, along with coauthors Brooke Erin Duffy and Emily Hund, found that Google has often employed the terms “organic” or “authentic” to describe behavior it deems appropriate: posting, and then letting Google have its way with your work. Companies like Genius, a lyrics site that offered bloggers promotion on its social channels if those bloggers linked to Genius in their posts, have been demoted in search for “a pattern of unnatural, artificial, deceptive, or manipulative links.”
It isn’t just Google framing people who use SEO-boosting practices as deviants or “schemers,” either. Media coverage of influencers who use SEO strategy or comment automation tends to describe them as “offenders” and their techniques as tricks or scams, prompting “crack-downs” and “punishment” from platforms. It’s much the same with the sports metaphors. Platforms talk about “gaming,” and some journalists have run with that language, comparing the use of bots to injecting steroids or “playing dirty,” and demotion in the algorithm to time in the penalty box. “Communities and some users accept this moralistic framing, too,” Petre says. “Some photographers I spoke to thought comment pods were untoward, since their work should be artisanal. Others are like, ‘This is bullshit and we’re doing what we need to do to survive.’” Point being: A platform’s choice of language and metaphor is often taken up by the public.
Trouble is, that language is often inconsistent. Take Facebook’s stance on “engagement bait.” In April 2017, the company announced that it was cracking down on clickbait headlines and demoting them in the algorithm. Facebook suggested publishers try “calls to action” instead of luring in readers with promises that would never be met. Then, in December of the same year, Facebook announced it would demote posts that used “engagement bait,” which is how it described goading users into engagement with, well, “calls to action.” “You could see the line move,” Petre says.
Of course, platforms must have rules, and they can describe them however they wish. Some people really do abuse the algorithms that control what ends up on people’s screens. When many people talk about “gaming the algorithm,” they’re referring to the actions of extremists on 8chan and other fringe sites, who band together to make hateful or conspiracy-theory content trend on Twitter or YouTube. But the mechanism for pulling off those actions (getting a bunch of people together to say the same thing at the same time, and choosing those keywords carefully) is not much different from how folks would draw attention to anything online, whether it’s a band or a political candidate or a hashtag like #MeToo. The action is neutral; the content is not, at least not to the platforms.