Facebook group admins are the new news editors
As organic reach of Facebook pages has declined, news organizations have migrated to Facebook groups. Oftentimes, however, they are late to the party.
In neighborhoods and towns across the country, geographically and topically focused Facebook groups have been created by non-journalists.
The Philadelphia Inquirer documented this phenomenon in a May article, From pizza to personal attacks: What it’s like to manage a Philly neighborhood Facebook group.
To a point, these Facebook group admins serve a function nearly identical to that of the now-endangered newspaper editor. But unlike journalists, who are self-regulated by a professional code and directly regulated by libel and slander laws, Facebook group admins fulfill their roles under a disturbingly weak mandate and oftentimes with no ethical framework at all.
Facebook group administration can be a thankless job.
If the job is thankless, what would motivate someone to establish such a group? Consider the blowback York Daily Record editor Jim McClure received when he shared his column about YDR’s Fixing York group.
It takes real time and effort (not to mention, thick skin) to maintain an egalitarian and transparently moderated Facebook community.
So, many Facebook group admins simply do not – do not maintain an egalitarian community, do not transparently moderate, and do not sustain the group simply to do good. Rather, many Facebook group admins have established their communities without paying much heed to the ethical pitfalls of favoritism, opaque moderation, or unvetted user-generated information.
These Facebook group admins can hold substantial audience power. I run a blog about my hometown in Pennsylvania; an independent Facebook group targeted at that community generates most of my traffic. I am at the mercy of the moderator to share my article.
This isn’t simply sour grapes at the loss of audience control; rather, it is an observation that not all participants play by the same rules. As Facebook watches Nextdoor’s rise, Facebook group admins have received new moderation and monetization tools – proof positive that there’s a new golden goose when it comes to meaningful interactions, to use the dystopian vernacular of Menlo Park.
Facebook understands its news function in American democracy, and these group admins set community discourse in a way congruent to the historical local newspaper editor. The group admin chooses which stories to promote, which viewpoints are legitimate, which attacks breach a standard of decency. But all these roles are filled without the sunlight so typical of publishing. The local newspaper editor was held doubly accountable – once by advertisers, once by readers. Who holds the Facebook group admin accountable?
A Facebook group in Philly – UrbanPHL – inspired this essay when I encountered opaque moderation of a comment I made regarding a post by Philadelphia real estate developer & politician Ori Feibush.
Feibush received a mixed reaction, but had active defenders in the comments (including his own retorts, of debatable efficacy). One such defender was Jon Geeting, one of seven administrators. Geeting is a former journalist, but currently works as a political consultant for the dark-money group Philadelphia 3.0, which supported Feibush in the previous election. In the comments, Geeting visibly and ardently argues for the Feibush perspective. Geeting’s wife also happens to serve as Feibush’s executive assistant.
After engaging on a few threads, I left a comment on the main post, suggesting that Ori may be trying (but failing) to emit “big dick energy”. Perhaps not the most tasteful turn of phrase, but widespread enough to warrant its own Vox explainer. The comment may have been borderline, but through its use of popular culture I believe it fell within typical community standards, such as you would find in a moderated comments section. But when I checked the thread a couple hours later, following a smattering of dopamine-sparking engagement, I found that my comment had been ~~Deleted~~.
Updated 9:37: Geeting responded this morning stating that he deleted the comment as he deemed it a “dick-based insult” and an “ad hominem attack(s)”. Regarding the nexus between Philadelphia 3.0, Ori Feibush/OCF, and the UrbanPHL group, Geeting says that he does not comment on posts about OCF projects. He also emphasized that Philly 3.0 did not make an endorsement in the 2nd district council race in 2015.
I asked Geeting immediately and directly whether he had deleted the post, to no response. See update above.
Managing a private Facebook group of 5,000+ Philadelphians, these administrators are obligated by no code of conduct, no Facebook-required disclosures of bias, affiliation, or financial interest, and no real transparency as to what actions are being taken to moderate, censor, or otherwise direct the flow of conversation.
Although the UrbanPHL group tends to self-regulate towards fact-based journalism, the same is not true for all groups. Imagine the same dynamic at play in a group such as the Cancer Cures and Natural Healing Research Group, where 80k+ members are barraged by a mixture of personal pleas and shady snake oil. What would Zuck think if he knew his role in the spread of colloidal silver as a cancer treatment?
Facebook group administrators are serving a function quite close to the historic newspaper editor, but with none of the market-driven accountability, nor industry-driven self-regulation. Fake news, slanted moderation, and perverse incentives run rampant in these new digital commons.
While bots and Russians surely inflame the disinformation debate, Facebook could still do so much more to recognize and embrace its own structural role in this emergent news & information economy, and it could start by enforcing a standard of transparency in control and moderation for its quickly-growing universe of Facebook groups.