“The style of moderation can vary from site to site, and from platform to platform, as rules around what UGC is allowed are often set at a site or platform level, and reflect that platform’s brand and reputation, its tolerance for risk, and the type of user engagement it wishes to attract.”

Sarah T. Roberts, “Content Moderation,” in Encyclopedia of Big Data, eds. Laurie A. Schintler and Connie L. McNeely (Berlin and Heidelberg: Springer, 2017), 1.

In reflecting on our readings and our discussion this week, this quote by Sarah T. Roberts stood out to me, and it brought to mind several related thoughts about user-generated content and the ways academics and others interact on online platforms.

Through my personal Facebook account, I am a member of several private groups related to my professional interests, including Friends of the International Congress on Medieval Studies (2,218 members), the International Society for the Study of Medievalism (541 members), and Teaching the Middle Ages (2,765 members). With the exception of the second, these groups are not officially affiliated with professional organizations, though their members are academics and others with interests in these topics. Each group has one or more admins or moderators who are at least nominally responsible for moderating user-generated content and managing the community of members.

As Roberts has noted, the “style of moderation” of sites and platforms, and of the groups they host, can and does vary. I am interested here in considering private Facebook groups such as these as spaces in which users generate more or less “academic” or “professional” content through their personal social media accounts. These private groups can be, and have been, contentious spaces: posts, conversations, and, at times, arguments about the prejudice and discrimination faced by members of these groups, particularly medievalists of color, have prompted the groups’ admins to issue reminders of acceptable behavior and have created tensions among members, both in these online spaces and in the field more generally.

Considered more broadly, what implications and impacts might the moderation of user-generated content and user interactions on online platforms, particularly social media, have for academic communities?

Academics face pressure to be professionally “visible” and “engaged” in online spaces, including social media platforms, as part of professional networking, gaining recognition in the field, and improving one’s metrics for the hiring and tenure processes (I am thinking here of our conversations and work earlier this semester with Michael Dietrich). Are there ethical concerns in asking or expecting academics to build an online professional presence, particularly graduate students, early career scholars, and those who are unaffiliated, given that such a presence requires continuous work that is likely to be uncompensated and unacknowledged?

2 thoughts on “User-Generated Content and Academic Communities”

  1. These spaces are often monitored by individuals who intend to harm the academic trajectories of academics who do choose to voice opinions in public, informal spaces. I vividly remember the backlash against David Guth, a journalism professor at the University of Kansas, who tweeted at the National Rifle Association (NRA) after yet another mass shooting, stating that the blood of the victims was on the NRA’s hands. He was put on leave by the university, and the university itself faced retaliation at the hands of Kansas lawmakers. The whole thing was a terrifying look into the precarity of academic positions in the face of digital mobs. Similarly, Turning Point USA has announced open season on all professors currently giving online lectures in order to root out liberal indoctrination, so now we have the privilege of facing these digital mobs while contending with COVID-19.

  2. The case of the Medieval Studies groups on Facebook is interesting because of the potential conflict between discussion protocols and appropriate codes of conduct. My guess is that even closed communities on FB are subject to its (granted, cursory) rules, but nonetheless I’d imagine that a user could get in trouble for, say, racist slurs not just by the decision of a moderator but by the platform itself. It’s very much a political question to ask how an online community (or a platform) approaches the protection and well-being of minority groups.

    The question about online visibility for early career scholars hits close to home, and it’s a tough one. As a graduate student I was told that it shouldn’t matter whether you have an active online presence, but I can tell you that any time I send out a job application, I can see activity on my Academia page (which loves to notify me of anything that happens in its universe). Creating an online presence is definitely a form of shadow work.