As the power of digital platforms grows, tensions between platform companies and their users are increasingly making headlines. Prominent content creators have been “deplatformed” for spreading misinformation and hate speech. Uber and Lyft drivers frequently organize to resist exploitative policies.
These and other controversies are rooted in the fact that every platform company has to balance its own interests with the interests of its varied users. This is often called the problem of “platform governance,” or how platforms control the terms of users’ participation.
Existing studies of platform governance emphasize how platforms' rules are set, implemented, and enforced. One strand of research highlights the role of algorithms and digital interfaces, which dictate how users interact with content and with each other. Another focuses on the human labor involved in implementing policies pertaining to user-generated content.
Most of this research has emphasized the impersonal and procedural aspects of how platform companies govern their users. Studies show that these elements of platform governance can leave users feeling frustrated and neglected, to the point where they may exit the platform. Nobody likes to be treated like a number or a data point. People like Uber drivers want the platforms they use to address the problems those platforms create. But the existing research doesn't have much to say about how platforms respond to user dissatisfaction.
Drawing on political theory, we view governance as encompassing not just regulations and procedures, but also “the means used to shape or change the behavior of people in order to achieve goals.” In a recent article, we make the case that platform researchers should take a more expansive view of governance that accounts for how platforms try to stop user exit and incorporate user voice into their procedures.
Applying this insight to platform governance directs our attention beyond algorithms and rules, and toward the relationships that emerge between agents of the platform and its users.
Drawing on ethnographic research at two platform companies, we discuss how company representatives engage in interpersonal interactions with users to aid them and persuade them to stay engaged with the platform. We call this work "relationship labor" and the agents who perform it "relationship workers."
We found that relationship labor was an important dimension of platform governance at two very different platforms. One of our case studies, AllDone, was a venture capital-backed, Silicon Valley-based startup that ran a digital platform connecting buyers and sellers of local services (e.g., house cleaners, plumbers, math tutors). The other was edX, a (formerly) nonprofit startup in Cambridge, Massachusetts, that partners with institutions to offer online courses to learners around the world.
We found that at both companies, relationship workers engaged in what we call account management and community management. However, the frequency of these practices varied across the two platforms: while AllDone's experiment with community management was brief, the practice was a core component of edX's governance strategy.
Account Management
Account management practices are aimed at addressing users’ particular problems and concerns. For example, at AllDone, sellers were often frustrated when they paid to submit quotes to potential buyers, but the buyers never responded.
AllDone employed phone support agents to speak with disgruntled users. One technique that agents deployed to address user issues was what they called “tough love,” or persuading users that they were responsible for their own poor outcomes. For example, as one phone support agent said to a seller who was upset about not getting hired for jobs, “Would you respond if you received that quote [that you submitted to a buyer]? You have to work on this. The leads will not work themselves.”
Agents also educated and counseled sellers who needed help understanding how to succeed on AllDone, and they offered personalized attention aimed at making sellers feel that the company cared about them and their businesses. Some phone calls stretched past 45 minutes as agents made sure that the user didn't hang up still feeling angry with AllDone.
At edX, tensions arose when users, such as course instructors, felt that the platform’s features and policies failed to meet their needs. Account management was handled by program managers, who advised course instructors on how to use the edX software by directing them to the appropriate documentation, bringing their requests to the attention of edX’s engineering team, and helping users temper their expectations and adjust to the limitations of the software.
At both platforms, account management practices were aimed at maintaining user engagement by addressing users’ individualized grievances.
Community Management
Community management practices are designed to bring users together to collaborate with one another. AllDone experimented with community management when it created an invite-only Facebook group for its most active sellers to “ask questions related to your business and AllDone, share tips, and network.”
Managers at AllDone wanted this to be a place where sellers would come together around common interests. But the majority of messages sellers posted ended up being complaints about their experiences with the platform—and they were directed not to other sellers, as AllDone had intended, but instead to AllDone management. Some sellers posted complaints like “I don’t like the change they made to the dashboard,” and “Why can’t we get feedback on why we aren’t being hired?” AllDone management would respond by discouraging members from using the forum to register complaints, telling them that instead they should be “sharing tips” and “asking questions that may benefit other service professionals.”
At edX, community management was more successful and more important to the company's governance strategy. As part of their everyday responsibilities, edX's program managers created opportunities for users to collaborate with each other to achieve common goals, under the careful supervision of edX management. Program managers created and managed communicative forums to help instructors and researchers at partner institutions collaborate with each other. These included online spaces like mailing lists and discussion forums, as well as in-person venues like conferences and meetups.
In addition to creating spaces for collaboration, program managers directly forged connections between particular participants when they discovered that users shared common interests and goals. As Emily, an edX program manager put it: “Sometimes our job is really facilitating connections [and] leveraging partnership opportunities, even if it’s just between edX and a partner or if it’s between other partners who have the potential to collaborate and be stronger together than they could be [alone].”
Making Platforms Work
At two very different platforms, we observed common practices of account management aimed at addressing individual users' concerns, and community management designed to foster collaboration between users. This suggests that relationship labor is a significant yet overlooked component of "algorithmic management," and of platform governance more broadly. Without relationship labor, many algorithmic systems simply wouldn't deliver on their promise to users.
The different frequencies with which AllDone and edX used account management and community management practices indicate that strategies of relationship labor vary across organizational contexts. In fact, we found that there is nothing inevitable about the nature and quality of this work. Relationship workers may be low-wage contractors working from their homes far from corporate headquarters, as at AllDone, or well-compensated employees who work alongside a platform's software engineers and are incorporated into an organization's design processes, as at edX. Our case studies suggest that a platform company's mission, resources, and the attributes of its users can shape how it organizes relationship labor.
Relationship labor may represent a set of emerging occupations through which women gain additional representation in tech companies. However, these roles are also likely to reinforce existing patterns of occupational gender segregation in the industry, where men tend to be hired into technical roles and women are more often represented in roles involving care, maintenance, and coordination.
Our comparative ethnography shows that relationship labor is an important component of platform governance when it comes to keeping users engaged, incorporating their voice, and preventing their exit. As platforms are forced to generate revenue rather than relying on speculative venture capital investments, they may end up relying even more on relationship labor to mollify dissatisfied users.
Read more
Benjamin Shestakofsky and Shreeharsh Kelkar. “Making Platforms Work: Relationship Labor and the Management of Publics.” Theory and Society 2020.
Image: stockvault (Creative Commons – CC0)