In this excerpt from the introduction to the new book Inside the Invisible Cage: How Algorithms Control Workers, author Hatim Rahman, assistant professor of management and organizations at the Kellogg School, describes how algorithms are used to evaluate worker performance. This often happens with little transparency, leaving workers to guess how they are rated.
The opaque nature of the algorithmic rating systems used by platforms such as TalentFinder – and organizations’ increasing reliance on them to find talent – allows the platforms to trap highly skilled workers in an “invisible cage”: an environment in which organizations embed their rules and guidelines for how workers should behave in opaque algorithms that change without providing notice, explanation, or recourse to workers. This marks a profound shift in the way markets and organizations attempt to categorize and ultimately control people.
Tyra nervously refreshed her browser. After completing her latest project on TalentFinder, she had received the following feedback from her client: “She was fast, accurate and easy to work with.” It was a succinct, positive review. But now Tyra had to wait. She clicked refresh again, waiting to see when and how TalentFinder’s algorithm would update her rating score. So much depended on the algorithm’s verdict: higher pay, attention from more prestigious clients, and greater visibility in search results, for starters.
The problem, however, was that Tyra had no way of knowing how the algorithm controlling her visibility and success on TalentFinder behaved. She had virtually no insight into what the algorithm’s criteria were, how they were weighted, or even when the algorithm would update her score. After refreshing her page for the tenth time to no avail, Tyra closed the window and thought about the unknown algorithm controlling her fate. Frustrated, she turned to expressing her predicament the best way she knew how: by writing a poem:
The algorithm,
No one can explain.
To try to decrypt it
Is a futile effort.
Is it up or is it down,
With no reason in sight.
Accept it or not,
You won’t win the race.
So work and work,
Leave the mysteries.
To consider the Algorithm
Is a road to misery.
Tyra’s poem was no exaggeration. Experienced workers, new workers, high-scoring and low-scoring workers, workers located in different countries—all reported similarly confusing experiences with TalentFinder’s algorithm. Sometimes the algorithm increased a worker’s rating score, sometimes it decreased it, and sometimes it did nothing at all. These outcomes had major implications for workers’ ability to find jobs on TalentFinder, but deciphering how the algorithm reached its decisions was maddeningly impossible—it was, as Tyra put it, “a road to misery.”
Inside the Invisible Cage examines how organizations’ use of algorithms is reshaping our understanding of control for Tyra and the millions of other highly skilled workers who use online job market platforms (e.g., Upwork, TopCoder, Gigster) to find work. The explosion of online job market platforms has changed the nature of work over the past two decades. In 2021, over forty million people in the United States used an online job platform to find work. To put that number in context, according to recent estimates, retail salespersons, the occupation with the most workers in the United States, numbered 3.6 million. In fact, even if the five largest occupations in the United States were combined, they still would not equal the number of people using online job platforms to find work.
The point is not only that many people use online job platforms to find work. It’s that these platforms have changed the way organizations and workers find and collaborate with each other. The goal of these platforms is to create an Amazon for labor: a platform that gives organizations and individual clients direct access to top talent from around the world at the click of a button. Organizations, for example, can use online job platforms to hire highly skilled workers, such as software engineers, graphic designers, data scientists, engineers, architects, and even lawyers and doctors, from around the world to complete primarily unstructured, knowledge-intensive projects. The appeal of online job market platforms has only grown with the dramatic rise in remote work accelerated by the COVID-19 pandemic.
Algorithms fuel the growth of online job market platforms. Millions of participants are registered on these platforms, and it would be impossible for a platform to individually match every job opportunity with workers whose skills, pay, and schedule fit the position. Instead, these platforms use algorithms to match organizations and clients with workers, much as YouTube or Netflix use algorithms to match viewers’ interests with video content. However, I argue that platforms use these algorithms to do much more than match jobs with workers.
This book argues that algorithms allow platforms to control highly skilled workers within an “invisible cage”: an environment in which organizations embed the rules and guidelines for how workers should behave into opaque algorithms that change without providing notice, explanation, or recourse to workers. The invisible cage gives platform organizations predictability because they can use algorithms to more efficiently collect data and to monitor, evaluate, and categorize which workers are rewarded and punished on a global scale. At the same time, the opaque, dynamic algorithms in the invisible cage make life more unpredictable for workers because they don’t know which actions will be rewarded or punished. I show that workers remain trapped in the invisible cage because platform organizations’ algorithms control a worker’s ability to find jobs on and off the platform. As a result, workers like Tyra largely see complying with these opaque algorithms as the only option they have, even though they can theoretically leave the platform at any time. The concept of the invisible cage thus reflects how workers must contend with an ever-changing and opaque set of algorithms that control their job opportunities and success within and across labor markets.
Platforms maintain the invisible cage by leveraging the weak institutional oversight and regulation governing their operations to cultivate and exploit power and information asymmetries through covert data collection, processing, and experimentation. These asymmetries have significant implications for organizations and platform workers alike. An important finding of this book is that algorithms can prove highly disruptive to the way workers find and complete work. This is especially true for workers with college and advanced degrees—the very workers long thought to be immune to technological disruption. This argument derives primarily from six years of ethnographic data collection conducted on one of the world’s largest online job market platforms for highly skilled work, TalentFinder (a pseudonym).
More broadly, the invisible cage marks a profound shift in the way markets and organizations attempt to categorize and ultimately control people. Previously, markets and organizations classified people into categories based on group-level characteristics such as education, gender, location, and age (e.g., women with engineering degrees in their twenties living in Chicago). My analysis shows, however, that organizations can now use algorithms to categorize people based on far more granular, individual-level data in an attempt to “know” the people inside the invisible cage better than they know themselves. I show that the act of determining what algorithms should “know” about workers is an organizational decision, one that reveals what an organization prioritizes and what it wants to (de)value. While highly skilled workers have traditionally had some degree of control over how they are evaluated and ranked, in the invisible cage organizations use algorithms to transfer that control to themselves, while removing workers’ ability to influence or challenge the consequences of that transfer. Specifically, organizations collect data about people and classify them dynamically using various algorithmic ratings, classifications, and categories. People cannot verify what data is collected and may struggle to understand how or why an algorithm categorizes them in a given way. Unlike earlier forms of control used in bureaucratic organizations or market settings, the invisible cage is ubiquitous yet opaque and shifting, making it difficult for workers to break free from it.
By examining the implications of using algorithms to control highly skilled labor, this book moves beyond existing scholarship, which has focused primarily on how platform algorithms facilitate lower-paid work (e.g., Uber, Instacart, and Amazon Mechanical Turk). In lower-paying settings, organizations use algorithms to nudge workers toward standardized behavior, revealing an enhanced form of Taylorism. In contrast, in the invisible cage the platform organization does not want workers to think about the algorithm or about which behaviors are desired. Instead, it encourages workers to behave “naturally” so that it can “objectively” categorize, rank, and recommend them based on their actions. By choosing which information counts as objective, measurable, and valued, the organization’s algorithm validates certain worker characteristics and choices while stripping away the complexity and unpredictability inherent in highly skilled work. So, as organizations increasingly use algorithms to make consequential decisions about the people inside the invisible cage (like deciding who can rent and buy real estate, who goes to jail, and who gets hired), this form of control comes to determine more and more of our opportunities without allowing us to understand or respond to the factors that govern our success.