Your boss is watching
A full day's work for Dora Manriquez, who drives for Uber and Lyft in the San Francisco Bay Area, includes waiting in her car for a two-digit number to appear. The apps keep sending her rides that are too cheap to pay for her time: $4 or $7 for a trip across San Francisco, $16 for a trip from the airport for which the customer is charged $100. But Manriquez can't wait too long to accept a ride, because her acceptance rate contributes to her driving score for both companies, which can then affect the benefits and discounts she has access to. The systems are black boxes, and Manriquez can't know for sure which data points affect the offers she receives or how. But what she does know is that she's driven for ride-share companies for the last nine years, and this year, having found herself unable to score enough better-paying rides, she has to file for bankruptcy.

Every action Manriquez takes, or doesn't take, is logged by the apps she must use to work for these companies. (An Uber spokesperson told MIT Technology Review that acceptance rates don't affect drivers' fares. Lyft did not return a request for comment on the record.) But app-based employers aren't the only ones keeping a very close eye on workers today. A study conducted in 2021, when the covid-19 pandemic had greatly increased the number of people working from home, revealed that almost 80% of companies surveyed were monitoring their remote or hybrid workers. A New York Times investigation in 2022 found that eight of the 10 largest private companies in the US track individual worker productivity metrics, many in real time. Specialized software can now measure and log workers' online activities, physical location, and even behaviors like which keys they tap and what tone they use in their written communications, and many workers aren't even aware that this is happening. What's more, required work apps on personal devices may have access to more than just work, and as we may know from our private lives, most technology can become surveillance technology if the wrong people have access to the data.

While there are some laws in this area, those that protect privacy for workers are fewer and patchier than those applying to consumers. Meanwhile, it's predicted that the global market for employee monitoring software will reach $4.5 billion by 2026, with North America claiming the dominant share. Working today, whether in an office, a warehouse, or your car, can mean constant electronic surveillance with little transparency, and potentially with livelihood-ending consequences if your productivity flags.

What matters even more than the effects of this ubiquitous monitoring on privacy may be how all that data is shifting the relationships between workers and managers, companies and their workforce. Managers and management consultants are using worker data, individually and in the aggregate, to create black-box algorithms that determine hiring and firing, promotion and deactivation. And this is laying the groundwork for the automation of tasks, and even whole categories of labor, on an endless escalator to optimized productivity. Some human workers are already struggling to keep up with robotic ideals.

We are in the midst of a shift in work and workplace relationships as significant as the Second Industrial Revolution of the late 19th and early 20th centuries. And new policies and protections may be necessary to correct the balance of power.
Data as power

Data has been part of the story of paid work and power since the late 19th century, when manufacturing was booming in the US and a rise in immigration meant cheap and plentiful labor. The mechanical engineer Frederick Winslow Taylor, who would become one of the first management consultants, created a strategy called scientific management to optimize production by tracking and setting standards for worker performance. Soon after, Henry Ford broke down the auto manufacturing process into mechanized steps to minimize the role of individual skill and maximize the number of cars that could be produced each day. But the transformation of workers into numbers has a longer history. Some researchers see a direct line between Taylor's and Ford's unrelenting focus on efficiency and the dehumanizing labor optimization practices carried out on slave-owning plantations.

As manufacturers adopted Taylorism and its successors, time was replaced by productivity as the measure of work, and the power divide between owners and workers in the United States widened. But other developments soon helped rebalance the scales. In 1914, Section 6 of the Clayton Act established the federal legal right for workers to unionize and stated that "the labor of a human being is not a commodity." In the years that followed, union membership grew, and the 40-hour work week and the minimum wage were written into US law. Though the nature of work had changed with revolutions in technology and management strategy, new frameworks and guardrails stood up to meet that change.

More than a hundred years after Taylor published his seminal book, The Principles of Scientific Management, efficiency is still a business buzzword, and technological developments, including new uses of data, have brought work to another turning point. But the federal minimum wage and other worker protections haven't kept up, leaving the power divide even starker. In 2023, CEO pay was 290 times average worker pay, a disparity that's increased more than 1,000% since 1978. Data may play the same kind of intermediary role in the boss-worker relationship that it has since the turn of the 20th century, but the scale has exploded. And the stakes can be a matter of physical health.

In 2024, a report from a Senate committee led by Bernie Sanders, based on an 18-month investigation of Amazon's warehouse practices, found that the company had been setting the pace of work in those facilities with black-box algorithms, presumably calibrated with data collected by monitoring employees. (In California, because of a 2021 bill, Amazon is required to at least reveal the quotas and standards workers are expected to comply with; elsewhere the bar can remain a mystery to the very people struggling to meet it.) The report also found that in each of the previous seven years, Amazon workers had been almost twice as likely to be injured as other warehouse workers, with injuries ranging from concussions to torn rotator cuffs to long-term back pain.

The Sanders report found that between 2020 and 2022, two internal Amazon teams tasked with evaluating warehouse safety recommended reducing the required pace of work and giving workers more time off. Another found that letting robots set the pace for human labor was correlated with subsequent injuries.
The company rejected all the recommendations for technical or productivity reasons. But the report goes on to reveal that in 2022, another team at Amazon, called Core AI, also evaluated warehouse safety and concluded that unrealistic pacing wasn't the reason all those workers were getting hurt on the job. Core AI said that the cause, instead, was workers' frailty and intrinsic likelihood of injury. The issue was the limitations of the human bodies the company was measuring, not the pressures it was subjecting those bodies to. Amazon stood by this reasoning during the congressional investigation.

Amazon spokesperson Maureen Lynch Vogel told MIT Technology Review that the Sanders report is "wrong on the facts" and that the company continues to reduce incident rates for accidents. "The facts are," she said, "our expectations for our employees are safe and reasonable, and that was validated both by a judge in Washington after a thorough hearing and by the state's Board of Industrial Insurance Appeals."

Yet this line of thinking is hardly unique to Amazon, although the company could be seen as a pioneer in the datafication of work. (An investigation found that over one year between 2017 and 2018, the company fired hundreds of workers at a single facility, by means of automatically generated letters, for not meeting productivity quotas.) An AI startup recently placed a series of billboards and bus signs in the Bay Area touting the benefits of its automated sales agents, which it calls Artisans, over human workers. "Artisans won't complain about work-life balance," one said. "Artisans won't come into work hungover," claimed another. "Stop hiring humans," one hammered home. The startup's leadership took to the company blog to say that the marketing campaign was intentionally provocative and that Artisan believes in the potential of human labor. But the company also asserted that using one of its AI agents costs 96% less than hiring a human to do the same job. The campaign hit a nerve: When data is king, humans, whether warehouse laborers or knowledge workers, may not be able to outperform machines.

AI management and managing AI

Companies that use electronic employee monitoring report that they are most often looking to the technologies not only to increase productivity but also to manage risk. And software like Teramind offers tools and analysis to help with both priorities. While Teramind, a globally distributed company, keeps its list of over 10,000 client companies private, it provides resources for the financial, health-care, and customer service industries, among others, some of which have strict compliance requirements that can be tricky to keep on top of. The platform allows clients to set data-driven standards for productivity, establish thresholds for alerts about toxic communication tone or language, create tracking systems for sensitive file sharing, and more.

With the increase in remote and hybrid work, says Teramind's chief marketing officer, Maria Osipova, the company's product strategy has shifted from tracking time spent on tasks to monitoring productivity and security more broadly, because that's what clients want.
"It's a different set of challenges that the tools have had to evolve to address as we're moving into fully hybrid work," says Osipova. "It's this transition from 'Do people work?' or 'How long do they work?' to 'How do they work best?' How do we as an organization understand where and how and under what conditions they work best? And also, how do I de-risk my company when I give that amount of trust?"

The clients' myriad use cases and risks demand a very robust platform that can monitor multiple types of input. "So think about what applications are being used. Think about being able to turn on the conversations that are happening on video or audio as needed, but also with a great amount of flexibility," says Osipova. "It's not that it's a camera that's always watching over you."

Selecting and tuning the appropriate combination of data is up to Teramind's clients and depends on the size, goals, and capabilities of the particular company. The companies are also the ones to decide, based on their legal and compliance requirements, what measures to take if thresholds for negative behavior or low performance are hit.

But however carefully it's implemented, the very existence of electronic monitoring may make it difficult for employees to feel safe and perform well. Multiple studies have shown that monitoring greatly increases worker stress and can break down trust between an employer and its workforce. One 2022 poll of tech workers found that roughly half would rather quit than be monitored. And when algorithmic management comes into the picture, employees may have a harder time being successful, and a harder time understanding what success even means.

Ra Criscitiello, deputy director of research at SEIU-United Healthcare Workers West, a labor union with more than 100,000 members in California, says that one of the most troubling aspects of these technological advances is how they affect performance reviews. According to Criscitiello, union members have complained that they have gotten messages from HR about data they didn't even know was being collected, and that they are being evaluated by algorithmic models they don't understand.

Dora Manriquez says that when she first started driving for ride-share companies, there was an office to go to or call if she had any issues. Now, she must generally lodge any complaints by text through the app, and any response appears to come from an automated system. "Sometimes they'll even get stuck," she says of the chatbots. "They're like, 'I don't understand what you're saying. Can you repeat that again?'"

Veronica Avila, director of worker campaigns for the Action Center for Race and Economy (ACRE), has also seen algorithmic management take over for human supervisors at companies like Uber. "More than the traditional 'I'm watching you work,' it's become this really sophisticated mechanism that exerts control over workers," she says. ACRE and other advocacy groups call what's happening among app-based companies a deactivation crisis, because so many workers live in fear that the ruling algorithm will boot them off the platform at any moment in response to triggers like low driver ratings or minor traffic infractions, often with no explicit explanation and no way to appeal to a human for recourse.
Ryan Gerety, director of the Athena Coalition, which, among other activities, organizes to support Amazon workers, says that workers in those warehouses face continuous monitoring, assessment, and discipline based on their speed and their performance with respect to quotas that they may or may not know about. (In 2024, Amazon was fined in California for failing to disclose quotas to workers who were required to meet them.) "It's not just like you're monitored," Gerety says. "It's like every second counts, and every second you might get fired."

Electronic monitoring and management are also changing existing job functions in real time. Teramind's clients must figure out who at their company will handle and make decisions around employee data. Depending on the type of company and its needs, Osipova says, that could be HR, IT, the executive team, or another group entirely, and the definitions of those roles will change with these new responsibilities.

Workers' tasks, too, can shift with updated technology, sometimes without warning. In 2020, when a major hospital network piloted using robots to clean rooms and deliver food to patients, Criscitiello heard from SEIU-UHW members that they were confused about how to work alongside them. Workers certainly hadn't received any training for that. "It's not 'We're being replaced by robots,'" says Criscitiello. "It's 'Am I going to be responsible if somebody has a medical event because the wrong tray was delivered? I'm supervising the robot; it's on my floor.'"

Nurses are also seeing their jobs expand to include technology management. Carmen Comsti of National Nurses United, the largest nurses' union in the country, says that while management isn't explicitly saying nurses will be disciplined for errors that occur as algorithmic tools like AI transcription systems or patient triaging mechanisms are integrated into their workflows, that's functionally how it works. "If a monitor goes off and the nurse follows the algorithm and it's incorrect, the nurse is going to get blamed for it," Comsti says. Nurses and their unions don't have access to the inner workings of the algorithms, so it's impossible to say what data these or other tools have been trained on, or whether the data on how nurses work today will be used to train future algorithmic tools. What it means to be a worker, manager, or even colleague is on shifting ground, and frontline workers don't have insight into which way it'll move next.

The state of the law and the path to protection

Today, there isn't much regulation on how companies can gather and use workers' data. While the General Data Protection Regulation (GDPR) offers some worker protections in Europe, no US federal laws consistently shield workers' privacy from electronic monitoring or establish firm guardrails for the implementation of algorithm-driven management strategies that draw on the resulting data. (The Electronic Communications Privacy Act allows employers to monitor employees if there are legitimate business reasons and if the employee has already given consent through a contract; tracking productivity can qualify as a legitimate business reason.)
But in late 2024, the Consumer Financial Protection Bureau did issue guidance warning companies using algorithmic scores or surveillance-based reports that they must follow the Fair Credit Reporting Act, which previously applied only to consumers, by getting workers' consent and offering transparency into what data was being collected and how it would be used. And the Biden administration's Blueprint for an AI Bill of Rights had suggested that the enumerated rights should apply in employment contexts. But none of these are laws.

So far, binding regulation is being introduced state by state. In 2023, the California Consumer Privacy Act (CCPA) was officially extended to include workers and not just consumers in its protections, even though workers had been specifically excluded when the act was first passed. That means California workers now have the right to know what data is being collected about them and for what purpose, and they can ask to correct or delete that data. Other states are working on their own measures. But with any law or guidance, whether at the federal or state level, the reality comes down to enforcement.

Criscitiello says SEIU is testing out the new CCPA protections. "It's too early to tell, but my conclusion so far is that the onus is on the workers," she says. "Unions are trying to fill this function, but there's no organic way for a frontline worker to know how to opt out [of data collection], or how to request data about what's being collected by their employer. There's an education gap about that." And while CCPA covers the privacy aspect of electronic monitoring, it says nothing about how employers can use any collected data for management purposes.

The push for new protections and guardrails is coming in large part from organized labor. Unions like National Nurses United and SEIU are working with legislators to create policies on workers' rights in the face of algorithmic management. And app-based advocacy groups have been pushing for new minimum pay rates and against wage theft, and winning.

There are other successes to be counted already, too. One has to do with electronic visit verification (EVV), a system that records information about in-home visits by health-care providers. The 21st Century Cures Act, signed into law in 2016, required all states to set up such systems for Medicaid-funded home health care. The intent was to create accountability and transparency to better serve patients, but some health-care workers in California were concerned that the monitoring would be invasive and disruptive for them and the people in their care. Brandi Wolf, the statewide policy and research director for SEIU's long-term-care workers, says that in collaboration with disability rights and patient advocacy groups, the union was able to get language into legislation passed in the 2017-2018 term that would take effect the next fiscal year. It indicated to the federal government that California would be complying with the requirement, but that EVV would serve mainly a timekeeping function, not a management or disciplinary one.

Today advocates say that individual efforts to push back against or evade electronic monitoring are not enough; the technology is too widespread and the stakes too high. The power imbalances and lack of transparency affect workers across industries and sectors, from contract drivers to unionized hospital staff to well-compensated knowledge workers.
What's at issue, says Minsu Longiaru, a senior staff attorney at PowerSwitch Action, a network of grassroots labor organizations, is our country's moral economy of work, that is, an economy based on human values and not just capital. Longiaru believes there's an urgent need for a wave of socially protective policies on the scale of those that emerged out of the labor movement in the early 20th century. "We're at a crucial moment right now where as a society, we need to draw red lines in the sand where we can clearly say just because we can do something technological doesn't mean that we should do it," she says.

Like so many technological advances that have come before, electronic monitoring and the algorithmic uses of the resulting data are not changing the way we work on their own. The people in power are flipping those switches. And shifting the balance back toward workers may be the key to protecting their dignity and agency as the technology speeds ahead. "When we talk about these data issues, we're not just talking about technology," says Longiaru. "We spend most of our lives in the workplace. This is about our human rights."

Rebecca Ackermann is a writer, designer, and artist based in San Francisco.