New Work, New Vulnerabilities: An Analysis of Biases in New Forms of Internet-based Work and Potential Solutions

Who is the best candidate? A hiring algorithm has its answer. Don't want to do chores? Pull up an app and find someone to do them for you within the hour. In recent years, technological development has brought revolutionary changes to employment, fundamentally changing the hiring process. This growing application of technology has given rise to burgeoning gig-economy platforms, which offer abundant peer-to-peer hiring opportunities for jobs such as driving, cleaning, and cooking. At first glance, this development seems to encourage equal employment opportunities, either by allowing more direct access to job openings or by eliminating subjective biases. A closer look at these new forms of employment, however, reveals how workers and potential employees are subject to biases and even discrimination that they would not face in more traditional hiring. In this essay, I will classify and discuss the aspects of Internet-based work and hiring systems that leave workers vulnerable to biases and attacks from strangers or from the company itself. In the later part of the paper, I will also put forward some potential solutions to the issues that arise from these new recruiting methods.

To begin with, although many gig platforms advertise that workers on their platforms enjoy the experience of being individual entrepreneurs, the truth is that these companies care far more about creating a pleasant experience for the customers using their platforms. Since the majority of gig-economy platforms classify their workers as independent contractors rather than official employees, they avoid much of the employer liability present in traditional labor organizations. Therefore, as Alexandrea J. Ravenelle points out, a gig-work company's decision making most likely reflects its own interests, and the important grey-area decisions are often left to the workers themselves, many of whom, unfortunately, lack the educational background to navigate these scenarios well.

For example, in her book Hustle and Gig, Ravenelle points out that ride-hailing drivers who are attacked by passengers may be reluctant to report the incident after the experience of Artur Zawada, a driver who was removed from the Uber platform after a University of Michigan student verbally assaulted him with a tirade of antigay slurs. These platforms also put great effort into ensuring users' anonymity in the name of safety. However, that anonymity can also enable uses of the platforms that make workers uncomfortable or are, in some cases, even illegal, putting workers' well-being at risk. Ravenelle describes how some people use shared rides to support their drug-dealing business, or hire TaskRabbit taskers (workers on a platform for outsourcing chores) to buy items that they themselves find embarrassing to purchase. It seems to be the norm that workers acquiesce to handling such workplace troubles on their own, without help or support from the company.

What is more, for gender minorities, the platforms' partiality toward customers makes sexual harassment harder to track and harder to handle properly. An unsettling phenomenon is that because gig-economy workers regard encounters involving sexual harassment as "temporary," they tend not to report them or take them seriously. Some may even be blinded by the "trust" ideal of peer-to-peer hiring platforms and fail to recognize the harassing nature of such conduct (Hustle and Gig). Given the worker-customer bias of these companies mentioned above, workers may remain silent because they hold little hope that the company will be supportive. And since they still rely on the platform for income, they may simply keep the unfair experience to themselves rather than spend effort that is likely to be ineffective. It is upsetting to see how the power asymmetry in these platforms leaves workers vulnerable, even depriving them of the agency to advocate for themselves.

On a more individual level, customers' biases or even discrimination toward certain groups can be magnified through these online platforms and negatively affect gig-economy workers. Some may claim that online platforms can "act as neutral intermediaries that preclude human biases," but the reality is often more complicated. The most prevalent model for gig platforms lets users choose whom to hire. On such platforms, biases and inequalities can surface directly, since workers receive replies and feedback from potential customers, in forms such as lower prices, fewer responses, and lower ratings. For example, research has shown that on Fiverr (a platform for virtual gig work such as web design), presenting as Asian or Black is correlated with lower ratings, and Asian men receive far fewer reviews on TaskRabbit than white men.

Notably, the anonymity of the online world makes minority groups more subject to attacks and biases from strangers. In real life, or on social media where one's conduct may be judged by one's acquaintances, a moral consensus prevents people from, at least openly, engaging in discriminatory conduct. On a gig-economy platform, however, the consequences of such conduct are at best limited, and often non-existent. In fact, workers' tendency to opt out of publicly displaying their racial and gender identities implies that these factors influence their chances of getting the job. On TaskRabbit, since customers will ultimately learn a tasker's gender and race upon meeting in person, there is no anonymity in these fields; it is therefore no surprise that almost all workers upload a clear headshot. On Fiverr, by contrast, many workers take advantage of the anonymity the platform offers: 29% do not upload a picture that depicts a person, and 12% do not upload a picture at all. This contrast demonstrates that it is already an acknowledged phenomenon that gender and race can be the reason for unequal work opportunities.

Given that the online economy is an undeniable trend and sometimes the last resort for people seeking work, it is important to consider how gig-economy platforms can reinforce biases in society, and to rethink how we can bring about true equality in this age of new forms of labor. I contend that there are several ways to improve the situation, from the perspectives of platform operators as well as the government.

An important aspect that platforms should consider incorporating into their upgrades and reforms is regulating requests, or improving request filters, so that workers do not end up in situations where they feel uncomfortable. Many gig-economy workers have had the experience of carrying out tasks that made them uncomfortable. Most of these platforms have a request-accepting system that pressures workers to accept requests as quickly as possible so that the opportunity does not go to someone else. A more intelligent and efficient system to help workers sort requests is therefore especially important. Exercising the right to turn down a request should carry no consequences for workers' future work: they should not be blocked from future opportunities or given bad records because they decline tasks they would not be comfortable performing.

For platforms that automatically dispatch requests, workers should also be given the right to reject some of these assigned tasks. Admittedly, the convenience that lies at the heart of these platforms would be greatly compromised if workers could freely reject dispatched requests. A reasonable starting point, therefore, could be a rejection limit per period, paired with a requirement that workers explain each rejection. The limit would serve as a middle ground balancing the interests of both sides. This information could also be used to train and improve the company's dispatch system, reducing both the likelihood of improper requests being posted and the frequency with which workers encounter disturbing requests. For example, past rejected requests could be grouped into categories by an algorithm, and workers who have rejected multiple requests in one category could be excluded from receiving that type of request in the future, as sketched below. The same system could also help flag users who may be misusing the platform: algorithms can analyze patterns in customers' usage and serve as evidence if further investigation is necessary.
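As a rough illustration of this idea, the minimal Python sketch below groups rejected requests into categories and filters dispatch accordingly. All names, thresholds, and the keyword-based categorizer are hypothetical assumptions made for the sake of the example, not any platform's actual implementation; a real system would more plausibly use a trained text classifier and persistent storage.

```python
from collections import Counter, defaultdict

# Hypothetical rejection-aware dispatcher; names and thresholds are
# illustrative assumptions, not any real platform's implementation.
REJECTION_THRESHOLD = 3  # rejections in one category before opting a worker out

def toy_categorizer(request_text):
    # Stand-in for a real request classifier (e.g. a text model).
    text = request_text.lower()
    if "purchase" in text or "pick up" in text:
        return "errand"
    return "general"

class RejectionFilter:
    def __init__(self, categorize=toy_categorizer, threshold=REJECTION_THRESHOLD):
        self.categorize = categorize
        self.threshold = threshold
        # worker_id -> Counter of rejection counts per request category
        self.rejections = defaultdict(Counter)

    def record_rejection(self, worker_id, request_text, reason):
        """Log a turned-down request along with the worker's stated reason."""
        category = self.categorize(request_text)
        self.rejections[worker_id][category] += 1
        # The (category, reason) pairs could later be reviewed to spot
        # request types that many workers find improper.
        return category

    def eligible_workers(self, request_text, worker_ids):
        """Dispatch only to workers who have not repeatedly rejected this category."""
        category = self.categorize(request_text)
        return [w for w in worker_ids
                if self.rejections[w][category] < self.threshold]

# Example: after three rejected "errand" requests, worker "w1" is no
# longer dispatched requests in that category.
f = RejectionFilter()
for _ in range(3):
    f.record_rejection("w1", "please purchase an item for me", "uncomfortable")
print(f.eligible_workers("purchase and deliver a package", ["w1", "w2"]))  # ['w2']
```

Because each rejection is logged with a reason, the same records can double as an audit trail for spotting customers whose requests are repeatedly refused across many workers.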

Platform operators should also consider reinforcing the two-way rating system, which reminds customers that their behavior is on record and subject to evaluation by others. The anonymity of such ratings should also be guaranteed, so that workers feel comfortable reporting unpleasant work experiences. Although this measure will not address bias at its root, it can reduce the frequency of such public displays.
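As a minimal sketch of what such a system could look like, assuming a simple in-memory store (all class and field names here are hypothetical), a two-way rating service might retain rater identities internally for audits while exposing only anonymous aggregates:

```python
from dataclasses import dataclass
from statistics import mean

# Minimal sketch of an anonymous two-way rating store; a production
# system would add persistence, authentication, and abuse review.

@dataclass
class Rating:
    rater_id: str   # retained internally for audits, never shown publicly
    ratee_id: str
    score: int      # e.g. 1 (poor) to 5 (excellent)
    comment: str = ""

class TwoWayRatings:
    def __init__(self):
        self._ratings = []

    def submit(self, rater_id, ratee_id, score, comment=""):
        # Workers and customers use the same entry point, so both sides
        # know their conduct is on record.
        self._ratings.append(Rating(rater_id, ratee_id, score, comment))

    def public_profile(self, ratee_id):
        # Only aggregates are exposed; individual raters stay anonymous,
        # lowering the cost of reporting a bad experience.
        scores = [r.score for r in self._ratings if r.ratee_id == ratee_id]
        return {"count": len(scores),
                "average": round(mean(scores), 2) if scores else None}

# Example: workers rate a customer without the ratings being attributable.
ratings = TwoWayRatings()
ratings.submit("worker-17", "customer-42", 2, "made me feel unsafe")
ratings.submit("worker-08", "customer-42", 1)
print(ratings.public_profile("customer-42"))  # {'count': 2, 'average': 1.5}
```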

On the government's side, there should be more explicit requirements ensuring that companies work actively to protect not only customers' rights but also workers' rights. More legislation should regulate the relationship between independent contractors and companies; this could serve as a middle ground so that even if the employment-status dispute is not resolved soon, workers are still granted more rights and protections than they currently have. The government should also update anti-discrimination law to cover both the traditional and online marketplaces. Currently, women's requested hourly wage is almost 40% lower than men's, even after controlling for feedback score, experience, occupational category, hours of work, and educational attainment. Stronger protection of equal opportunities and equal pay by the authorities would help these women feel more confident in recognizing the value of their work and their rights, and would also help address the sexual harassment issue mentioned earlier. It is therefore important to take immediate action to help workers see their worth and become more conscious of their own rights.

In conclusion, the online environment and modern artificial intelligence are reshaping the world of hiring and employment. In this new environment, workers are vulnerable to new forms of bias, attack, and even discrimination, coming both from individual users and from the platforms themselves. This article has put forward some potential ways of better supporting workers and potential workers in this environment. Since the new forms of hiring discussed here are general trends in our society, it is important to study the various sources of workers' vulnerability and to rethink how we can bring about true equality in the age of the gig economy and machine learning.
