
Tech Discrimination: The New Way We Work? – The Solidarity Center


Rideshare drivers face many job-based hazards, and for women, the dangers are compounded by sexual harassment and other forms of gender-based violence. Women app-based workers are also disproportionately targeted by what law scholar Veena Dubal has termed “algorithmic discrimination.”

“The structure of the wage-setting process, the structure of the algorithms, tends to recreate traditional forms of discrimination, again replicating the gender wage gap,” Dubal tells Solidarity Center Executive Director and podcast host Shawna Bader-Blau on the latest episode of “My Boss Is a Robot.”

Algorithmic bosses turn the traditional employment model on its head, and not in a good way.

“Uber’s own research shows that people who work longer hours actually earn less per hour,” says Dubal, a law professor at the University of California, San Francisco College of Law.

“All of these sort of basic ideas about work are being disrupted invisibly by algorithmic wage-setting processes that could very easily spread to other sectors of the economy, disrupting traditional ideas of how wages should be and are set, and really disconnecting work from security in a way that’s quite dystopian.”

“Tech Discrimination: The New Way We Work?” explores what this new model means for gig workers, and how it could shape a new world of work where how much we are paid, how many hours we work and what our job will be day to day are completely out of our control.

“My Boss Is a Robot” is a six-part series that seeks to shine a light on the behind-the-scenes practices of app companies that exploit workers in the global gig economy. Download the latest episode, “Tech Discrimination: The New Way We Work?” and watch for the next episode on October 25.


Listen to this episode and all Solidarity Center episodes here or at Spotify, Amazon, Stitcher or wherever you subscribe to your favorite podcasts.



