
Several years ago, while researching a novel, I took a job in the warehouse of a major retail chain, spending six months on the floor to better understand the lives of low-wage workers. As someone who supported the Fight for $15 campaign, I anticipated that the biggest frustration among employees would be the low pay—starting wages were just $12.25 an hour. But what truly stood out to me wasn’t the hourly rate; it was the instability of the schedule. Like many retail operations in the U.S., the company relied on just-in-time scheduling, using real-time data on customer traffic to determine staffing levels on a daily basis.
This system left workers, including me, with no way to predict how many hours we’d be assigned each week. One week might bring just a four-hour shift; the next, 30 hours. Paychecks varied wildly, making it nearly impossible for employees to budget, save, or qualify for basic financial commitments like a car loan or a rental lease. Many lacked reliable transportation, walking to the store in harsh weather for shifts that often began as early as 4 a.m.
Even more frustrating, in many states workers must report their hours to qualify for public assistance programs like Medicaid or SNAP. When employers cut hours, through no fault of the worker, it can jeopardize access to these essential benefits. Human resources would often suggest that the way to secure more hours was to widen one’s availability, effectively asking employees to be on call at all times. This made it difficult, if not impossible, to take on a second job or manage other responsibilities.
Historically, one of the key victories for American labor in the 20th century was establishing limits on how much employers could demand of workers in terms of hours. But in recent decades, a different form of exploitation has emerged: not overwork, but underwork. Instead of employing full-time staff, companies increasingly rely on a pool of part-time workers, each scheduled for just enough hours to avoid triggering benefits or overtime. These workers face constant uncertainty—not knowing when they’ll work, how much they’ll earn, or whether their income will cover basic needs.
This shift has turned the original goals of labor reform on their head. Where early labor movements fought to prevent grueling 60- to 80-hour workweeks for poverty wages, many workers today struggle with the opposite problem: too few hours to earn a living. This erosion of stable, full-time employment has reshaped the working-class experience in the U.S.
Modern American ideas about work were shaped by the passage of a key New Deal law, the Fair Labor Standards Act (FLSA), in 1938. Before then, it wasn’t uncommon for laborers to work extremely long hours for barely enough to survive. Even the industrial jobs that are now looked back on nostalgically were often dangerous and poorly paid.
The FLSA was a turning point. It set a federal minimum wage, curtailed child labor, and required employers to pay overtime for most non-managerial roles beyond 40 hours per week. Alongside the rise of labor unions, the law helped redefine what it meant to work in America. A belief emerged that if you put in the effort, you deserved more than just getting by: you were entitled to a fair wage and a decent life. When pop culture references the “9-to-5” grind, it’s not just a catchphrase; it reflects a standard of work that was built by deliberate policy choices and hard-won labor victories.