Workers in climate-exposed industries such as agriculture, construction, and manufacturing face increased health risks when working on high-heat days and may choose to reduce work on those days to mitigate this risk. Using the American Time Use Survey (ATUS) for the period 2003 through 2018 and historical weather data, we model the relationship between daily temperature and time allocation, focusing on hours worked by high-risk laborers. The results indicate that labor allocation decisions are context-specific and likely driven by supply-side factors. We do not find a significant relationship between temperature and hours worked during the Great Recession (2008-2014), perhaps due to high competition for employment; during periods of economic growth (2003-2007, 2015-2018), however, we find a significant reduction in hours worked on high-heat days. During these growth periods, for every degree above 90 degrees Fahrenheit on a given day, the average high-risk worker reduces time devoted to work by about 2.6 minutes relative to a 90-degree day. This effect is expected to intensify as temperatures rise. Applying the modeled relationships to climate projections through the end of the century, we find that annual lost wages resulting from decreased time spent working on days over 90 degrees across the United States reach $36.7 billion to $80.0 billion in 2090 under intermediate and high emissions futures, respectively.
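
As a rough illustration of how the per-degree estimate scales, and not the paper's projection methodology, the sketch below applies the 2.6-minutes-per-degree figure to a single hypothetical worker-year; the hourly wage and the counts of hot days are assumed placeholder values, not figures from the study.

```python
# Illustrative back-of-envelope calculation (not the paper's projection model):
# translates the estimated ~2.6 minutes of reduced work per degree above 90 F
# into lost work time and wages for one hypothetical worker-year.

MINUTES_LOST_PER_DEGREE = 2.6   # estimate from the abstract (economic-growth periods)
HOURLY_WAGE = 20.0              # assumed illustrative wage, USD/hour
# assumed illustrative counts of days at each daily maximum temperature (F)
HOT_DAYS = {92: 10, 95: 6, 100: 2}

lost_minutes = sum(
    days * MINUTES_LOST_PER_DEGREE * (temp - 90)
    for temp, days in HOT_DAYS.items()
)
lost_wages = (lost_minutes / 60.0) * HOURLY_WAGE

print(f"Lost work time: {lost_minutes:.0f} minutes/year")
print(f"Lost wages:     ${lost_wages:.2f}/year")
```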