Nutrient loss reduction strategies have recently been developed in the U.S. Midwest to decrease the environmental footprint associated with nitrogen (N) fertilizer use. Although these strategies generally recommend lowering N rates and shifting the timing of N application from fall to spring, the spatiotemporal impacts of these practices on maize yield and fertilizer N use efficiency (NUE, kg grain yield increase per kg N applied) have not been assessed at the watershed scale using crop simulation models. We simulated the effects of N fertilizer rate (0, 168, 190, and 224 kg N ha⁻¹) and application timing [fall-applied N (FN): 100% of N applied on 1 December; spring-applied N (SN): 100% of N applied 10 days before planting; split N: 66% of N applied on 1 December + 34% applied 10 days before planting] on maize grain yield (GY) at 3042 points across Illinois during 2011–2015 using the DSSAT-CERES-Maize model. When simulations were aggregated to the watershed level, average maize GY was higher for SN than for FN in years with above-average winter rainfall (2011, 2013), whereas yields were similar (±4%) in 2012, 2014, and 2015. Accordingly, differences in NUE between SN and FN were small (0.0–1.4 kg GY kg⁻¹ N) when cumulative winter rainfall was < 300 mm, but widened to 0.1–9.2 kg GY kg⁻¹ N when winter rainfall exceeded 500 mm, at both 168 and 224 kg N ha⁻¹. Combining a rate reduction from 224 to 190 kg N ha⁻¹ with a shift from FN to SN produced a wide range of yield responses during 2011–2015: the proportion of simulation points within a watershed showing a yield increase varied from <10% to >70%. Positive impacts on both GY and NUE occurred in only 60% of simulations under this combined scenario, highlighting the difficulty of simultaneously improving yield and NUE with a 15% N rate reduction in this region.
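To make the NUE metric explicit: a minimal formulation, assuming the yield increase is measured against the unfertilized (0 kg N ha⁻¹) control at the same simulation point and year, is

\[
\mathrm{NUE} = \frac{GY_N - GY_0}{N},
\]

where \(GY_N\) is the grain yield (kg ha⁻¹) at applied rate \(N\) (kg N ha⁻¹) and \(GY_0\) is the yield of the 0 kg N ha⁻¹ control, giving NUE in kg grain per kg N. Under this formulation, the 15% rate reduction from 224 to 190 kg N ha⁻¹ raises NUE whenever the yield gain over the control shrinks by less than about 15%, which is why yield and NUE do not always improve together.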
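The sketch below (not the authors' pipeline) illustrates how per-point simulation outputs could be reduced to the watershed-level NUE comparison described above; the table layout and column names (point_id, watershed, year, timing, n_rate, grain_yield) are hypothetical placeholders.

```python
import pandas as pd

def watershed_nue(df: pd.DataFrame) -> pd.DataFrame:
    """Return watershed-mean NUE by year, timing, and N rate.

    Expects one row per simulation: point_id, watershed, year,
    timing ('FN', 'SN', 'split'), n_rate (kg N/ha), grain_yield (kg/ha).
    """
    # Unfertilized control yield per point and year (timing is moot at 0 N).
    gy0 = (df.loc[df["n_rate"] == 0]
             .groupby(["point_id", "year"], as_index=False)["grain_yield"]
             .first()
             .rename(columns={"grain_yield": "gy0"}))
    fert = df.loc[df["n_rate"] > 0].merge(gy0, on=["point_id", "year"])
    # NUE: kg grain yield increase per kg N applied.
    fert["nue"] = (fert["grain_yield"] - fert["gy0"]) / fert["n_rate"]
    # Aggregate point-level NUE to watershed means.
    return (fert.groupby(["watershed", "year", "timing", "n_rate"],
                         as_index=False)["nue"].mean())

# Example use: watershed-mean NUE gain from shifting fall to spring application.
# nue = watershed_nue(sim_results)  # sim_results: hypothetical DataFrame
# wide = nue.pivot_table(index=["watershed", "year", "n_rate"],
#                        columns="timing", values="nue")
# sn_minus_fn = wide["SN"] - wide["FN"]
```

Computing NUE against the same-point, same-year zero-N control before averaging keeps the comparison paired, so watershed means reflect the timing and rate effects rather than spatial yield differences among points.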