The absence of microbial exposure early in life leaves individuals vulnerable to immune overreaction later in life, manifesting as immunopathology, autoimmunity, or allergies. A key factor is thought to be a "critical window": a period during which the host's immune system can "learn" tolerance, and beyond which such learning is no longer possible. Animal models indicate that many mechanisms have evolved to enable critical windows, and that their time limits are distinct and consistent. Such a variety of mechanisms, and such precision in their manifestation, suggest the outcome of strong evolutionary selection. To strengthen our understanding of critical windows, we explore their underlying evolutionary ecology using models that encompass demographic and epidemiological transitions, identifying the length of the critical window that maximizes fitness in different environments. We characterize how direct effects of microbes on host mortality, as well as indirect effects via microbial ecology, drive the optimal length of the critical window. We find that indirect effects such as the magnitude of transmission, duration of infection, rates of reinfection, vertical transmission, host demography, and seasonality in transmission all redistribute the timing and/or likelihood of encounters with microbial taxa across host age, and thus lengthen or shorten the optimal critical window. Declining microbial abundance and diversity are predicted to result in increased immune dysfunction later in life. We also make predictions for the length of the critical window across different taxa and environments. Overall, our modeling efforts demonstrate how critical windows are shaped over evolutionary time by both host-microbiome/pathogen interactions and dispersal, raising central questions about potential mismatches between these evolved systems and the current loss of microbial diversity and/or rise in infectious disease.