This dissertation explores applications of Monte Carlo and bootstrap methods in stochastic optimization, focusing on enhancing computational efficiency and accuracy in solution evaluation and uncertainty quantification. To compute the expected value of a stochastic
optimization problem via simulation, we propose a method to efficiently construct importance
sampling distributions using surrogate modeling. This method greatly reduces the need for repeated evaluations of the objective function, which are computationally expensive because each evaluation requires running an optimization algorithm. Our method outperforms traditional Monte Carlo
estimation and achieves significant speed-ups with good parallel efficiency. Additionally, we explore
Monte Carlo sampling algorithms that use known distributions to construct confidence intervals
around the optimality gap in two-stage and multi-stage stochastic programs. When the distribution of the uncertainty is unknown, we discuss bootstrap and bagging algorithms that rely solely on sampled data to provide both a consistent sample-average solution and accurate confidence interval estimates. We enhance these methods by integrating distribution estimation into the resampling step, which improves the precision of the resulting estimates. We also
offer open-source software implementations of these algorithms. Extensive empirical studies demonstrate the effectiveness of the smoothed bootstrap and bagging methods, particularly for constructing confidence intervals from small data sets.
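
For illustration only, the sketch below shows the kind of computation the bootstrap methods address: a percentile-bootstrap confidence interval for the optimality gap of a sample-average solution. The newsvendor problem, cost parameters, demand distribution, and sample sizes are assumptions chosen for the example, not the specific algorithms developed in this dissertation.

```python
import numpy as np

# Hypothetical newsvendor problem: order x units at unit cost C, sell at price P.
C, P = 1.0, 3.0

def cost(x, demand):
    # Negative profit (smaller is better) for order quantity x and random demand.
    return C * x - P * np.minimum(x, demand)

def saa_solution(demand):
    # Sample-average (SAA) newsvendor solution: the (1 - C/P) demand quantile.
    return np.quantile(demand, 1.0 - C / P)

def gap_estimate(x_hat, demand):
    # Sample-average optimality gap: objective at the candidate x_hat minus the
    # best sample-average objective on the same scenario sample.
    return cost(x_hat, demand).mean() - cost(saa_solution(demand), demand).mean()

rng = np.random.default_rng(0)
data = rng.exponential(scale=10.0, size=50)   # observed demand scenarios
x_hat = saa_solution(data)                    # candidate (sample-average) solution

# Percentile bootstrap: resample scenarios with replacement and recompute the gap.
gaps = np.array([
    gap_estimate(x_hat, rng.choice(data, size=data.size, replace=True))
    for _ in range(2000)
])
low, high = np.percentile(gaps, [2.5, 97.5])
print(f"95% bootstrap CI for the optimality gap: [{low:.4f}, {high:.4f}]")
```

A smoothed variant of this idea resamples from an estimated distribution (for example, a kernel density estimate fitted to the observed scenarios) rather than from the empirical distribution itself, which is the sense in which distribution estimation is integrated into the resampling step.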