This thesis uses the local sensitivity of M-estimators to address several
open problems in Bayesian and frequentist statistics. First, by exploiting a
duality from the Bayesian robustness literature between sensitivity and
covariances, I provide significantly improved covariance estimates for mean
field variational Bayes (MFVB) procedures at little extra computational cost.
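To sketch this duality in symbols (the tilted family $p_t$, the perturbing
function $f$, and the posterior functional $g$ below are illustrative notation,
not the thesis's own): perturbing a posterior $p$ to
$p_t(\theta) \propto p(\theta)\exp\{t f(\theta)\}$ gives
\[
\left.\frac{d}{dt}\,\mathbb{E}_{p_t}\left[g(\theta)\right]\right|_{t=0}
  = \mathrm{Cov}_{p}\bigl(g(\theta),\, f(\theta)\bigr),
\]
so the local sensitivity of a posterior expectation to such a perturbation is
exactly a posterior covariance.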
Prior to this work, applications of MFVB had arguably been limited to
prediction problems rather than inference problems for lack of reliable
uncertainty measures. Second, I provide practical finite-sample accuracy bounds
for the ``infinitesimal jackknife'' (IJ), a classical measure of local
sensitivity to an empirical process. In doing so, I bridge a gap between
classical IJ theory and recent machine learning practice, showing that stringent
classical conditions for the consistency of the IJ can be relaxed for restricted
but useful classes of weight vectors, such as those of leave-K-out
cross-validation. Finally, I provide techniques to quantify the sensitivity of the
inferred number of clusters in Bayesian nonparametric (BNP) unsupervised
clustering problems to the form of the Dirichlet process prior. By treating
local sensitivity as an approximation to global sensitivity rather than as a
measure of robustness per se, I provide tools with a considerably improved
ability to extrapolate to different priors. Because each of these
diverse applications is based on the same formal technique---the Taylor series
expansion of an M-estimator---this work addresses the computational
difficulties associated with each in a unified way, and I provide open-source tools
in Python and R to assist in their computation.
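As a sketch of that shared technique (the notation is illustrative, assuming a
weighted M-estimator written as an optimum over data weights $w$): for
$\hat\theta(w) = \arg\min_\theta \sum_{n=1}^N w_n\, g(\theta, x_n)$, the
implicit function theorem gives the first-order Taylor expansion around the
unit weight vector $1_N$,
\[
\hat\theta(w) \approx \hat\theta(1_N)
  - \sum_{n=1}^N (w_n - 1)\, H^{-1} \nabla_\theta g\bigl(\hat\theta(1_N), x_n\bigr),
\qquad
H = \sum_{n=1}^N \nabla_\theta^2 g\bigl(\hat\theta(1_N), x_n\bigr),
\]
which is the infinitesimal jackknife: a single Hessian solve yields the
sensitivity of the estimate to any reweighting of the data.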