Sampling is All You Might Need

Abstract

We propose two gradient-free optimization methods for solving Ordinary Least Squares linear regression problems, based on Monte Carlo and Bagging Monte Carlo techniques. These methods leverage multiple processor cores to iteratively generate sample betas within a restricted search space and record the resulting mean squared error. While exploring the parameter space, our results do not conclusively align with the double descent narrative commonly discussed in prior literature, which primarily links that phenomenon to gradient descent methods and their implicit norm-minimization bias. Our experimental comparisons show that these gradient-free approaches yield performance metrics, including mean squared error, R², and L2 norm, that are on par with those achieved by traditional Ordinary Least Squares and Gradient Descent methods.
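To make the core idea concrete, here is a minimal sketch of the plain Monte Carlo variant described above: candidate coefficient vectors are drawn from a restricted search space and the one with the lowest mean squared error is kept. The function name, the uniform sampling bound, and the sample count are illustrative assumptions, not the thesis's exact implementation, and the sketch omits the bagging and multi-core parallelism the abstract mentions.

```python
import numpy as np

def monte_carlo_ols(X, y, n_samples=100_000, beta_bound=10.0, seed=0):
    """Gradient-free search: sample candidate betas uniformly from
    [-beta_bound, beta_bound]^d and keep the one with the lowest MSE.
    (Hypothetical sketch of the Monte Carlo approach, not the thesis code.)"""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    best_beta, best_mse = None, np.inf
    for _ in range(n_samples):
        # Draw a candidate coefficient vector from the restricted search space.
        beta = rng.uniform(-beta_bound, beta_bound, size=n_features)
        mse = np.mean((y - X @ beta) ** 2)
        if mse < best_mse:
            best_beta, best_mse = beta, mse
    return best_beta, best_mse

# Example usage on synthetic data
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.standard_normal(200)
beta_hat, mse = monte_carlo_ols(X, y)
```

A Bagging Monte Carlo variant would, presumably, repeat this search on bootstrap resamples of (X, y) and aggregate the resulting betas, and the independent samples make the inner loop straightforward to distribute across cores.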
