Efficiently capturing high-dimensional optical signals, such as temporal dynamics, depth, perspective, or spectral content, is a difficult imaging problem. Because image sensors are inherently two-dimensional, directly sampling the many dimensions that completely describe a scene presents a significant engineering challenge. Computational imaging is a design approach in which imaging hardware and digital signal processing algorithms are designed jointly to achieve performance not possible with partitioned design schemes. Within this paradigm, the sensing hardware is viewed as an encoder, coding the information of interest into measurements that can be captured with conventional sensors; algorithms are then used to decode the information. In this dissertation, I explore the connection between optical imaging system design and compressed sensing, demonstrating that extra dimensions of optical signals (time, depth, and perspective) can be encoded into a single 2D measurement and then extracted using sparse recovery methods. The key to these capabilities is exploiting the inherent multiplexing properties of diffusers, pseudorandom free-form phase optics that scramble incident light. Contrary to their intended use, I show that certain classes of diffusers encode high-dimensional information about the incident light field into high-contrast, pseudorandom intensity patterns (caustics). Sparse recovery methods can then decode these patterns, recovering 3D images from snapshot 2D measurements. This transforms a diffuser into a computational imaging element for high-dimensional capture at video rates. Efficient physical models are introduced that reduce the computational burden of image recovery relative to explicit matrix approaches, though the cost remains substantial. Lastly, analysis and theory are developed that enable optimization of customized diffusers for miniaturized 3D fluorescence microscopy.
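To make the encode/decode framing concrete, the following is a minimal 2D sketch of the general idea, not the dissertation's calibrated system or reconstruction algorithm: it uses a randomly generated surrogate for a caustic point spread function, a shift-invariant FFT-based forward model in place of an explicit measurement matrix, circular convolution instead of the physical crop, and a plain ISTA solver with an assumed sparsity weight. All names and parameters are illustrative assumptions; the real system additionally sums over depth slices to recover 3D.

```python
import numpy as np
from numpy.fft import fft2, ifft2

rng = np.random.default_rng(0)
N = 64  # toy grid size (assumed)

# Surrogate "caustic" PSF: a pseudorandom, high-contrast intensity pattern standing
# in for a measured diffuser point spread function (illustrative, not measured data).
psf = rng.random((N, N)) ** 4
psf /= psf.sum()
H = fft2(psf)  # transfer function; circular convolution assumed for simplicity

def forward(x):
    """Shift-invariant forward model: convolve the scene with the caustic PSF via
    FFTs, avoiding an explicit (N^2 x N^2) measurement matrix."""
    return np.real(ifft2(H * fft2(x)))

def adjoint(y):
    """Adjoint of the forward model (correlation with the PSF)."""
    return np.real(ifft2(np.conj(H) * fft2(y)))

# Sparse ground-truth scene (a few point emitters) and a simulated noisy measurement.
x_true = np.zeros((N, N))
x_true.flat[rng.choice(N * N, size=12, replace=False)] = 1.0
y = forward(x_true) + 1e-4 * rng.standard_normal((N, N))

# ISTA for min_x 0.5*||forward(x) - y||^2 + lam*||x||_1 with nonnegativity.
lam = 1e-4                                # assumed sparsity weight
step = 1.0 / np.max(np.abs(H)) ** 2       # 1 / Lipschitz constant of the gradient
x = np.zeros_like(y)
for _ in range(500):
    x = x - step * adjoint(forward(x) - y)                     # gradient step
    x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)   # soft threshold (L1 prox)
    x = np.maximum(x, 0.0)                                      # intensities are nonnegative

recovered = np.count_nonzero(x[x_true > 0] > 1e-2)
print(f"recovered {recovered} of 12 point sources from one coded 2D measurement")
```

The sketch illustrates both themes of the abstract: the forward and adjoint operators are applied as FFT-based convolutions rather than stored matrices, which is what makes large-scale recovery tractable, and an l1-regularized iterative solver exploits scene sparsity to decode the multiplexed caustic measurement.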