Memory-Centric Architectures for Big Data Applications
- Lin, Jilan
- Advisor(s): Xie, Yuan; Ding, Yufei
Abstract
Big Data refers to the massive and rapidly growing volumes of data in our daily lives, from which valuable information can be extracted to make better decisions. However, big data workloads pose new challenges for traditional memory subsystems due to the memory wall and power wall. Moving large amounts of data from memory to the processor is expensive, causing severe performance bottlenecks and high energy consumption. Moreover, while the compute capability of modern processors grows quickly with Moore's Law, memory bandwidth and latency improve much more slowly due to physical I/O constraints.
To address the memory wall challenge for big data applications, this dissertation aims to answer two questions: How can we improve existing memory technology, and what should future memory look like? For the first question, we investigate near-data processing (NDP), which places custom compute logic near the data storage, exploiting the much higher internal bandwidth and reducing the energy consumed by data movement. For the second, we envision in-memory processing (IMP) for next-generation memory. In particular, we study Resistive Random Access Memory (RRAM), an analog device with tunable resistance. We use RRAM as both storage and computation hardware to process data in situ and analyze its mapping and reliability issues.
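The RRAM-based in-memory processing described above performs matrix-vector multiplication in the analog domain: weights are stored as cell conductances, inputs are applied as voltages, and bitline currents accumulate the dot products via Ohm's and Kirchhoff's laws. A minimal sketch of this idea, with a hypothetical `rram_crossbar_mvm` function and an optional noise term standing in for the device-variation reliability issues the dissertation analyzes (all names and parameters here are illustrative, not from the dissertation):

```python
import numpy as np

def rram_crossbar_mvm(weights, inputs, noise_std=0.0, rng=None):
    """Simulate one analog matrix-vector multiply on an RRAM crossbar.

    weights   -- conductance matrix G programmed into the crossbar cells
    inputs    -- voltage vector V applied on the wordlines
    noise_std -- std. dev. of Gaussian conductance noise, a crude stand-in
                 for RRAM device variation (a reliability concern)
    Returns the bitline currents I = G^T @ V, i.e. one dot product per column.
    """
    rng = rng or np.random.default_rng(0)
    g = np.asarray(weights, dtype=float)
    if noise_std > 0.0:
        g = g + rng.normal(0.0, noise_std, size=g.shape)
    return g.T @ np.asarray(inputs, dtype=float)

# Ideal (noise-free) crossbar: currents equal the exact dot products.
W = np.array([[0.1, 0.2],
              [0.3, 0.4]])
x = np.array([1.0, 2.0])
print(rram_crossbar_mvm(W, x))        # ideal result: [0.7, 1.0]
print(rram_crossbar_mvm(W, x, 0.05))  # perturbed by conductance noise
```

Because the multiply-accumulate happens inside the memory array itself, no weight data crosses the memory bus, which is the source of the bandwidth and energy savings the abstract refers to.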