CRNCH Summit 2022 - Shimeng Yu - In-Memory Computing Benchmarking and Future Trends
From Jeffrey Young
Abstract: In this presentation, we will survey recent progress in in-memory computing for AI hardware acceleration. We will present a benchmark tool to holistically evaluate different semiconductor memory technologies for their applications in deep neural network inference. We will first discuss compute-in-embedded-memory for MB-scale models and then compute-in-storage-memory for GB- to TB-scale models. We will highlight the ferroelectric field-effect transistor (FeFET) as one of the promising candidates for future research.
Bio: Shimeng Yu is an associate professor of electrical and computer engineering at the Georgia Institute of Technology. He received the B.S. degree in microelectronics from Peking University in 2009, and the M.S. degree and Ph.D. degree in electrical engineering from Stanford University in 2011 and 2013, respectively. From 2013 to 2018, he was an assistant professor at Arizona State University.
Prof. Yu’s research expertise is in emerging non-volatile memories for applications such as deep learning accelerators, in-memory computing, 3D integration, and hardware security.
Prof. Yu’s honors include the NSF Faculty Early CAREER Award in 2016, the IEEE Electron Devices Society (EDS) Early Career Award in 2017, the ACM Special Interest Group on Design Automation (SIGDA) Outstanding New Faculty Award in 2018, the Semiconductor Research Corporation (SRC) Young Faculty Award in 2019, and appointments as IEEE Circuits and Systems Society (CASS) Distinguished Lecturer for 2021-2022 and IEEE Electron Devices Society (EDS) Distinguished Lecturer for 2022-2023. He is a senior member of the IEEE.