In February 2020, Facebook announced the ten winners of its annual Systems for Machine Learning (ML) Research Awards. Facebook presents these awards not only to deepen collaboration with academia, but also to surface impactful solutions in the areas of developer toolkits, compilers and code generation, system architecture, memory technologies, and ML accelerator support. Professor Minsoo Rhu from the School of Electrical Engineering at KAIST proudly stood among the award recipients. The KAIST Herald interviewed Professor Rhu, whose award-winning research is titled “A Near-Memory Processing Architecture for Training Recommendation Systems”.
Can you summarize your work?
Recommendation systems are actually something that we are [all] very familiar with. Netflix, YouTube, and Facebook recommend videos or advertisements using AI technology. When a recommendation system runs on a computer, both the processor and the memory system are used, but the workload leans far more heavily on memory because so much data has to be brought in from it. Simply put, this research aims to move most of the calculations that would normally be done in the processor into the memory system itself, to make data processing more efficient.
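To make the memory-bound nature of this workload concrete, here is a minimal sketch (not from the interview; all table sizes and names are illustrative assumptions) of the embedding lookup at the heart of a recommendation model. The lookup is a scattered gather over a table far too large to fit in on-chip caches, while the arithmetic that follows is tiny; near-memory processing proposes performing the gather-and-pool step inside the memory device instead of shipping all the rows to the processor.

```python
import numpy as np

# Hypothetical sizes, chosen only for illustration.
NUM_ITEMS = 1_000_000   # rows in the embedding table (one vector per item)
EMBED_DIM = 64          # length of each learned vector
BATCH = 256             # lookups needed for one recommendation request

# The table is too large to cache, so each lookup is effectively a
# random access to main memory.
table = np.random.rand(NUM_ITEMS, EMBED_DIM).astype(np.float32)

def recommend_features(item_ids):
    """Gather embeddings (memory-bound) and pool them (light compute)."""
    gathered = table[item_ids]     # scattered reads dominate the cost
    return gathered.mean(axis=0)   # only a small amount of arithmetic

ids = np.random.randint(0, NUM_ITEMS, size=BATCH)
features = recommend_features(ids)
print(features.shape)  # (64,)
```

The imbalance is the key point: for each request, megabytes of embedding rows cross the memory bus just to produce a single 64-element vector, which is why moving the reduction into the memory system can pay off.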
What does your research contribute as a solution to the topics prioritized by Facebook?
On Facebook, there are a lot of advertisements and videos that are personalized to the user. This is called personalized recommendation, and since companies like Facebook earn approximately 90% of their profits through such advertising, it is highly important to media-based corporations. Traditional systems always hit a limit in their computing capacity, so what we have tried to do is develop a new method that gives corporations a more efficient way of generating revenue and gives users a quicker, more responsive experience.
How did your research on “Memory-Centric Deep Learning System Architecture” under Samsung Science and Technology Foundation help in receiving this award?
The idea of processing data near the memory has been around since the 1970s. However, the reason consumer devices do not yet adopt this approach is that the only companies that can properly manufacture such memory-centric devices are Samsung and SK Hynix. Memory suppliers like Samsung and SK Hynix have usually been confined by the processor producers in the United States to making only memory components, because those processor producers would not want to compete with Korean companies whose memory-centric devices could also perform important computation. Samsung has always been interested in expanding its market to include such multifunctional memory devices, so with its help and some investment, I could carry out more research on memory-centric processing. This research fed into applying the technology to recommendation systems, and Facebook seems to have taken an interest in our work.
What is your advice for undergraduate or graduate students who are interested in research?
There are a lot of courses in electrical engineering that touch on many different specializations. Because the undergraduate curriculum offers so many courses, it is natural for students to have difficulty finding what they enjoy doing and what they want to do in the future. That is why it is important to experience at least two or three labs. If possible, undergraduate students should do individual studies in different laboratories and experience different fields [before they graduate].
Apart from Professor Rhu, nine other winners were recognized in various categories ranging from compiling neural networks to using efficient deep learning for security purposes. These ten recipients will each receive 50,000 USD in research funds and will be invited to the next AI Systems Faculty Summit in Fall 2020. Congratulations, Professor Rhu!