Honors Oral Exam

Improving Optimization in Stochastic Gradient Descent using Previous Weights

Yukun Yang (University of Rochester)

Friday, May 5th, 2023
10:00 AM - 10:50 AM
Hylan 105

Transfer learning is a technique used to enhance the performance of machine learning models by utilizing knowledge obtained from related tasks. This study examines the impact of transfer learning on the performance of stochastic gradient descent (SGD) optimization in the context of synthetic data and diverse pairs of mathematical functions, representing varying degrees of relatedness. Our findings indicate that the effectiveness of transfer learning with SGD is highly dependent on the level of similarity between the functions. We provide insights into the conditions under which transfer learning is more or less advantageous, as well as an empirical analysis of function pairs to illustrate the performance of SGD in different transfer learning scenarios. By deepening our understanding of the interaction between transfer learning and stochastic gradient descent, we aim to facilitate the development of more efficient and effective machine learning models and applications.
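To make the setup concrete, the following is a minimal sketch (not the talk's actual experiment) of warm-starting SGD with previous weights: fit a source function with SGD, reuse the learned weights to initialize SGD on a related (or unrelated) target function, and compare against a cold start. The function pairs, the polynomial model, and all hyperparameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def features(x):
    # Simple polynomial features for a linear-in-parameters model.
    return np.stack([np.ones_like(x), x, x**2, x**3], axis=1)

def make_data(f, n=512, noise=0.05):
    x = rng.uniform(-1.0, 1.0, size=n)
    y = f(x) + noise * rng.normal(size=n)
    return features(x), y

def sgd(X, y, w0, lr=0.05, epochs=30, batch=32):
    # Plain minibatch SGD on mean-squared error.
    w = w0.copy()
    n = len(y)
    for _ in range(epochs):
        idx = rng.permutation(n)
        for start in range(0, n, batch):
            b = idx[start:start + batch]
            grad = 2.0 * X[b].T @ (X[b] @ w - y[b]) / len(b)
            w -= lr * grad
    return w

def mse(X, y, w):
    return float(np.mean((X @ w - y) ** 2))

# Hypothetical function pairs of varying relatedness.
source_f = lambda x: np.sin(2 * x)
target_f = lambda x: np.sin(2 * x) + 0.3 * x   # closely related to the source
unrelated_f = lambda x: np.cos(5 * x)          # weakly related to the source

Xs, ys = make_data(source_f)
w_source = sgd(Xs, ys, w0=np.zeros(4))         # the "previous weights"

for name, f in [("related target", target_f), ("unrelated target", unrelated_f)]:
    Xt, yt = make_data(f)
    w_cold = sgd(Xt, yt, w0=rng.normal(scale=0.1, size=4), epochs=5)
    w_warm = sgd(Xt, yt, w0=w_source, epochs=5)  # transfer: reuse source weights
    print(f"{name}: cold-start MSE={mse(Xt, yt, w_cold):.4f}, "
          f"warm-start MSE={mse(Xt, yt, w_warm):.4f}")
```

Under a small step budget, the warm start typically helps when the target closely resembles the source and can hurt when it does not, which is the kind of similarity-dependent behavior the abstract describes.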

Event contact: jonathan dot pakianathan at rochester dot edu