BLOG: Exploring Stochastic Gradient Descent with Restarts (SGDR)

Published on Medium: This blog post documents part of my journey into deep learning, focusing on Stochastic Gradient Descent with Restarts (SGDR). The post covers the basics of deep learning, including loss functions and the idea of gradient descent as a way to minimize them. It then introduces SGDR, a method that periodically resets the learning rate so that training can escape a poor local minimum rather than getting stuck in it. The post suggests that this method, especially when combined with progressively lengthening cycles, can improve the performance of deep learning models, and it emphasizes that many cutting-edge ideas in deep learning are simple modifications of base methodologies that are relatively easy to understand and implement.
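
As a rough illustration of the idea (not code from the blog post), here is a minimal sketch of SGDR-style warm restarts using PyTorch's CosineAnnealingWarmRestarts scheduler; the model, learning rate, and cycle lengths are placeholder choices for demonstration only:

```python
import torch
import torch.nn as nn

# Placeholder model and optimizer, not taken from the blog post.
model = nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Cosine annealing with warm restarts: the learning rate decays from 0.1
# toward eta_min, then "restarts" back up to 0.1. T_0 is the length of the
# first cycle (in epochs); T_mult=2 doubles each subsequent cycle, giving
# the progressively lengthening cycles described in the post.
scheduler = torch.optim.lr_scheduler.CosineAnnealingWarmRestarts(
    optimizer, T_0=10, T_mult=2, eta_min=1e-4
)

for epoch in range(70):  # 10 + 20 + 40 epochs = three full cycles
    # ... forward pass, loss.backward(), and optimizer.step() would go here ...
    optimizer.step()   # placeholder step so the scheduler has an optimizer update to follow
    scheduler.step()   # advances the cosine schedule and triggers restarts at cycle boundaries
```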

You can find the blog in its entirety here: https://markkhoffmann.medium.com/exploring-stochastic-gradient-descent-with-restarts-sgdr-fa206c38a74e