Make Stochastic Gradient Descent Fast Again (Ep. 113) - a podcast by Francesco Gadaleta

Published: 2020-07-22T10:53:18


There is definitely room for improvement in the family of stochastic gradient descent algorithms. In this episode I explain a relatively simple method that has been shown to improve on the Adam optimizer. But watch out: this approach does not generalize well.
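For context on what is being compared, here is a minimal sketch of the two standard baselines mentioned above: a plain SGD step and an Adam step. This is not the method discussed in the episode, just the textbook update rules, written with NumPy only.

```python
import numpy as np

def sgd_step(theta, grad, lr=0.01):
    """Vanilla stochastic gradient descent: move against the gradient."""
    return theta - lr * grad

def adam_step(theta, grad, state, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update; `state` holds running first/second moments and step count."""
    state["t"] += 1
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad       # first moment estimate
    state["v"] = beta2 * state["v"] + (1 - beta2) * grad ** 2  # second moment estimate
    m_hat = state["m"] / (1 - beta1 ** state["t"])             # bias correction
    v_hat = state["v"] / (1 - beta2 ** state["t"])
    return theta - lr * m_hat / (np.sqrt(v_hat) + eps)

# Toy usage: minimize f(theta) = ||theta||^2, whose gradient is 2 * theta.
theta = np.array([1.0, -2.0])
state = {"m": np.zeros_like(theta), "v": np.zeros_like(theta), "t": 0}
for _ in range(200):
    grad = 2 * theta
    theta = adam_step(theta, grad, state)
print(theta)  # approaches the minimum at the origin
```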


Join our Discord channel and chat with us.


 


