When it comes to defining learning rate schedules in PyTorch, you have plenty of options. 15 different scheduler classes, to be exact. That flexibility is welcome, but it also means that a small change to your learning rate schedule can require switching to a different scheduler class, or chaining several schedulers together. It struck me one day that defining how a learning rate should change over the course of a training run is similar to defining an animation, with a combination of:
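To make the "combining multiple schedulers" point concrete, here is a minimal sketch using PyTorch's `SequentialLR`, which stitches schedulers together at milestone steps. The specific choice of a linear warmup followed by cosine decay, and all the hyperparameters, are illustrative assumptions, not anything prescribed by PyTorch:

```python
import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import LinearLR, CosineAnnealingLR, SequentialLR

model = torch.nn.Linear(4, 2)
optimizer = SGD(model.parameters(), lr=0.1)

# Warm up linearly for 5 steps, then cosine-decay over the remaining 95.
# Tweaking this schedule (say, holding the lr flat between the two phases)
# means adding another scheduler object and another milestone.
scheduler = SequentialLR(
    optimizer,
    schedulers=[
        LinearLR(optimizer, start_factor=0.1, total_iters=5),
        CosineAnnealingLR(optimizer, T_max=95),
    ],
    milestones=[5],
)

lrs = []
for _ in range(100):
    optimizer.step()
    scheduler.step()
    lrs.append(optimizer.param_groups[0]["lr"])
```

After the warmup milestone the learning rate peaks at 0.1, then the cosine phase takes it back toward zero by the end of the run.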