# Optimizers
Optimizers are how we tune the weights of our neural networks. The easiest one to understand is stochastic gradient descent (SGD), but there are many more. You can see the options PyTorch provides in its `torch.optim` documentation.
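
As a minimal sketch of how an optimizer tunes weights in practice, the loop below trains a tiny linear model with `torch.optim.SGD` on synthetic data. The model, learning rate, and data here are illustrative choices, not anything specified by this text.

```python
import torch
import torch.nn as nn

# A tiny linear model and some synthetic data (illustrative only).
model = nn.Linear(in_features=3, out_features=1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

x = torch.randn(8, 3)  # batch of 8 inputs
y = torch.randn(8, 1)  # matching targets

for step in range(100):
    optimizer.zero_grad()          # clear gradients from the previous step
    loss = loss_fn(model(x), y)    # forward pass and loss
    loss.backward()                # compute gradients of the loss w.r.t. the weights
    optimizer.step()               # nudge the weights along the negative gradient
```

Swapping in a different optimizer usually only changes the construction line, e.g. `torch.optim.Adam(model.parameters(), lr=0.001)`; the rest of the training loop stays the same.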