beginner_source/basics/optimization_tutorial.py (+1 −1)
@@ -76,7 +76,7 @@ def forward(self, x):
 # (`read more <https://pytorch.org/tutorials/beginner/hyperparameter_tuning_tutorial.html>`__ about hyperparameter tuning)
 #
 # We define the following hyperparameters for training:
-# - **Number of Epochs** - the number times to iterate over the dataset
+# - **Number of Epochs** - the number of times to iterate over the dataset
 # - **Batch Size** - the number of data samples propagated through the network before the parameters are updated
 # - **Learning Rate** - how much to update models parameters at each batch/epoch. Smaller values yield slow learning speed, while large values may result in unpredictable behavior during training.
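
For context, here is a minimal, self-contained sketch of how the three hyperparameters named in this hunk are typically wired into a PyTorch training loop. All names and values below (`learning_rate`, `batch_size`, `epochs`, the synthetic data, the model) are illustrative assumptions, not taken from this diff or the tutorial file itself:

```python
import torch
from torch import nn

# Illustrative values only -- assumptions, not from the diff.
learning_rate = 1e-3   # how much to update model parameters at each step
batch_size = 64        # samples propagated before each parameter update
epochs = 2             # number of times to iterate over the dataset

# Tiny synthetic dataset so the sketch runs on its own.
inputs = torch.randn(256, 28 * 28)
targets = torch.randint(0, 10, (256,))
loader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(inputs, targets), batch_size=batch_size
)

model = nn.Sequential(nn.Linear(28 * 28, 10))
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=learning_rate)

for epoch in range(epochs):           # epochs: full passes over the data
    for X, y in loader:               # batch_size: set on the DataLoader
        loss = loss_fn(model(X), y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()              # learning_rate: scales this update
```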