A practical learning-rate decay formula:
# set lr rate
current_lr = 0.0002 * (1 / 2) ** (step / 10000)
for param_group in optimizer_G.param_groups:
    param_group["lr"] = current_lr
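The formula halves the learning rate every 10,000 steps, starting from 2e-4. A minimal sketch of the schedule itself, framework-free (the function name `decayed_lr` is illustrative, not from the original):

```python
def decayed_lr(step, base_lr=0.0002, half_life=10000):
    """Exponential decay: the lr halves every `half_life` steps."""
    return base_lr * (1 / 2) ** (step / half_life)

print(decayed_lr(0))      # → 0.0002
print(decayed_lr(10000))  # → 0.0001
print(decayed_lr(20000))  # → 5e-05
```

Because the exponent is fractional, the decay is smooth rather than stepwise: at step 5,000 the lr is roughly 1.41e-4, not 2e-4.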
How to use it inside a training loop:
step = 0
for epoch in range(opt.epoch, opt.n_epochs):
    for i, batch in enumerate(dataloader):
        step = step + 1
        # set lr rate
        current_lr = 0.0002 * (1 / 2) ** (step / 10000)
        for param_group in optimizer_G.param_groups:
            param_group["lr"] = current_lr
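To see the loop end to end without `opt`, `dataloader`, or a real model, here is a runnable sketch. `FakeOptimizer` is a stand-in with only the `param_groups` attribute (a list of dicts, mirroring the PyTorch `torch.optim.Optimizer` layout the snippet above relies on); the epoch and batch counts are hypothetical demo values:

```python
class FakeOptimizer:
    """Minimal stand-in for a torch.optim optimizer: only param_groups."""
    def __init__(self, lr):
        self.param_groups = [{"lr": lr}]

optimizer_G = FakeOptimizer(0.0002)

step = 0
n_epochs, batches_per_epoch = 2, 10000  # demo sizes, not from the original
for epoch in range(n_epochs):
    for i in range(batches_per_epoch):  # stands in for enumerate(dataloader)
        step = step + 1
        # update the lr before each optimizer step
        current_lr = 0.0002 * (1 / 2) ** (step / 10000)
        for param_group in optimizer_G.param_groups:
            param_group["lr"] = current_lr
        # ... forward / backward / optimizer_G.step() would go here ...

print(optimizer_G.param_groups[0]["lr"])  # → 5e-05 after 20,000 steps
```

Writing into `param_groups[*]["lr"]` directly is exactly how PyTorch's built-in schedulers adjust the learning rate, so the same pattern works on a real `torch.optim.Adam` without changes.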