
The Role of the Optimizer_2

Published: 2024-06-24  Views: 76

In PyTorch, the optimizer's role is to factor the parameter-update code out into a separate component that can be called directly.

Example:

Code without an optimizer

def training_loop(n_epochs, learning_rate, params, t_u, t_c):
    for epoch in range(1, n_epochs + 1):
        if params.grad is not None:
            params.grad.zero_()  # clear gradients accumulated in the previous iteration
        t_p = model(t_u, *params)  # forward pass
        loss = loss_fn(t_p, t_c)
        loss.backward()            # compute gradients w.r.t. params
        # manual update: step against the gradient, detach from the old graph,
        # and re-enable gradient tracking for the next iteration
        params = (params - learning_rate * params.grad).detach().requires_grad_()
        if epoch % 500 == 0:
            print('Epoch %d, Loss %f' % (epoch, float(loss)))
    return params
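
For comparison with the optimizer-based call shown further below, here is a minimal sketch of how this version would be invoked, assuming model, loss_fn, t_un and t_c are defined elsewhere in the surrounding example:

import torch

params = torch.tensor([1.0, 0.0], requires_grad=True)
training_loop(
    n_epochs=5000,
    learning_rate=1e-2,
    params=params,
    t_u=t_un,
    t_c=t_c)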

Code using an optimizer

def training_loop(n_epochs, optimizer, params, t_u, t_c):
    for epoch in range(1, n_epochs + 1):
        t_p = model(t_u, *params)  # forward pass
        loss = loss_fn(t_p, t_c)
        optimizer.zero_grad()      # clear gradients from the previous iteration
        loss.backward()            # compute gradients
        optimizer.step()           # let the optimizer apply the parameter update
        if epoch % 500 == 0:
            print('Epoch %d, Loss %f' % (epoch, float(loss)))
    return params
import torch
import torch.optim as optim

params = torch.tensor([1.0, 0.0], requires_grad=True)
learning_rate = 1e-2
optimizer = optim.SGD([params], lr=learning_rate)  # the optimizer holds a reference to params
training_loop(
    n_epochs=5000,
    optimizer=optimizer,
    params=params,
    t_u=t_un,
    t_c=t_c)
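
Because the update rule now lives inside the optimizer object rather than in the loop, switching to a different algorithm only means constructing a different optimizer; the training loop itself is unchanged. A minimal sketch of this, assuming the same model, loss_fn, t_un and t_c as above (optim.Adam is another standard optimizer in torch.optim):

params = torch.tensor([1.0, 0.0], requires_grad=True)  # fresh parameters for the new run
optimizer = optim.Adam([params], lr=learning_rate)      # only the optimizer line changes
training_loop(
    n_epochs=5000,
    optimizer=optimizer,
    params=params,
    t_u=t_un,
    t_c=t_c)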
