• xuwenzhuo

Running .backward and observing grad before and after the call

    net.zero_grad()  # zero the gradients of all learnable parameters in net
    print('conv1.bias gradient before backward')
    print(net.conv1.bias.grad)
    loss.backward()
    print('conv1.bias gradient after backward')
    print(net.conv1.bias.grad)

    RuntimeError                              Traceback (most recent call last)
    ~\AppData\Local\Temp\ipykernel_21016\233692853.py in <module>
          3 print('conv1.bias gradient before backward')
          4 print(net.conv1.bias.grad)
    ----> 5 loss.backward()
          6 print('conv1.bias gradient after backward')
          7 print(net.conv1.bias.grad)

    ~\AppData\Roaming\Python\Python39\site-packages\torch\_tensor.py in backward(self, gradient, retain_graph, create_graph, inputs)
        490     inputs=inputs,
        491 )
    --> 492 torch.autograd.backward(
        493     self, gradient, retain_graph, create_graph, inputs=inputs
        494 )

    ~\AppData\Roaming\Python\Python39\site-packages\torch\autograd\__init__.py in backward(tensors, grad_tensors, retain_graph, create_graph, grad_variables, inputs)
        249 # some Python versions print out the first line of a multi-line function
        250 # calls in the traceback and some print out the last line
    --> 251 Variable._execution_engine.run_backward(  # Calls into the C++ engine to run the backward pass
        252     tensors,
        253     grad_tensors_,

    RuntimeError: Found dtype Long but expected Float

    xuwenzhuo posted on 2024/1/19 11:24:52
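A minimal sketch of what typically triggers this error (the net and target names here are assumptions, not the book's code): the loss is computed against an integer (Long) target with a criterion such as nn.MSELoss, which expects Float; casting the target to Float before computing the loss lets backward run.

```python
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(4, 2))     # stand-in for the book's net
x = torch.randn(1, 4)
target = torch.tensor([[1, 0]])          # integer literals -> dtype torch.int64 (Long)
criterion = nn.MSELoss()

out = net(x)
# loss = criterion(out, target)          # raises: Found dtype Long but expected Float
loss = criterion(out, target.float())    # cast the target to Float to fix it
net.zero_grad()
loss.backward()
print(net[0].bias.grad)                  # gradients are now populated
```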
    • xuwenzhuo

      Page 41, Chapter 2 of the quick start: loss.backward() in the back-propagation example raises an error due to inconsistent data types

      xuwenzhuo posted on 2024/1/19 11:27:00
  • QINWEIRONG

    Complete beginner here, hoping someone can guide me

    QINWEIRONG posted on 2023/12/19 21:28:02
  • Jack

    In the downloaded Pytorch book.zip, Chapter 9 has no data directory, so the programs cannot find the data they need when run.

    Jack posted on 2022/4/6 11:38:41
    • Jack

      tang.npz cannot be found either, which contradicts the book's statement that "the binary file tang.npz is provided with the code accompanying this book".

      Jack posted on 2022/4/6 11:41:06
  • 桂志勇

    There is no best-practice directory in chapter6; the Dogs vs. Cats code for Chapter 6 is missing

    桂志勇 posted on 2022/1/13 10:03:27
  • qbl

    Why does y_pred = x.mm(w) + b.expand_as(y) on P80 raise:
    RuntimeError: expected scalar type Long but found Float

    qbl posted on 2021/5/24 19:00:19
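A minimal sketch of the likely cause (the shapes here are assumptions; the tensor names follow the book's linear-regression example): x.mm(w) requires both operands to share a dtype, so an x built from integer data (Long) cannot be multiplied by a Float w; casting x to Float resolves it.

```python
import torch

x = torch.arange(6).reshape(3, 2)    # built from integers -> dtype torch.int64 (Long)
w = torch.randn(2, 1)                # torch.float32
b = torch.zeros(1, 1)
y = torch.randn(3, 1)

# The Long/Float mismatch makes mm fail:
# y_pred = x.mm(w) + b.expand_as(y)  # RuntimeError: expected scalar type Long but found Float
x = x.float()                        # cast x so both operands are Float
y_pred = x.mm(w) + b.expand_as(y)
```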