Fix CrossEntropyLoss Parameter Deprecation Warnings in PyTorch

While training models in PyTorch, you might encounter deprecation warnings related to your CrossEntropyLoss configuration. These warnings appear when the older parameters size_average or reduce are used. PyTorch has deprecated these parameters in favor of a single, clearer argument: reduction.

The deprecation exists to simplify the API and eliminate confusion caused by combining multiple flags that control the same behavior. Instead of toggling size_average and reduce, PyTorch now expects you to explicitly define how the loss should be aggregated.
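For reference, the legacy flags map onto reduction as follows: the defaults (size_average=True, reduce=True) behave like reduction='mean'; size_average=False behaves like reduction='sum'; and reduce=False behaves like reduction='none', in which case size_average is ignored. The helper below is a small illustrative sketch of that mapping, not part of PyTorch's API:

def legacy_flags_to_reduction(size_average=True, reduce=True):
    # Illustrative only: mirrors how the legacy flags correspond
    # to the modern reduction argument
    if not reduce:
        return 'none'  # per-sample losses; size_average is ignored
    return 'mean' if size_average else 'sum'

print(legacy_flags_to_reduction())                   # mean
print(legacy_flags_to_reduction(size_average=False)) # sum
print(legacy_flags_to_reduction(reduce=False))       # none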

Here is an example that triggers the deprecation warnings:

import torch

# Raw logits for two samples over three classes, and their target class indices
x = torch.tensor([[1.5, 2.5, 3.5], [2.0, 1.0, 0.5]])
target = torch.tensor([1, 0])

# Deprecated: size_average=True averages the per-sample losses
loss = torch.nn.CrossEntropyLoss(size_average=True)
y = loss(x, target)
print(y) # tensor(0.9360)

# Deprecated: size_average=False sums the per-sample losses
loss = torch.nn.CrossEntropyLoss(size_average=False)
y = loss(x, target)
print(y) # tensor(1.8720)

# Deprecated: reduce=False returns the per-sample losses unreduced
loss = torch.nn.CrossEntropyLoss(reduce=False)
y = loss(x, target)
print(y) # tensor([1.4076, 0.4644])

Output:

UserWarning: size_average and reduce args will be deprecated, please use reduction='mean' instead.
UserWarning: size_average and reduce args will be deprecated, please use reduction='sum' instead.
UserWarning: size_average and reduce args will be deprecated, please use reduction='none' instead.

To fix the deprecation warnings, pass the reduction argument directly when creating the loss function; each warning above even names the exact replacement value. This is the modern, forward-compatible solution:

import torch

x = torch.tensor([[1.5, 2.5, 3.5], [2.0, 1.0, 0.5]])
target = torch.tensor([1, 0])

# reduction='mean' (the default) averages the per-sample losses
loss = torch.nn.CrossEntropyLoss(reduction='mean')
y = loss(x, target)
print(y) # tensor(0.9360)

# reduction='sum' sums the per-sample losses
loss = torch.nn.CrossEntropyLoss(reduction='sum')
y = loss(x, target)
print(y) # tensor(1.8720)

# reduction='none' returns the unreduced per-sample losses
loss = torch.nn.CrossEntropyLoss(reduction='none')
y = loss(x, target)
print(y) # tensor([1.4076, 0.4644])
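The outputs match the deprecated versions exactly. The same reduction argument is also accepted by the functional form, torch.nn.functional.cross_entropy, if you prefer not to instantiate a module. As a quick sanity check, averaging the per-sample losses from reduction='none' reproduces the reduction='mean' result:

import torch
import torch.nn.functional as F

x = torch.tensor([[1.5, 2.5, 3.5], [2.0, 1.0, 0.5]])
target = torch.tensor([1, 0])

# The functional API takes the same reduction argument
print(F.cross_entropy(x, target, reduction='mean')) # tensor(0.9360)

# Averaging the unreduced per-sample losses matches reduction='mean'
per_sample = F.cross_entropy(x, target, reduction='none')
print(per_sample.mean()) # tensor(0.9360)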
