
How Do I One Hot Encode Along A Specific Dimension Using PyTorch?

I have a tensor of size [3, 15, 136], where 3 is the batch size, 15 is the sequence length, and 136 is the number of tokens. I want to one-hot encode the tensor using the probabilities along the tokens dimension (dim=2, size 136).

Solution 1:

You can take the argmax along the tokens dimension and pass the resulting indices to PyTorch's one_hot function:

import torch
import torch.nn.functional as F

t = torch.rand(3, 15, 136)

# argmax over dim=2 picks the most likely token per position;
# one_hot expands those indices back to 136 classes.
F.one_hot(t.argmax(dim=2), num_classes=136)
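As a quick sanity check (the variable name `encoded` below is just for illustration), the result has the same shape as the input, contains exactly one 1 per position, and is returned as an integer tensor, so you may want to cast it if you need floats downstream:

encoded = F.one_hot(t.argmax(dim=2), num_classes=136)
print(encoded.shape)       # torch.Size([3, 15, 136])
print(encoded.sum(dim=2))  # all ones: exactly one hot entry per sequence position
encoded = encoded.float()  # one_hot returns torch.int64; cast if floats are needed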
