[python] Pytorch tensor to numpy array

While other answers have explained the question well, I will add some real-life examples of converting tensors to numpy arrays:

Example: Shared storage

A PyTorch tensor residing on the CPU shares the same storage as the numpy array na:

import torch
a = torch.ones((1,2))
print(a)
na = a.numpy()
na[0][0]=10
print(na)
print(a)

Output:

tensor([[1., 1.]])
[[10.  1.]]
tensor([[10.,  1.]])

Example: Eliminate the effect of shared storage by copying the numpy array first

To avoid the effect of shared storage, we need to copy() the numpy array na to a new numpy array nac. NumPy's copy() method creates new, separate storage.

import torch
a = torch.ones((1,2))
print(a)
na = a.numpy()
nac = na.copy()
nac[0][0]=10
print(nac)
print(na)
print(a)

Output:

tensor([[1., 1.]])
[[10.  1.]]
[[1. 1.]]
tensor([[1., 1.]])

Now only the nac numpy array is altered by the line nac[0][0]=10; na and a remain as they are.
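
If you prefer a one-liner, the conversion and the copy can be chained; this is just a compact restatement of the example above (my addition, not part of the original answer):

import torch
a = torch.ones((1,2))
nac = a.numpy().copy()   # convert to numpy, then copy into separate storage
nac[0][0] = 10
print(nac)               # [[10.  1.]]
print(a)                 # tensor([[1., 1.]]) -- unchanged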

Example: CPU tensor with requires_grad=True

import torch
a = torch.ones((1,2), requires_grad=True)
print(a)
na = a.detach().numpy()
na[0][0]=10
print(na)
print(a)

Output:

tensor([[1., 1.]], requires_grad=True)
[[10.  1.]]
tensor([[10.,  1.]], requires_grad=True)

Here, if we instead call:

na = a.numpy() 

it raises RuntimeError: Can't call numpy() on Tensor that requires grad. Use tensor.detach().numpy() instead., because tensors with requires_grad=True are recorded by PyTorch's autograd. Note that tensor.detach() is the recommended replacement for the older tensor.data.

This is why we need to detach() the tensor first before converting it with numpy().
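
As a quick sanity check (a minimal sketch I added, reusing the same tensor as above), you can see both the failure and the detach() workaround:

import torch
a = torch.ones((1,2), requires_grad=True)
try:
    a.numpy()                  # raises RuntimeError because a is tracked by autograd
except RuntimeError as e:
    print(e)
na = a.detach().numpy()        # detach() returns a tensor outside the autograd graph
print(na)                      # [[1. 1.]]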

Example: CUDA tensor with requires_grad=False

import torch
a = torch.ones((1,2), device='cuda')
print(a)
na = a.to('cpu').numpy()
na[0][0]=10
print(na)
print(a)

Output:

tensor([[1., 1.]], device='cuda:0')
[[10.  1.]]
tensor([[1., 1.]], device='cuda:0')


Example: CUDA tensor with requires_grad=True

import torch
a = torch.ones((1,2), device='cuda', requires_grad=True)
print(a)
na = a.detach().to('cpu').numpy()
na[0][0]=10
print(na)
print(a)

Output:

tensor([[1., 1.]], device='cuda:0', requires_grad=True)
[[10.  1.]]
tensor([[1., 1.]], device='cuda:0', requires_grad=True)

Without the detach() method, the error RuntimeError: Can't call numpy() on Tensor that requires grad. Use tensor.detach().numpy() instead. will be raised.

Without the .to('cpu') method, TypeError: can't convert cuda:0 device type tensor to numpy. Use Tensor.cpu() to copy the tensor to host memory first. will be raised.

You could use cpu() instead of to('cpu'), but I prefer the newer to('cpu').
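
As a sketch (my addition; it assumes a CUDA device is available, as in the example above), both spellings give the same result, and the whole conversion collapses into one line:

import torch
a = torch.ones((1,2), device='cuda', requires_grad=True)
na1 = a.detach().to('cpu').numpy()   # as used above
na2 = a.detach().cpu().numpy()       # equivalent, using Tensor.cpu()
print(na1)                           # [[1. 1.]]
print(na2)                           # [[1. 1.]]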
