Simple demos for PyTorch.
Reference: PyTorch online tutorials
This demo includes:
- How to construct a Tensor in PyTorch
- Basic operations that can be applied to Tensor objects
- How to convert a PyTorch tensor to a NumPy ndarray, and vice versa.
- How to move tensors onto the GPU for efficient computation.
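The topics above can be sketched in a few lines (a minimal illustration; the values and variable names are arbitrary):

```python
import numpy as np
import torch

# Construct tensors
a = torch.empty(2, 3)                 # uninitialized memory
b = torch.rand(2, 3)                  # uniform samples in [0, 1)
c = torch.tensor([[1.0, 2.0, 3.0],
                  [4.0, 5.0, 6.0]])   # from a Python list

# Basic operations
d = b + c                             # elementwise add
e = c.mul(2.0)                        # elementwise multiply
f = c @ c.t()                         # matrix multiply -> shape (2, 2)

# Tensor <-> NumPy (a CPU tensor and its ndarray share memory)
arr = c.numpy()
back = torch.from_numpy(np.ones((2, 3)))

# Move to the GPU when one is available
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
g = c.to(device)
```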
This demo shows how to compute gradients.
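The core idea is autograd: mark a tensor with `requires_grad=True`, build a scalar from it, and call `.backward()`. A minimal sketch:

```python
import torch

# Track operations on x so gradients can be computed
x = torch.tensor([2.0, 3.0], requires_grad=True)
y = (x ** 2).sum()     # y = x0^2 + x1^2

y.backward()           # populate x.grad with dy/dx
print(x.grad)          # dy/dx = 2x -> tensor([4., 6.])
```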
This demo constructs a basic linear regression model. The two scripts show two equivalent implementations: using torch.nn and torch.nn.functional, respectively.
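The two variants might look as follows (a condensed sketch, not the repo's actual scripts; the toy data and hyperparameters are illustrative):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Variant 1: a torch.nn module holds the parameters for you
model = nn.Linear(1, 1)

# Variant 2: torch.nn.functional with explicitly managed parameters
w = torch.zeros(1, 1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)

def forward_functional(x):
    return F.linear(x, w, b)

# Toy data: y = 2x + 1
x = torch.linspace(0, 1, 20).unsqueeze(1)
y = 2 * x + 1

# Train variant 1 with plain SGD on the MSE loss
opt = torch.optim.SGD(model.parameters(), lr=0.1)
for _ in range(200):
    opt.zero_grad()
    loss = F.mse_loss(model(x), y)
    loss.backward()
    opt.step()
```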
Train a simple CNN model on MNIST dataset.
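A typical architecture for this task looks roughly like the following (an illustrative model definition, not necessarily the one in the repo; the forward pass runs on a dummy MNIST-shaped batch instead of the real dataset):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleCNN(nn.Module):
    """Two conv layers followed by two fully connected layers."""
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 16, kernel_size=5, padding=2)   # 28x28 -> 28x28
        self.conv2 = nn.Conv2d(16, 32, kernel_size=5, padding=2)  # 14x14 -> 14x14
        self.fc1 = nn.Linear(32 * 7 * 7, 128)
        self.fc2 = nn.Linear(128, 10)                             # 10 digit classes

    def forward(self, x):
        x = F.max_pool2d(F.relu(self.conv1(x)), 2)  # -> 16 x 14 x 14
        x = F.max_pool2d(F.relu(self.conv2(x)), 2)  # -> 32 x 7 x 7
        x = x.view(x.size(0), -1)                   # flatten
        x = F.relu(self.fc1(x))
        return self.fc2(x)

# Forward pass on a dummy batch of 1 x 28 x 28 grayscale images
logits = SimpleCNN()(torch.randn(4, 1, 28, 28))
print(logits.shape)   # torch.Size([4, 10])
```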
One disadvantage of PyTorch is that it doesn't have its own visualization toolkit. But it is possible to use TensorBoard in PyTorch, thanks to tensorboardX. (Of course, you need to install TensorFlow first!)
- Install tensorboardX
pip install tensorboardX
- Import tensorboardX
from tensorboardX import SummaryWriter
- Save tensorboard logs
# (1) Log the scalar values
writer.add_scalar('loss', running_loss / 100, step)
writer.add_scalar('accuracy', 100 * correct / total, step)

# (2) Log values and gradients of the parameters (histogram)
# to_np is a small helper that converts a tensor to a NumPy array
for tag, value in net.named_parameters():
    tag = tag.replace('.', '/')
    writer.add_histogram(tag, to_np(value), step)
    writer.add_histogram(tag + '/grad', to_np(value.grad), step)

# (3) Log the images
images = utils.make_grid(inputs.view(-1, 3, 32, 32).data, nrow=5,
                         normalize=True, scale_each=True)
writer.add_image('Image', images, step)
One important thing in practice: loading your own dataset. If you are a former TensorFlow user, you are probably familiar with constructing TFRecord files. PyTorch also provides an efficient way to load data.
- First, inherit from torch.utils.data.Dataset and override __len__ & __getitem__:
- Override __len__ so that len(dataset) returns the size of the dataset.
- Override __getitem__ to support indexing, so that dataset[i] returns the i-th sample.
- Second, instantiate torch.utils.data.DataLoader.
- Iterate through the entire dataset, getting one batch at a time for backpropagation.
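The steps above can be sketched with a toy in-memory dataset (the class name and data are illustrative; in practice __getitem__ would read your own files):

```python
import torch
from torch.utils.data import Dataset, DataLoader

class SquaresDataset(Dataset):
    """Toy dataset yielding pairs (i, i^2)."""
    def __init__(self, n):
        self.n = n

    def __len__(self):            # len(dataset) returns the dataset size
        return self.n

    def __getitem__(self, i):     # dataset[i] returns the i-th sample
        x = torch.tensor([float(i)])
        y = torch.tensor([float(i) ** 2])
        return x, y

dataset = SquaresDataset(100)
loader = DataLoader(dataset, batch_size=10, shuffle=True)

for xb, yb in loader:             # one batch per iteration
    pass                          # forward pass, loss, backward, step go here
```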