01- PyTorch Tensors
Get to know some PyTorch fundamentals in this PyTorch series
What is PyTorch!!?
According to their official website [https://pytorch.org/], PyTorch is "an open source machine learning framework that accelerates the path from research prototyping to production deployment."
And when you move to their GitHub page [https://github.com/pytorch/pytorch] you will find these points. At its core, PyTorch is a Python package that provides two high-level features:
Tensor computation (like NumPy) with strong GPU acceleration
Deep neural networks built on a tape-based autograd system
We can reuse our favourite Python packages such as NumPy, SciPy, and Cython to extend PyTorch when needed.
Alright, too much theory!! Let's look at some code; we can pick up more of the theory and fundamentals later. Note: None of these are my own words; I crawl around websites and try to bring the best here 🤪!! Sometimes it's harder to keep everything in one place, so I did this... [Ohho!!]
Tensors
As I said earlier, at its core PyTorch is a library for processing tensors. Instead of saying blah blah blah about tensors, in simple words a tensor is just a collection of numbers in a specific shape (really!?) which can run on both the CPU and the GPU. Or, in the general case, an array of numbers arranged on a regular grid with a variable number of axes is known as a tensor (link). Or: a tensor is a number, vector, matrix or any n-dimensional array.
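Since a tensor can live on either device, here's a tiny sketch of moving one onto the GPU when CUDA is available (the tensor itself is just something I made up for illustration):

```python
import torch

t = torch.ones(2, 3)           # starts life on the CPU
print(t.device)                # cpu

if torch.cuda.is_available():  # hop onto the GPU only if one is present
    t = t.to('cuda')
    print(t.device)            # cuda:0
```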
So let's create a tensor with a single number.
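Something along these lines (the name t1 and the value 4. are just my picks for illustration):

```python
import torch

# A tensor holding a single number; the trailing dot makes it a float
t1 = torch.tensor(4.)
print(t1)        # tensor(4.)
print(t1.dtype)  # torch.float32
print(t1.shape)  # torch.Size([]) -- a scalar has no axes
```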
Ok, until now we have seen pretty simple tensors. Let's make some with more complexity..... [Oh, No..!!]
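A sketch of such a cell, with made-up numbers, matching the shape described just below:

```python
import torch

# A 3-dimensional tensor: 3 layers, each with 2 rows of 3 items
t2 = torch.tensor([
    [[11, 12, 13],
     [14, 15, 16]],

    [[17, 18, 19],
     [20, 21, 22]],

    [[23, 24, 25],
     [26, 27, 28]],
])
print(t2.shape)  # torch.Size([3, 2, 3])
```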
Here I am mentioning again that what the above example shows is a tensor with 3 layers/depths, each with 2 rows of 3 items. In simple words, 3 two-dimensional arrays.
Automatic Differentiation
When training a neural network, as you might have heard, the most frequently used algorithm is backpropagation [really?]. The backpropagation algorithm requires the gradient of the loss function [more about that later [but why!??😭]].
Let's look at an example.
PS: This isn't purely my own example; it's based on the one from PyTorch's official website. Credits are given at the end 🤷
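Since the original cell isn't shown here, this is a minimal sketch in the same spirit (the values 3., 4., and 5. are my own picks):

```python
import torch

# requires_grad=True tells PyTorch to track operations on w and b
x = torch.tensor(3.)
w = torch.tensor(4., requires_grad=True)
b = torch.tensor(5., requires_grad=True)

# A simple arithmetic operation with the tensors we created
y = w * x + b

# Ask PyTorch to compute the derivatives of y automatically
y.backward()

print('dy/dw =', w.grad)  # tensor(3.), since dy/dw = x
print('dy/db =', b.grad)  # tensor(1.)
```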
What we did was a simple arithmetic operation with the tensors that we created. It's somewhat related to the ML/DL field, which we can learn about later; for now, let's just see it as an arithmetic operation. Here, with the help of PyTorch, we can automatically compute the derivative of y.
Some more theory about PyTorch 🤪
Let's continue with some theory. As mentioned before, we can reuse our favourite Python packages such as NumPy, SciPy, and Cython to extend PyTorch when needed.
To tell more about PyTorch, these are its key features:
A GPU-Ready Tensor Library
Dynamic Neural Networks: Tape-Based Autograd
Python First
Imperative Experiences
Fast and Lean
Extensions Without Pain
At a granular level, PyTorch is a library that consists of the following components:
torch: A Tensor library like NumPy, with strong GPU support
torch.autograd: A tape-based automatic differentiation library that supports all differentiable Tensor operations in torch
torch.jit: A compilation stack (TorchScript) to create serializable and optimizable models from PyTorch code
torch.nn: A neural networks library deeply integrated with autograd designed for maximum flexibility
torch.multiprocessing: Python multiprocessing, but with magical memory sharing of torch Tensors across processes. Useful for data loading and Hogwild training
torch.utils: DataLoader and other utility functions for convenience
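To see a few of these components together, here's a hedged little sketch (the toy data and the tiny model are entirely made up): torch.utils.data feeds batches, torch.nn defines the model, and torch.autograd handles the gradients.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Made-up toy data: 100 samples with 4 features each
inputs = torch.randn(100, 4)
targets = torch.randn(100, 1)
loader = DataLoader(TensorDataset(inputs, targets), batch_size=10)

model = nn.Linear(4, 1)                                    # torch.nn
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

for x, y in loader:
    optimizer.zero_grad()                # clear old gradients
    loss = loss_fn(model(x), y)
    loss.backward()                      # torch.autograd at work
    optimizer.step()                     # update the parameters
```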
Conclusion and credits
This is just an introduction to PyTorch; more will be coming soon. With all respect, I give credit to the authors and notebooks that I referred to while writing this notebook/blog.