Deep Learning building blocks: the 6 most common PyTorch functions with their implementation
Hi all, in this short introduction to PyTorch, I have chosen the 6 functions that are a must in most Deep Learning models nowadays:
torch.view()
torch.reshape()
torch.permute()
torch.flatten()
torch.cat()
torch.numel()
In particular, I chose them not only because they are widely used, but also because some of them can cause confusion, as they behave quite similarly.
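Here is a minimal sketch of the six functions in action (the shapes and values are arbitrary; view and permute are shown as tensor methods, which is how they are most commonly called):

```python
import torch

x = torch.arange(24)                 # 1-D tensor with 24 elements
print(x.numel())                     # 24

a = x.view(2, 3, 4)                  # view: same data, new shape (needs contiguous input)
b = x.reshape(4, 6)                  # reshape: like view, but may copy if needed
c = a.permute(2, 0, 1)               # permute: reorder dimensions to (4, 2, 3)
d = torch.flatten(a, start_dim=1)    # flatten dims 1.. into one: shape (2, 12)
e = torch.cat([b, b], dim=0)         # concatenate along dim 0: shape (8, 6)

print(a.shape, b.shape, c.shape, d.shape, e.shape)
```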
Additional frequently used functions (a list of more than 20 functions)
A short introduction to PyTorch and the chosen functions.
Function 1: numel: returns the number of elements in a tensor.
Function 2: as_tensor: converts the data into a torch.Tensor.
Function 3: from_numpy: creates a tensor from a NumPy array.
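A quick sketch of these three together (the input values are arbitrary; note that from_numpy shares memory with the NumPy array rather than copying it):

```python
import numpy as np
import torch

data = [[1, 2], [3, 4]]
t1 = torch.as_tensor(data)     # converts a Python list (or array) into a tensor

arr = np.array([1.0, 2.0, 3.0])
t2 = torch.from_numpy(arr)     # shares memory with the NumPy array
arr[0] = 99.0
print(t2[0])                   # reflects the change -- same underlying buffer

print(t1.numel(), t2.numel())  # 4 3
```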
Function 4: reshape: returns a tensor with the same data and number of elements as the input, but with the specified shape.
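For example (a minimal sketch, with arbitrary shapes):

```python
import torch

x = torch.arange(12)
y = torch.reshape(x, (3, 4))   # same 12 elements, new shape (3, 4)
z = x.reshape(2, -1)           # -1 lets PyTorch infer the remaining dimension: (2, 6)
print(y.shape, z.shape)
```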
Function 5: save and load: used for serialization, i.e. saving an object to disk and loading it back.
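A minimal round-trip sketch (the file name "tensor.pt" is just an example):

```python
import torch

t = torch.randn(3, 3)
torch.save(t, "tensor.pt")        # serialize the tensor to disk
restored = torch.load("tensor.pt")
print(torch.equal(t, restored))   # True
```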
Function 6: abs: computes the absolute value of each element in the input.
Function 7: absolute: alias for torch.abs.
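Both calls give the same result, as the sketch below shows:

```python
import torch

x = torch.tensor([-1.5, 0.0, 2.0])
print(torch.abs(x))        # tensor([1.5000, 0.0000, 2.0000])
print(torch.absolute(x))   # identical result; absolute is an alias for abs
```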
Function 8: clamp: returns a new tensor with the values of the input clipped to the range [min, max].
Function 9: mul: returns a new tensor resulting from multiplying the input by a scalar (or element-wise by another tensor).
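A short sketch of clamp and mul with arbitrary values:

```python
import torch

x = torch.tensor([-2.0, 0.5, 3.0])
print(torch.clamp(x, min=-1.0, max=1.0))             # tensor([-1.0000, 0.5000, 1.0000])

print(torch.mul(x, 10))                              # multiply by a scalar
print(torch.mul(x, torch.tensor([1.0, 2.0, 3.0])))   # element-wise with another tensor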
Function 10: argmax: returns the indices of the maximum value of all elements in the input tensor.
Functions 11: min, max, std, var, std_mean, var_mean, unique, count_nonzero, etc. are some of the tensor statistics functions.
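A small sketch of argmax and a few of the statistics functions (values are made up):

```python
import torch

x = torch.tensor([[1.0, 5.0, 3.0],
                  [6.0, 2.0, 0.0]])

print(torch.argmax(x))           # index into the flattened tensor -> tensor(3)
print(torch.argmax(x, dim=1))    # per-row indices -> tensor([1, 0])

print(torch.min(x), torch.max(x))
print(torch.std(x), torch.var(x))
print(torch.std_mean(x))         # (std, mean) in one call
print(torch.unique(x))
print(torch.count_nonzero(x))
```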
Function 12: eq: computes element-wise equality.
Function 13: equal: returns True if two tensors have the same size and elements, and False otherwise.
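The difference between the two is easy to see in a short sketch:

```python
import torch

a = torch.tensor([1, 2, 3])
b = torch.tensor([1, 0, 3])

print(torch.eq(a, b))              # element-wise: tensor([ True, False,  True])
print(torch.equal(a, b))           # whole-tensor comparison: False
print(torch.equal(a, a.clone()))   # True: same size and same elements
```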
Functions 14: fft, ifft, rfft, irfft, stft, istft, bartlett_window, blackman_window, hamming_window, hann_window and kaiser_window are common spectral operations.
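A minimal sketch of a forward/inverse transform and a windowed STFT; note that in recent PyTorch versions the Fourier transforms live in the torch.fft namespace, while stft and the window functions are top-level:

```python
import math
import torch

signal = torch.sin(torch.linspace(0, 2 * math.pi, 64))

spectrum = torch.fft.fft(signal)                     # forward transform
recovered = torch.fft.ifft(spectrum).real            # inverse transform
print(torch.allclose(signal, recovered, atol=1e-5))  # True

window = torch.hann_window(16)
frames = torch.stft(signal, n_fft=16, window=window, return_complex=True)
print(frames.shape)
```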
Function 15: flatten: flattens a contiguous range of dims in a tensor.
Function 16: flip, fliplr, flipud, rot90: common functions used for data augmentation.
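A sketch on a tiny made-up "image" tensor:

```python
import torch

img = torch.arange(16).reshape(4, 4)        # pretend this is a tiny 4x4 image

print(torch.flatten(img).shape)             # torch.Size([16])
print(torch.flip(img, dims=[0, 1]))         # flip along both axes
print(torch.fliplr(img))                    # left-right flip (last dim)
print(torch.flipud(img))                    # up-down flip (first dim)
print(torch.rot90(img, k=1, dims=(0, 1)))   # rotate 90 degrees
```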
Functions 17: addbmm, addmm, addmv, addr, baddbmm, bmm, chain_matmul, dot, eigh, det, matmul, mm, mv, orgqr, qr, solve, svd, pca_lowrank, vdot: the most common functions among the BLAS and LAPACK operations.
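A small sketch of a few of these; for the decompositions I use the torch.linalg namespace, which newer PyTorch releases recommend over the older top-level names listed above:

```python
import torch

A = torch.randn(3, 4)
B = torch.randn(4, 5)
v = torch.randn(4)

print(torch.mm(A, B).shape)        # strictly 2-D matrix multiply -> (3, 5)
print(torch.matmul(A, B).shape)    # general matmul (also broadcasts batch dims)
print(torch.mv(A, v).shape)        # matrix-vector product -> (3,)
print(torch.dot(v, v))             # 1-D dot product

U, S, Vh = torch.linalg.svd(A)     # singular value decomposition
Q, R = torch.linalg.qr(A)          # QR decomposition
print(U.shape, S.shape, Vh.shape, Q.shape, R.shape)
```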
Function 18: _assert: a wrapper around Python's assert which is symbolically traceable.
Function 19: broadcast_tensors: broadcasts the given tensors according to broadcasting semantics.
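A small sketch of broadcasting two tensors and checking the result with the traceable assert (the underscore-prefixed torch._assert):

```python
import torch

a = torch.arange(3).reshape(3, 1)        # shape (3, 1)
b = torch.arange(2)                      # shape (2,)
x, y = torch.broadcast_tensors(a, b)     # both expanded to shape (3, 2)
print(x.shape, y.shape)

torch._assert(x.shape == y.shape, "shapes must match")  # traceable assert
```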
Function 20: clone: returns a copy of the input tensor.
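A last quick sketch showing that the clone gets its own memory:

```python
import torch

x = torch.tensor([1.0, 2.0, 3.0])
y = torch.clone(x)      # copies the data; y has its own memory
y[0] = 99.0
print(x)                # tensor([1., 2., 3.]) -- the original is untouched
```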