Functions that operate on tensors


source

embedding

 embedding (input:tidygrad.tensor.Tensor, indices, name=None)

Select rows of input at the given integer indices (embedding lookup)

source

dropout

 dropout (x:tidygrad.tensor.Tensor, p=0.5, training=True)

Apply dropout to a tensor, zeroing elements with probability p (identity when training=False)
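
For reference, the common inverted-dropout formulation can be sketched in NumPy (whether tidygrad rescales by 1/(1-p) the same way is an assumption here):

```python
import numpy as np

def dropout_ref(x, p=0.5, training=True):
    # Identity at evaluation time
    if not training or p == 0:
        return x
    # Inverted dropout: zero each element with probability p and
    # rescale survivors by 1/(1-p) so the expected value is unchanged
    mask = (np.random.rand(*x.shape) >= p).astype(x.dtype)
    return x * mask / (1 - p)

x = np.ones((8, 8))
y = dropout_ref(x, p=0.5)  # entries are 0.0 or 2.0
```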


source

transpose

 transpose (a:tidygrad.tensor.Tensor, dim0, dim1, name=None)

Transpose a tensor, swapping dimensions dim0 and dim1


source

slice

 slice (a:tidygrad.tensor.Tensor, key, name=None)

Slice a tensor with a standard indexing key (a[key])

source

broadcast

 broadcast (a:tidygrad.tensor.Tensor, target_shape, name=None)

Broadcast a tensor to the given shape
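
Broadcasting follows NumPy semantics: axes of size 1 are stretched to match the target shape. A quick NumPy illustration:

```python
import numpy as np

a = np.arange(3).reshape(1, 3)    # shape (1, 3)
b = np.broadcast_to(a, (4, 3))    # shape (4, 3): four copies of the row
```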


source

sum

 sum (a:tidygrad.tensor.Tensor, name=None, axis=None, keepdims=False)

Sum-reduce a tensor along the given axis (int or tuple of ints)
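
The axis and keepdims arguments behave as in NumPy:

```python
import numpy as np

a = np.arange(6, dtype=float).reshape(2, 3)  # [[0,1,2],[3,4,5]]
total = a.sum()                       # 15.0, all axes reduced
cols = a.sum(axis=0)                  # shape (3,): [3., 5., 7.]
kept = a.sum(axis=1, keepdims=True)   # shape (2, 1): [[3.], [12.]]
```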


source

matmul

 matmul (a:tidygrad.tensor.Tensor, b:tidygrad.tensor.Tensor, name=None)

Matrix multiplication of two tensors
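
In NumPy terms, matmul contracts the last axis of a with the second-to-last axis of b:

```python
import numpy as np

a = np.ones((2, 3))
b = np.ones((3, 4))
c = a @ b    # shape (2, 4); each entry is the dot product 3.0
```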


source

logexp

 logexp (a:tidygrad.tensor.Tensor, name=None)

Compute log(1 + exp(a)) of a tensor (softplus)
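
The name suggests a softplus, log(1 + exp(a)); that reading is an assumption here, but it is exactly the quantity a logit-based BCE loss needs. A numerically stable NumPy sketch:

```python
import numpy as np

def logexp_ref(a):
    # Stable softplus: log(1 + exp(a)) rewritten as
    # max(a, 0) + log1p(exp(-|a|)) to avoid overflow for large a
    return np.maximum(a, 0) + np.log1p(np.exp(-np.abs(a)))
```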


source

exp

 exp (a:tidygrad.tensor.Tensor, name=None)

Exponentiate a tensor


source

log

 log (a:tidygrad.tensor.Tensor, name=None)

Take the natural logarithm of a tensor


source

pow

 pow (a:tidygrad.tensor.Tensor, power:tidygrad.tensor.Tensor, name=None)

Raise a tensor to a power (a**power)


source

neg

 neg (a:tidygrad.tensor.Tensor, name=None)

Negate a tensor (-a)


source

div

 div (a:tidygrad.tensor.Tensor, b:tidygrad.tensor.Tensor, name=None)

Divide two tensors (a/b)


source

mul

 mul (a:tidygrad.tensor.Tensor, b:tidygrad.tensor.Tensor, name=None)

Multiply two tensors


source

sub

 sub (a:tidygrad.tensor.Tensor, b:tidygrad.tensor.Tensor, name=None)

Subtract two tensors


source

add

 add (a:tidygrad.tensor.Tensor, b:tidygrad.tensor.Tensor, name=None)

Add two tensors


source

relu

 relu (input:tidygrad.tensor.Tensor, name=None)

Apply the ReLU activation (elementwise max(input, 0))

source

tanh

 tanh (input:tidygrad.tensor.Tensor, name=None)

Apply the hyperbolic tangent activation elementwise

source

sigmoid

 sigmoid (input:tidygrad.tensor.Tensor, name=None)

Apply the sigmoid activation (1 / (1 + exp(-input))) elementwise

source

gelu

 gelu (input:tidygrad.tensor.Tensor)

Apply the GELU activation elementwise

source

sigmoid_gelu

 sigmoid_gelu (x:tidygrad.tensor.Tensor)

Apply a sigmoid-based approximation of the GELU activation
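
The name suggests the sigmoid approximation of GELU, x * sigmoid(1.702 * x); the constant 1.702 is an assumption (it is the usual choice). For comparison, the tanh approximation is also shown:

```python
import numpy as np

def sigmoid_gelu_ref(x):
    # Sigmoid approximation of GELU: x * sigmoid(1.702 * x)
    return x / (1 + np.exp(-1.702 * x))

def tanh_gelu_ref(x):
    # Tanh approximation of GELU, for comparison
    return 0.5 * x * (1 + np.tanh(np.sqrt(2 / np.pi) * (x + 0.044715 * x**3)))
```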

source

softmax

 softmax (input:tidygrad.tensor.Tensor, name=None)

Apply the softmax function to a tensor
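
A numerically stable NumPy reference (reducing over the last axis is an assumption):

```python
import numpy as np

def softmax_ref(x, axis=-1):
    # Shift by the max first; softmax is invariant to this shift
    # and it prevents overflow in exp for large inputs
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

p = softmax_ref(np.array([[1.0, 1.0, 1.0]]))  # uniform: each entry 1/3
```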

source

layer_norm

 layer_norm (x:tidygrad.tensor.Tensor, w:tidygrad.tensor.Tensor,
             b:tidygrad.tensor.Tensor, eps=1e-05)

Apply layer normalization to x with scale w and shift b
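
Layer normalization standardizes the input, then applies the learned scale w and shift b. A NumPy sketch (normalizing over the last axis is an assumption):

```python
import numpy as np

def layer_norm_ref(x, w, b, eps=1e-5):
    # Standardize each row to zero mean and unit variance,
    # then scale and shift elementwise
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps) * w + b

x = np.random.rand(4, 8)
y = layer_norm_ref(x, np.ones(8), np.zeros(8))
```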

source

concat

 concat (tensors:list[tidygrad.tensor.Tensor], axis=0, name=None)

Concatenate tensors along an existing axis

source

stack

 stack (tensors:list[tidygrad.tensor.Tensor], axis=0, name=None)

Stack tensors along a new axis

source

BCE_loss

 BCE_loss (logits:tidygrad.tensor.Tensor, target:tidygrad.tensor.Tensor,
           reduction='mean')

Binary cross-entropy loss computed from logits
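
Since the first argument is named logits, the loss presumably applies the sigmoid internally; the standard numerically stable formulation (an assumption about tidygrad's exact implementation) is:

```python
import numpy as np

def bce_with_logits_ref(logits, target, reduction="mean"):
    # Stable BCE on raw logits:
    # -(y*log(sigmoid(z)) + (1-y)*log(1-sigmoid(z)))
    #   = max(z, 0) - z*y + log(1 + exp(-|z|))
    z, y = logits, target
    loss = np.maximum(z, 0) - z * y + np.log1p(np.exp(-np.abs(z)))
    return loss.mean() if reduction == "mean" else loss.sum()
```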

source

CrossEntropy_loss

 CrossEntropy_loss (logits:tidygrad.tensor.Tensor,
                    target:tidygrad.tensor.Tensor, reduction='mean')

Cross-entropy loss computed from logits
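
A NumPy reference, assuming target holds integer class indices (one-hot targets would need a different gather):

```python
import numpy as np

def cross_entropy_ref(logits, target, reduction="mean"):
    # Stable log-softmax, then pick out the log-probability
    # of the target class for each row
    z = logits - logits.max(axis=-1, keepdims=True)
    logp = z - np.log(np.exp(z).sum(axis=-1, keepdims=True))
    nll = -logp[np.arange(len(target)), target]
    return nll.mean() if reduction == "mean" else nll.sum()
```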