# Tensorial

Statically sized tensors and related operations for Julia

Author: KeitaNakamura · 11 stars · started January 2021 · last updated 1 year ago

Tensorial provides useful tensor operations (e.g., contraction, tensor product (`⊗`), `inv`) written in the Julia programming language. The library supports non-symmetric and symmetric tensors of arbitrary size, where symmetries should be specified to avoid wasteful duplicate computations. Tensor sizes are given in the same way as in StaticArrays.jl, and symmetries are declared with `@Symmetry`. For example, a symmetric fourth-order tensor (symmetrizing tensor) is represented in this library as `Tensor{Tuple{@Symmetry{3,3}, @Symmetry{3,3}}}`. Any tensor can also be used with the provided automatic-differentiation functions.
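For intuition on why declaring symmetries matters: one symmetric index pair of dimension `d` has only `binomial(d + 1, 2)` independent components. A minimal sketch of the component counts, using only Base Julia (the helper `sym_components` is illustrative, not part of Tensorial's API):

```
# Independent components of one symmetric index pair @Symmetry{d,d}:
# the multiset coefficient binomial(d + 1, 2).
sym_components(d) = binomial(d + 1, 2)

# A full 3x3x3x3 fourth-order tensor stores 3^4 = 81 entries, while
# Tensor{Tuple{@Symmetry{3,3}, @Symmetry{3,3}}} needs only 6 * 6 = 36.
full = 3^4                  # 81
sym  = sym_components(3)^2  # 36
println((full, sym))        # (81, 36)
```

This is the "wasteful duplicate computation" the symmetry annotations avoid: operations loop over 36 stored values instead of 81.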

## Speed

```
a = rand(Vec{3})                         # vector of length 3
A = rand(SecondOrderTensor{3})           # 3x3 second-order tensor
S = rand(SymmetricSecondOrderTensor{3})  # 3x3 symmetric second-order tensor
B = rand(Tensor{Tuple{3,3,3}})           # 3x3x3 third-order tensor
AA = rand(FourthOrderTensor{3})          # 3x3x3x3 fourth-order tensor
SS = rand(SymmetricFourthOrderTensor{3}) # 3x3x3x3 symmetric fourth-order tensor (symmetrizing tensor)
```

See the package documentation for the aliases used above.

| Operation | `Tensor` | `Array` | Speed-up |
|:----------|---------:|--------:|---------:|
| **Single contraction** | | | |
| `a ⋅ a` | 1.428 ns | 12.063 ns | ×8.4 |
| `A ⋅ a` | 1.512 ns | 72.174 ns | ×47.7 |
| `S ⋅ a` | 1.591 ns | 71.682 ns | ×45.1 |
| **Double contraction** | | | |
| `A ⊡ A` | 2.722 ns | 12.549 ns | ×4.6 |
| `S ⊡ S` | 2.196 ns | 12.767 ns | ×5.8 |
| `B ⊡ A` | 3.985 ns | 162.974 ns | ×40.9 |
| `AA ⊡ A` | 7.977 ns | 173.801 ns | ×21.8 |
| `SS ⊡ S` | 3.932 ns | 174.286 ns | ×44.3 |
| **Tensor product** | | | |
| `a ⊗ a` | 1.809 ns | 50.640 ns | ×28.0 |
| **Cross product** | | | |
| `a × a` | 1.809 ns | 50.640 ns | ×28.0 |
| **Determinant** | | | |
| `det(A)` | 1.442 ns | 201.691 ns | ×139.9 |
| `det(S)` | 1.680 ns | 202.007 ns | ×120.2 |
| **Inverse** | | | |
| `inv(A)` | 7.084 ns | 508.010 ns | ×71.7 |
| `inv(S)` | 4.605 ns | 504.208 ns | ×109.5 |
| `inv(AA)` | 836.618 ns | 1.545 μs | ×1.8 |
| `inv(SS)` | 318.336 ns | 1.654 μs | ×5.2 |
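For reference, the contraction operators being benchmarked reduce to plain index sums. A sketch of what single and double contraction compute, written with Base arrays only (no Tensorial required):

```
using LinearAlgebra

a = rand(3)
A = rand(3, 3)

# Single contraction A ⋅ a: (A ⋅ a)_i = A_ij a_j, i.e. a matrix-vector product.
single = A * a
@assert single ≈ [sum(A[i, j] * a[j] for j in 1:3) for i in 1:3]

# Double contraction A ⊡ A: sum_ij A_ij A_ij, i.e. the Frobenius inner product.
double = sum(A .* A)
@assert double ≈ dot(vec(A), vec(A))
```

Tensorial's speed-up comes from knowing the sizes statically, so these sums can be fully unrolled with no heap allocation, much like StaticArrays.jl.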

The benchmarks are generated by `runbenchmarks.jl` on the following system:

```
julia> versioninfo()
Julia Version 1.6.0
Commit f9720dc2eb (2021-03-24 12:55 UTC)
Platform Info:
  OS: macOS (x86_64-apple-darwin19.6.0)
  CPU: Intel(R) Xeon(R) W-2150B CPU @ 3.00GHz
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-11.0.1 (ORCJIT, skylake-avx512)
```

## Installation

`pkg> add Tensorial`

## Cheat Sheet

```
using Tensorial
using LinearAlgebra # for the identity `I`

# identity tensors
one(Tensor{Tuple{3,3}})            == Matrix(1I,3,3) # second-order identity tensor
one(Tensor{Tuple{@Symmetry{3,3}}}) == Matrix(1I,3,3) # symmetric second-order identity tensor
I  = one(Tensor{NTuple{4,3}})               # fourth-order identity tensor
Is = one(Tensor{NTuple{2, @Symmetry{3,3}}}) # symmetric fourth-order identity tensor

# zero tensors
zero(Tensor{Tuple{2,3}}) == zeros(2, 3)
zero(Tensor{Tuple{@Symmetry{3,3}}}) == zeros(3, 3)

# random tensors
rand(Tensor{Tuple{2,3}})
randn(Tensor{Tuple{2,3}})

# macros (same interface as StaticArrays.jl)
@Vec [1,2,3]
@Vec rand(4)
@Mat [1 2
      3 4]
@Mat rand(4,4)
@Tensor rand(2,2,2)

# contraction and tensor product
x = rand(Mat{2,2})
y = rand(Tensor{Tuple{@Symmetry{2,2}}})
x ⊗ y isa Tensor{Tuple{2,2,@Symmetry{2,2}}} # tensor product
x ⋅ y isa Tensor{Tuple{2,2}}                # single contraction (x_ij * y_jk)
x ⊡ y isa Real                              # double contraction (x_ij * y_ij)

# norm/tr/mean/vol/dev
x = rand(SecondOrderTensor{3}) # equal to rand(Tensor{Tuple{3,3}})
v = rand(Vec{3})
norm(v)
tr(x)
mean(x) == tr(x) / 3 # useful for computing mean stress
vol(x) + dev(x) == x # decomposition into volumetric part and deviatoric part

# det/inv for 2nd-order tensor
A = rand(SecondOrderTensor{3})          # equal to rand(Tensor{Tuple{3,3}})
S = rand(SymmetricSecondOrderTensor{3}) # equal to rand(Tensor{Tuple{@Symmetry{3,3}}})
det(A); det(S)
inv(A) ⋅ A ≈ one(A)
inv(S) ⋅ S ≈ one(S)

# inv for 4th-order tensor
AA = rand(FourthOrderTensor{3})          # equal to rand(Tensor{Tuple{3,3,3,3}})
SS = rand(SymmetricFourthOrderTensor{3}) # equal to rand(Tensor{Tuple{@Symmetry{3,3}, @Symmetry{3,3}}})
inv(AA) ⊡ AA ≈ one(AA)
inv(SS) ⊡ SS ≈ one(SS)

# Einstein summation convention (experimental)
A = rand(Mat{3,3})
B = rand(Mat{3,3})
@einsum (i,j) -> A[i,k] * B[k,j]
@einsum A[i,j] * B[i,j]
```
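As a sanity check on the einsum lines above: `(i,j) -> A[i,k] * B[k,j]` sums over the repeated index `k` and is an ordinary matrix product, while `A[i,j] * B[i,j]` repeats every index and yields a scalar. The same results computed with Base arrays only:

```
A = rand(3, 3)
B = rand(3, 3)

# (i,j) -> A[i,k] * B[k,j]: free indices i and j, summed index k.
prod_ij = [sum(A[i, k] * B[k, j] for k in 1:3) for i in 1:3, j in 1:3]
@assert prod_ij ≈ A * B

# A[i,j] * B[i,j]: every index is repeated, so the result is a scalar
# (the double contraction ⊡ from the cheat sheet).
scalar = sum(A .* B)
```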
