# Tensorial

*Statically sized tensors and related operations for Julia*

Tensorial provides useful tensor operations (e.g., contraction, tensor product (`⊗`), `inv`, etc.) written in the Julia programming language.
The library supports non-symmetric and symmetric tensors of arbitrary size, where symmetries should be specified to avoid wasteful duplicate computations.
Tensor sizes are given in the same way as in StaticArrays.jl, and symmetries of tensors can be specified using `@Symmetry`.
For example, a symmetric fourth-order tensor (symmetrizing tensor) is represented in this library as `Tensor{Tuple{@Symmetry{3,3}, @Symmetry{3,3}}}`.
An Einstein summation macro and automatic differentiation functions are also provided.
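As a small illustration of the symmetry machinery, a symmetric second-order tensor built this way shares storage across symmetric index pairs while still behaving as a full array (a minimal sketch using only constructors shown in the cheat sheet below):

```julia
using Tensorial

# symmetric 3x3 second-order tensor: only the unique components are stored
S = rand(Tensor{Tuple{@Symmetry{3,3}}})
S[1,2] == S[2,1]  # symmetric entries always agree
size(S) == (3, 3) # still indexes like a full 3x3 array
```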

## Speed

```julia
a = rand(Vec{3}) # vector of length 3
A = rand(SecondOrderTensor{3}) # 3x3 second order tensor
S = rand(SymmetricSecondOrderTensor{3}) # 3x3 symmetric second order tensor
B = rand(Tensor{Tuple{3,3,3}}) # 3x3x3 third order tensor
AA = rand(FourthOrderTensor{3}) # 3x3x3x3 fourth order tensor
SS = rand(SymmetricFourthOrderTensor{3}) # 3x3x3x3 symmetric fourth order tensor (symmetrizing tensor)
```

See the documentation for the definitions of the aliases used above.

| Operation | `Tensor` | `Array` | speed-up |
|---|---|---|---|
| **Single contraction** | | | |
| `a ⋅ a` | 1.537 ns | 16.943 ns | ×11.0 |
| `A ⋅ a` | 1.538 ns | 80.647 ns | ×52.4 |
| `S ⋅ a` | 1.545 ns | 79.630 ns | ×51.5 |
| **Double contraction** | | | |
| `A ⊡ A` | 2.730 ns | 17.909 ns | ×6.6 |
| `S ⊡ S` | 1.704 ns | 19.099 ns | ×11.2 |
| `B ⊡ A` | 4.886 ns | 183.789 ns | ×37.6 |
| `AA ⊡ A` | 7.035 ns | 193.607 ns | ×27.5 |
| `SS ⊡ S` | 3.589 ns | 192.727 ns | ×53.7 |
| **Tensor product** | | | |
| `a ⊗ a` | 2.035 ns | 53.872 ns | ×26.5 |
| **Cross product** | | | |
| `a × a` | 2.035 ns | 53.872 ns | ×26.5 |
| **Determinant** | | | |
| `det(A)` | 1.541 ns | 223.537 ns | ×145.1 |
| `det(S)` | 1.542 ns | 227.196 ns | ×147.3 |
| **Inverse** | | | |
| `inv(A)` | 5.432 ns | 544.122 ns | ×100.2 |
| `inv(S)` | 3.872 ns | 552.627 ns | ×142.7 |
| `inv(AA)` | 854.691 ns | 1.727 μs | ×2.0 |
| `inv(SS)` | 310.218 ns | 1.728 μs | ×5.6 |

The benchmarks are generated by `runbenchmarks.jl` on the following system:

```
julia> versioninfo()
Julia Version 1.6.3
Commit ae8452a9e0 (2021-09-23 17:34 UTC)
Platform Info:
  OS: macOS (x86_64-apple-darwin19.5.0)
  CPU: Intel(R) Core(TM) i7-7567U CPU @ 3.50GHz
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-11.0.1 (ORCJIT, skylake)
```
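The individual timings above can be reproduced with BenchmarkTools.jl; a minimal sketch (assuming BenchmarkTools is installed, and using `collect` to obtain the plain-`Array` baseline):

```julia
using Tensorial
using BenchmarkTools

a = rand(Vec{3})
A = rand(SecondOrderTensor{3})

# statically sized Tensor version
@btime $A ⋅ $a
# plain Array baseline for comparison
@btime $(collect(A)) * $(collect(a))
```

Interpolating the arguments with `$` keeps BenchmarkTools from timing global-variable access, so the numbers reflect the operation itself.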

## Installation

`pkg> add Tensorial`

## Cheat Sheet

```julia
using Tensorial
using LinearAlgebra # for I, det, inv, tr

# identity tensors
one(Tensor{Tuple{3,3}}) == Matrix(1I,3,3) # second-order identity tensor
one(Tensor{Tuple{@Symmetry{3,3}}}) == Matrix(1I,3,3) # symmetric second-order identity tensor
I = one(Tensor{NTuple{4,3}}) # fourth-order identity tensor
Is = one(Tensor{NTuple{2, @Symmetry{3,3}}}) # symmetric fourth-order identity tensor
# zero tensors
zero(Tensor{Tuple{2,3}}) == zeros(2, 3)
zero(Tensor{Tuple{@Symmetry{3,3}}}) == zeros(3, 3)
# random tensors
rand(Tensor{Tuple{2,3}})
randn(Tensor{Tuple{2,3}})
# macros (same interface as StaticArrays.jl)
@Vec [1,2,3]
@Vec rand(4)
@Mat [1 2
      3 4]
@Mat rand(4,4)
@Tensor rand(2,2,2)
# statically sized getindex by `@Tensor`
x = @Mat [1 2
          3 4
          5 6]
@Tensor(x[2:3, :]) == @Mat [3 4
                            5 6]
@Tensor(x[[1,3], :]) == @Mat [1 2
                              5 6]
# contraction and tensor product
x = rand(Mat{2,2})
y = rand(Tensor{Tuple{@Symmetry{2,2}}})
x ⊗ y isa Tensor{Tuple{2,2,@Symmetry{2,2}}} # tensor product
x ⋅ y isa Tensor{Tuple{2,2}} # single contraction (x_ij * y_jk)
x ⊡ y isa Real # double contraction (x_ij * y_ij)
# det/inv for 2nd-order tensor
A = rand(SecondOrderTensor{3})          # same as rand(Tensor{Tuple{3,3}})
S = rand(SymmetricSecondOrderTensor{3}) # same as rand(Tensor{Tuple{@Symmetry{3,3}}})
det(A); det(S)
inv(A) ⋅ A ≈ one(A)
inv(S) ⋅ S ≈ one(S)
# inv for 4th-order tensor
AA = rand(FourthOrderTensor{3})          # same as rand(Tensor{Tuple{3,3,3,3}})
SS = rand(SymmetricFourthOrderTensor{3}) # same as rand(Tensor{Tuple{@Symmetry{3,3}, @Symmetry{3,3}}})
inv(AA) ⊡ AA ≈ one(AA)
inv(SS) ⊡ SS ≈ one(SS)
# Einstein summation convention
A = rand(Mat{3,3})
B = rand(Mat{3,3})
@einsum (i,j) -> A[i,k] * B[k,j]
@einsum A[i,j] * B[i,j]
# Automatic differentiation
gradient(tr, rand(Mat{3,3})) == one(Mat{3,3}) # Tensor -> Real
gradient(identity, rand(SymmetricSecondOrderTensor{3})) == one(SymmetricFourthOrderTensor{3}) # Tensor -> Tensor
```
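As a concrete check of `gradient`, the classical identity ∂det(A)/∂A = det(A)·A⁻ᵀ can be verified directly (a small sketch using only functions from the cheat sheet above):

```julia
using Tensorial
using LinearAlgebra # for det, inv

A = rand(SecondOrderTensor{3})
# the derivative of the determinant is det(A) * inv(A)' (the cofactor tensor)
gradient(det, A) ≈ det(A) * inv(A)'
```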