Complex neural network examples for Flux.jl.
This package contains a loose collection of (slightly) more advanced neural network architectures, mostly centered around time series forecasting.
To install FluxArchitectures, type `]` in the Julia REPL to activate the package manager, then type

```julia
add FluxArchitectures
```

to install it. After `using FluxArchitectures`, the following functions are exported:

- `prepare_data`
- `get_data`
- `DARNN`
- `DSANet`
- `LSTnet`
- `TPALSTM`

See their docstrings, the documentation, and the `examples` folder for details.
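For custom datasets, `prepare_data` can turn a raw time-series matrix into model input. The sketch below assumes its trailing arguments mirror those of `get_data` — `(data, poollength, datalength, horizon)` — which is an assumption, not a documented guarantee; check the docstring before relying on it.

```julia
using FluxArchitectures

# Synthetic series: 1000 time steps, 8 features.
rawdata = randn(Float32, 1000, 8)

# Assumed signature, mirroring get_data's (poollength, datalength, horizon):
input, target = prepare_data(rawdata, 10, 500, 15)
```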
- `LSTnet`: The "Long- and Short-term Time-series network" follows the paper by Lai et al.
- `DARNN`: The "Dual-Stage Attention-Based Recurrent Neural Network for Time Series Prediction" is based on the paper by Qin et al.
- `TPA-LSTM`: The "Temporal Pattern Attention" LSTM network is based on the paper "Temporal Pattern Attention for Multivariate Time Series Forecasting" by Shih et al.
- `DSANet`: The "Dual Self-Attention Network for Multivariate Time Series Forecasting" is based on the paper by Siteng Huang et al.
Activate the package and load some sample data:

```julia
using FluxArchitectures
poollength = 10; horizon = 15; datalength = 1000;
input, target = get_data(:exchange_rate, poollength, datalength, horizon)
```
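Here `input` holds sliding windows of the series of length `poollength`, and `target` the value `horizon` steps ahead of each window. The exact array layout is not spelled out here, so it is worth inspecting before wiring up a model:

```julia
# Sanity check on the loaded data (layout details are an assumption;
# `size` shows the actual dimensions returned by `get_data`).
@show size(input)
@show size(target)
```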
Define a model and a loss function:

```julia
model = LSTnet(size(input, 1), 2, 3, poollength, 120)
loss(x, y) = Flux.mse(model(x), y')
```
Train the model:

```julia
Flux.train!(loss, Flux.params(model), Iterators.repeated((input, target), 20), Adam(0.01))
```
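After training, the model can be applied directly. A quick sanity check on the fit — evaluating on the training data itself, so this is not a proper out-of-sample evaluation:

```julia
# Final training loss (uses the `loss`, `model`, `input`, `target`
# defined above; `Flux.mse` is Flux's built-in mean-squared-error).
final_loss = loss(input, target)
println("final MSE: ", final_loss)

# Forecasts for every window; shaped to match `target'` per the loss definition.
prediction = model(input)
```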