KnetLayers.jl

Useful Layers for Knet



KnetLayers provides useful deep learning layers for Knet, speeding up your model development. It also re-exports Knet and AutoGrad functionality, so you can use both without adding them to your current environment.
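For instance, the gradient machinery works straight away after loading the package. A minimal sketch, assuming the re-exports include AutoGrad's Param, @diff, and grad:

using KnetLayers               # no separate `using Knet, AutoGrad` needed

w = Param(randn(Float32, 3))   # a trainable parameter (AutoGrad)
J = @diff sum(abs2, w)         # record a differentiable computation
@show grad(J, w)               # gradient of the result w.r.t. w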

Overview

#A two-layer feed-forward network: 768 -> 128 -> 10
model = Chain(Dense(input=768, output=128, activation=Sigm()),
              Dense(input=128, output=10, activation=nothing))

#Negative log-likelihood loss on the model's output scores
loss(model, x, y) = nll(model(x), y)
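To try the chain on dummy data, feed it a 768×N batch together with integer class labels; a usage sketch (the sizes simply match the layer definitions above):

x = randn(Float32, 768, 32)   # a minibatch of 32 examples
y = rand(1:10, 32)            # integer class labels in 1:10
@show loss(model, x, y)       # average negative log-likelihood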

Getting Started: Train an MNIST model

using Knet, KnetLayers
using Statistics: mean # used in the dataset loss below
import Knet: Data
#Data
include(Knet.dir("data","mnist.jl"))
dtrn,dtst = mnistdata(xsize=(784,:)); # dtrn and dtst = [ (x1,y1), (x2,y2), ... ] where xi,yi are minibatches

#Model
HIDDEN_SIZES = [100,50]
(m::MLP)(x,y) = nll(m(x),y)                      # loss on one minibatch
(m::MLP)(d::Data) = mean(m(x,y) for (x,y) in d)  # average loss over a dataset
model = MLP(784,HIDDEN_SIZES...,10)              # layer sizes: 784 -> 100 -> 50 -> 10

#Train
EPOCH=10
progress!(sgd(model,repeat(dtrn,EPOCH)))

#Test
@show 100 * accuracy(model, dtst) # percent correct on the test set
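Once trained, the model can be saved and restored with Knet's JLD2-backed helpers; a sketch (the file name is illustrative):

Knet.save("mnist_mlp.jld2", "model", model)     # serialize the trained model
trained = Knet.load("mnist_mlp.jld2", "model")  # restore it later
@show 100 * accuracy(trained, dtst)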

Example Models

  1. MNIST-MLP

  2. MNIST-CNN

  3. GAN-MLP

  4. ResNet: Residual Networks for Image Recognition

  5. S2S: Sequence-to-Sequence Recurrent Model

  6. Morse.jl: Morphological Analyzer+Lemmatizer

  7. MAC Network: Memory-Attention-Composition Network for Visual Question Answering

Example Layers and Usage

using KnetLayers

#Instantiate an MLP model with random parameters
mlp = MLP(100,50,20; activation=Sigm()) # input size=100, hidden=50 and output=20

#Make a prediction with the mlp model
prediction = mlp(randn(Float32,100,1))
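
#prediction is a 20×1 score matrix; the predicted class is the row
#with the highest score (plain Base Julia, no extra API assumed)
predicted_class = argmax(vec(prediction))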

#Instantiate a convolutional layer with random parameters
cnn = Conv(height=3, width=3, inout=3=>10, padding=1, stride=1) # A conv layer

#Filter your input with the convolutional layer
output = cnn(randn(Float32,224,224,3,1))
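
#Sanity check: with a 3×3 kernel, padding=1 and stride=1 the spatial
#dimensions are preserved, (224 + 2*1 - 3) ÷ 1 + 1 = 224
@assert size(output) == (224, 224, 10, 1)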

#Instantiate an LSTM model
lstm = LSTM(input=100, hidden=100, embed=50)

#You can use integers to represent one-hot vectors;
#each integer is the vocabulary index of the corresponding element in your data.

#For example, a pass over a 5-length sequence
rnnoutput = lstm([3,2,1,4,5];hy=true,cy=true)

#After you get the output, you can access the per-step outputs and
#the final hidden and memory states produced by the lstm model
rnnoutput.y
rnnoutput.hidden
rnnoutput.memory
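
#Shapes follow Knet's RNN conventions (an assumption, not stated here):
#y is (hidden, batchsize, seqlength); hidden and memory hold the final
#h and c states of size (hidden, batchsize, numlayers)
@show size(rnnoutput.y)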

#You can also use plain array inputs for low-level control
#One iteration of the LSTM with a random input (inputsize × batchsize)
rnnoutput = lstm(randn(Float32,100,1);hy=true,cy=true)

#Pass over a random 10-length sequence (inputsize × batchsize × seqlength)
rnnoutput = lstm(randn(Float32,100,1,10);hy=true,cy=true)

#Pass over a mini-batch that includes unequal-length sequences
rnnoutput = lstm([[1,2,3,4],[5,6]];sorted=true,hy=true,cy=true)

#To view and modify the RNN parameters in a structured way
lstm.gatesview
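
The layers compose with Chain, so a small image classifier can be sketched from the pieces above. This is a sketch under assumptions not stated in this README: 28×28 grayscale inputs, Conv's default zero padding, Pool's default 2×2 window, and Chain accepting any callable (the reshape flattens conv features for the Dense layer):

smallcnn = Chain(Conv(height=5, width=5, inout=1=>20, activation=ReLU()),
                 Pool(),                            # 28×28 -> 24×24 conv -> 12×12 pooled
                 x -> reshape(x, :, size(x, 4)),    # flatten to (features, batch)
                 Dense(input=12*12*20, output=10))  # 2880 features -> 10 classes

scores = smallcnn(randn(Float32, 28, 28, 1, 1))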

TO-DO

  1. Examples
  2. Special layers such as Google's Inception
  3. Well-known embeddings such as GloVe
  4. Pretrained Models