Lasagne Versions

Lightweight library to build and train neural networks in Theano
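
As a quick illustration of what that means in practice, here is a minimal sketch of building and training a small classifier with Lasagne on top of Theano. The network shape, hyperparameters, and random data below are illustrative assumptions, not part of these release notes.

```python
# Minimal sketch: build and train a tiny classifier with Lasagne 0.1.
# Layer sizes, learning rate, and the random batch are arbitrary examples.
import numpy as np
import theano
import theano.tensor as T
import lasagne

# Symbolic inputs: a batch of feature vectors and integer class labels.
input_var = T.matrix('inputs')
target_var = T.ivector('targets')

# Stack layers: input -> ReLU hidden layer -> softmax output.
network = lasagne.layers.InputLayer(shape=(None, 784), input_var=input_var)
network = lasagne.layers.DenseLayer(
    network, num_units=100, nonlinearity=lasagne.nonlinearities.rectify)
network = lasagne.layers.DenseLayer(
    network, num_units=10, nonlinearity=lasagne.nonlinearities.softmax)

# Training loss and plain SGD updates via lasagne.updates.
prediction = lasagne.layers.get_output(network)
loss = lasagne.objectives.categorical_crossentropy(
    prediction, target_var).mean()
params = lasagne.layers.get_all_params(network, trainable=True)
updates = lasagne.updates.sgd(loss, params, learning_rate=0.01)

# Compile a Theano function that performs one training step.
train_fn = theano.function([input_var, target_var], loss, updates=updates)

# One step on random data, just to show the call signature.
x = np.random.rand(32, 784).astype(theano.config.floatX)
y = np.random.randint(0, 10, size=32).astype('int32')
print(train_fn(x, y))
```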

v0.1

  • core contributors, in alphabetical order:
    • Eric Battenberg (@ebattenberg)
    • Sander Dieleman (@benanne)
    • Daniel Nouri (@dnouri)
    • Eben Olson (@ebenolson)
    • Aäron van den Oord (@avdnoord)
    • Colin Raffel (@craffel)
    • Jan Schlüter (@f0k)
    • Søren Kaae Sønderby (@skaae)
  • extra contributors, in chronological order:
    • Daniel Maturana (@dimatura): documentation, cuDNN layers, local response normalization (LRN)
    • Jonas Degrave (@317070): get_all_param_values() fix
    • Jack Kelly (@JackKelly): help with recurrent layers
    • Gábor Takács (@takacsg84): support broadcastable parameters in lasagne.updates
    • Diogo Moitinho de Almeida (@diogo149): MNIST example fixes
    • Brian McFee (@bmcfee): MaxPool2DLayer fix
    • Martin Thoma (@MartinThoma): documentation
    • Jeffrey De Fauw (@JeffreyDF): documentation, Adam fix
    • Michael Heilman (@mheilman): NonlinearityLayer, lasagne.random
    • Gregory Sanders (@instagibbs): documentation fix
    • Jon Crall (@erotemic): check for non-positive input shapes
    • Hendrik Weideman (@hjweide): set_all_param_values() test, MaxPool2DCCLayer fix
    • Kashif Rasul (@kashif): Adam simplification
    • Peter de Rivaz (@peterderivaz): documentation fix