GIMLeT – Gestural Interaction Machine Learning Toolkit
A set of Max patches for gesture analysis, interactive machine learning, and gesture–sound interaction design. GIMLeT features a modular design that makes it easy to share meaningfully structured data between gesture tracking devices, machine learning modules, and sound synthesis modules.
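The data shared between modules travels as OSC messages (the odot dependency handles OSC in Max). Purely as an illustration of the wire format involved, here is a minimal OSC 1.0 message encoder in Python; the address `/gimlet/accel/x` is a made-up example, not an actual GIMLeT namespace.

```python
import struct

def _pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC 1.0 spec."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def encode_osc(address: str, *args: float) -> bytes:
    """Encode an OSC message whose arguments are all 32-bit floats."""
    msg = _pad(address.encode("ascii"))              # address pattern
    msg += _pad(("," + "f" * len(args)).encode("ascii"))  # type tag string
    for a in args:
        msg += struct.pack(">f", a)                  # big-endian float32
    return msg

# Hypothetical address, not part of GIMLeT's actual OSC namespace:
packet = encode_osc("/gimlet/accel/x", 0.5)
```

In practice the odot externals build these messages for you inside Max; the sketch only shows what ends up on the wire.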
NOTE: the PoseNet implementation used in this package is now deprecated and is therefore unlikely to work. I don't think I will have time to fix this in the foreseeable future, so anyone's contribution is welcome; see this issue.
To install:
1. Place the odot folder in your /Max 8/Packages folder.
2. Place the modosc folder in your /Max 8/Packages folder.
3. Place the GIMLeT folder in your /Max 8/Packages folder.
4. Install the rapidmax external (see the dependencies below), required by the gimlet.ml.ann module.
5. Launch Max, click on Extras->"GIMLeT examples" on the menu bar, and choose an example.
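The folder-copying steps above can be scripted. This is a sketch under two assumptions: that the user packages folder lives at the macOS default `~/Documents/Max 8/Packages` (the "/Max 8/Packages" folder mentioned above), and that the downloaded package folders sit in the current directory.

```shell
# Assumed macOS default location of Max 8's user packages folder; adjust as needed.
PACKAGES="$HOME/Documents/Max 8/Packages"
mkdir -p "$PACKAGES"

# Copy each downloaded package folder, if present in the current directory.
for pkg in odot modosc GIMLeT; do
  if [ -d "$pkg" ]; then
    cp -R "$pkg" "$PACKAGES/"
  fi
done
```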
rapidmax : Max external for interactive machine learning
https://github.com/samparkewolfe/RapidMax (macOS)
https://github.com/MartinTownley/RapidMax_Windows (Windows)
petra : Max package for granular synthesis
https://github.com/CircuitMusicLabs/petra
Gesture Variation Follower (GVF) : adaptive gesture recognition library
https://github.com/bcaramiaux/ofxGVF
HfMT Optitrack OSC bridge (optional, if used with Optitrack Motive)
https://github.com/HfMT-ZM4/motion-tracking
A compiled build is available here; edit the .bat file with the IP address and port you want to send the OSC data to, then run it: https://github.com/HfMT-ZM4/motion-tracking/releases/download/0.0.1/motion-osc.zip
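Before wiring the bridge into Max, it can help to dump its output with a small UDP listener. The sketch below makes two assumptions: port 9000 stands in for whatever port you put in the .bat file, and the parser only handles plain OSC messages with float arguments (no bundles).

```python
import socket
import struct

def parse_osc(packet: bytes):
    """Parse a simple OSC message (address + float args); returns (address, args)."""
    end = packet.index(b"\x00")
    address = packet[:end].decode("ascii")
    offset = (end + 4) & ~3                 # skip padding to the next 4-byte boundary
    tend = packet.index(b"\x00", offset)
    tags = packet[offset + 1:tend].decode("ascii")  # drop the leading ","
    offset = (tend + 4) & ~3
    args = []
    for t in tags:
        if t == "f":                        # big-endian 32-bit float
            args.append(struct.unpack(">f", packet[offset:offset + 4])[0])
            offset += 4
    return address, args

def listen(port: int = 9000):               # placeholder port; match your .bat file
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        data, _ = sock.recvfrom(4096)
        print(parse_osc(data))
```

Run `listen()` while Motive streams, and each incoming message is printed as an (address, arguments) pair.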
Visi, F. G., & Tanaka, A. (2021). Interactive Machine Learning of Musical Gesture. In E. R. Miranda (Ed.), Handbook of Artificial Intelligence for Music: Foundations, Advanced Approaches, and Developments for Creativity. Springer.
Caramiaux, B., Montecchio, N., Tanaka, A., & Bevilacqua, F. (2014). Adaptive Gesture Recognition with Variation Estimation for Interactive Systems. ACM Transactions on Interactive Intelligent Systems, 4(4), 1–34. https://doi.org/10.1145/2643204
The project was initiated as a collaboration between Federico Visi and the Hochschule für Musik und Theater Hamburg, Germany, within the framework of the KiSS: Kinetics in Sound and Space project.
gimlet.mangle is based on a synth design by Atau Tanaka. The data recorder in gimlet.ml.ann is based on a design by Michael Zbyszyński.
Further development was carried out by FV as part of a postdoctoral research position at GEMM))) Gesture Embodiment and Machines in Music – Piteå School of Music – Luleå University of Technology, Sweden.
The package is being used and developed further in several projects including:
mail[at]federicovisi[dot]com