Android application which uses feature extraction algorithms and machine learning (SVM) to recognise and translate static sign language gestures.
The Sign Language app is an Android application which can translate static ASL and BSL signs, such as the fingerspelling alphabet. Translated signs are displayed to the user and can be strung together to construct sentences. This app is currently a proof of concept to illustrate low-cost, freely available and offline Sign Language recognition using purely visual data.
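To sketch the recognition idea: each camera frame is reduced to a numeric feature vector by the imaging kernel, and a Support Vector Machine trained on labelled examples predicts which sign the vector represents. The snippet below illustrates this train/predict cycle with scikit-learn and made-up feature values; it is not the app's actual code (the app runs its SVM on-device), and the feature numbers and labels here are purely illustrative assumptions.

```python
# Illustrative sketch only: an SVM classifying hand-gesture feature vectors.
# The toy 3-dimensional features below stand in for the descriptors the
# imaging kernel would extract from a frame; they are invented for clarity.
from sklearn.svm import SVC

# Two labelled example vectors per fingerspelling letter.
X_train = [
    [0.1, 0.9, 0.2], [0.2, 0.8, 0.1],  # examples of letter "A"
    [0.9, 0.1, 0.7], [0.8, 0.2, 0.9],  # examples of letter "B"
]
y_train = ["A", "A", "B", "B"]

# Train a linear SVM on the labelled features.
clf = SVC(kernel="linear")
clf.fit(X_train, y_train)

# Classify a new feature vector extracted from an unseen frame.
print(clf.predict([[0.15, 0.85, 0.15]])[0])  # close to the "A" cluster
```

The trained model is what the app persists and reloads between runs, which is the role the `trained.xml` file described below plays.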
The current beta version of this app can be tested here:
(Click the image below to watch the video demo)
1. Clone the repo onto your local machine using either:
   * HTTPS: `git clone https://github.com/Mquinn960/sign-language.git`
   * SSH: `git clone git@github.com:Mquinn960/sign-language.git`
2. Ensure the prerequisites below are installed/satisfied
3. If you're using Android Studio, load the project and hit run
### Prerequisites

* Ensure there is a trained model file (`trained.xml`) in the `sign-language\app\src\main\res\raw\` directory. This gets loaded when running the app for the first time.
* A `trained.xml` file, which contains the Machine Learning information required to make predictions about your Sign Language gestures, can be downloaded here.

### Training

Ensure the `trained.xml` file has been added to the app `raw` resources folder as per the Prerequisites. If you want to alter the Sign Language app and then use the Sign Language app's imaging kernel to train a new model with the Offline Trainer, you must first run the Gradle `make-jar` task:
1. In `sign-language\app\build.gradle`, change the `com.android.application` build step to `com.android.library`
2. Run the `make-jar` Gradle task
3. Take the resulting `sign-language\app\build\outputs\jar\app-release-null.jar` and import this into the Offline Trainer's "new" folder, as described in the repo README
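The plugin swap in step 1 amounts to a one-line change at the top of `app/build.gradle`. As a sketch (the repo's actual file may use a different plugin syntax):

```groovy
// Top of sign-language\app\build.gradle

// Normal APK builds:
apply plugin: 'com.android.application'

// Temporarily switch to the line below before running the make-jar task,
// so the module can be packaged as a library jar:
// apply plugin: 'com.android.library'
```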
4. Change the build step back to `com.android.application` and run the `app` task again.

### License

This project is licensed under the MIT License - see the LICENSE.md file for details.