A hand gesture to voice translation application made at UofT's Hackathon

UofTHacks-MyVoice

The idea for this project was born while brainstorming another possible project that used only the Amazon Echo, but an accessibility product seemed more meaningful. We chose to create a hand-gesture-to-speech project to help people who are unable to speak communicate more freely. We put ourselves in the shoes of a person who is mute and thought of everyday interactions that require speech to communicate efficiently; those scenarios showed how essential and useful this project could be for many people. We call our project MyVoice because it truly gives a person a voice, one driven by their hands instead of their mouth.

MyVoice translates simple hand gestures to speech so the user can communicate with another person. We use a Leap Motion sensor with its Python API to track hand gestures, a Python text-to-speech module to convert the recognized text to audio, and the Amazon Echo to play that audio loudly enough to be heard across an average room. We also built a website with the Python Flask framework to present our idea and demo video. We did not use American Sign Language (ASL) for our input gestures because it would have taken far more time and experience to implement; in addition, some ASL signs involve touching the face, which would have required additional APIs and extended the project timeline.
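The core pipeline (gesture in, speech out) can be sketched roughly as below. This is a minimal illustration, not the project's actual code: the gesture labels and phrases are hypothetical, and in the real project the labels would come from the Leap Motion Python API rather than being passed in directly. The sketch uses pyttsx3 as one possible Python text-to-speech module, falling back to printing if no TTS engine is available.

```python
# Hypothetical mapping from recognized hand gestures to spoken phrases.
# In the real project, gesture labels would be produced by the
# Leap Motion sensor's Python API.
GESTURE_PHRASES = {
    "open_palm": "Hello!",
    "fist": "Yes.",
    "two_fingers": "No.",
    "wave": "Goodbye!",
}


def gesture_to_text(gesture: str) -> str:
    """Translate a recognized gesture label into the phrase to speak."""
    return GESTURE_PHRASES.get(gesture, "Sorry, I did not catch that.")


def speak(text: str) -> None:
    """Speak the text aloud; fall back to printing if no TTS engine exists."""
    try:
        import pyttsx3  # one possible Python text-to-speech module

        engine = pyttsx3.init()
        engine.say(text)
        engine.runAndWait()
    except Exception:
        print(text)


if __name__ == "__main__":
    speak(gesture_to_text("open_palm"))
```

Keeping the gesture-to-phrase mapping in a plain dictionary makes it easy to add new gestures without touching the tracking or speech code.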

Links to our website (https://my-view.herokuapp.com) and demo video (https://youtu.be/du1fHo8R-w0).
