ASL and BSL Gesture Recognition with MediaPipe and Neural Networks is a project aimed at bridging communication gaps for people with hearing impairments. The application uses the MediaPipe library to extract hand landmarks from video and a neural network to classify them, translating American Sign Language (ASL) and British Sign Language (BSL) gestures into text or speech. It is intended as an inclusive tool that improves communication accessibility and promotes understanding across different sign languages.
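The sketch below illustrates how such a pipeline could be wired together, assuming the legacy MediaPipe Hands solution for landmark extraction, OpenCV for webcam capture, and a small TensorFlow/Keras classifier. The label set, layer sizes, and helper names (`LABELS`, `extract_landmarks`) are illustrative assumptions, not the project's actual code.

```python
# Rough sketch of a landmark-based sign classifier (illustrative, not the project's exact code).
import cv2
import numpy as np
import mediapipe as mp
import tensorflow as tf

mp_hands = mp.solutions.hands

# Assumption: one class per static alphabet gesture; replace with the real label set.
LABELS = [chr(ord("A") + i) for i in range(26)]

# Small feed-forward classifier over 21 hand landmarks x (x, y, z) = 63 features.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(63,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(len(LABELS), activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# In practice the model would first be trained on recorded landmark data, e.g.:
# model.fit(X_train, y_train, epochs=50, validation_split=0.2)

def extract_landmarks(frame_bgr, hands):
    """Return a flat (63,) array of hand-landmark coordinates, or None if no hand is found."""
    result = hands.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    if not result.multi_hand_landmarks:
        return None
    points = result.multi_hand_landmarks[0].landmark  # 21 landmarks per detected hand
    return np.array([[p.x, p.y, p.z] for p in points], dtype=np.float32).flatten()

# Live loop: webcam frame -> landmarks -> predicted sign overlaid on the frame.
cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        features = extract_landmarks(frame, hands)
        if features is not None:
            probs = model.predict(features[np.newaxis, :], verbose=0)[0]
            cv2.putText(frame, LABELS[int(np.argmax(probs))], (10, 40),
                        cv2.FONT_HERSHEY_SIMPLEX, 1.2, (0, 255, 0), 2)
        cv2.imshow("Sign recognition", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()
```

The same landmark-to-classifier structure extends to text-to-speech output by passing the predicted label to a speech library instead of (or in addition to) drawing it on the frame.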
Software Requirements :-
1. Operating System: Windows 7 or later
2. Programming Language: Python
Hardware Requirements :-
1. Processor: Intel Core i3
2. Hard Disk: 5 GB
3. Memory: 2 GB RAM