Abstract
Sign language is a means of communication for people who are unable to speak or hear, so improving support for these languages is widely recognized as influential across society. Worldwide, about 7,000 sign languages are in use, and many studies have been performed on different sign languages. This study considers American Sign Language (ASL) because of its popularity. We propose an efficient deep learning model that recognizes the hand gestures for the 26 alphabet letters in ASL to support communication. The proposed model has been assessed on a benchmark dataset and compared with many studies using the same dataset, and it achieved the highest accuracy among the contemporary models considered. We also observed that working with complex numbers improved performance by approximately 20% compared with configuring the model to work with real numbers while keeping its structure intact.
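To illustrate the complex-valued approach described in the abstract, below is a minimal sketch (not the authors' architecture) of a complex-valued classifier for 26-class ASL alphabet recognition. The layer names, hidden size, 28×28 input shape, and the magnitude-based readout are all illustrative assumptions; the common trick of representing a complex weight matrix by two real-valued matrices is what the example demonstrates.

```python
# Hypothetical sketch of a complex-valued network for 26 ASL letters.
# Sizes and names are assumptions, not taken from the paper.
import torch
import torch.nn as nn


class ComplexLinear(nn.Module):
    """Complex-valued linear layer built from two real-valued weight matrices.

    For weights W = A + iB and input z = x + iy:
        Wz = (Ax - By) + i(Ay + Bx)
    """

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.real = nn.Linear(in_features, out_features)
        self.imag = nn.Linear(in_features, out_features)

    def forward(self, x_re: torch.Tensor, x_im: torch.Tensor):
        out_re = self.real(x_re) - self.imag(x_im)
        out_im = self.real(x_im) + self.imag(x_re)
        return out_re, out_im


class ComplexASLClassifier(nn.Module):
    """Toy complex-valued network: flattened gesture image -> 26 letter scores."""

    def __init__(self, in_features: int = 28 * 28, hidden: int = 128, classes: int = 26):
        super().__init__()
        self.fc1 = ComplexLinear(in_features, hidden)
        self.fc2 = ComplexLinear(hidden, classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Treat the real image as the real part; the imaginary part starts at zero.
        re, im = x.flatten(1), torch.zeros_like(x.flatten(1))
        re, im = self.fc1(re, im)
        re, im = torch.relu(re), torch.relu(im)      # split ReLU on both parts
        re, im = self.fc2(re, im)
        return torch.sqrt(re ** 2 + im ** 2 + 1e-9)  # magnitude as class score


if __name__ == "__main__":
    model = ComplexASLClassifier()
    dummy = torch.rand(4, 1, 28, 28)                 # batch of grayscale gesture images
    print(model(dummy).shape)                        # torch.Size([4, 26])
```

Keeping the same layer structure and swapping `ComplexLinear` for plain `nn.Linear` gives the real-valued baseline configuration that the abstract compares against.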
Original language | English |
---|---|
Title of host publication | 2024 International Conference on Decision Aid Sciences and Applications (DASA) |
Publisher | Institute of Electrical and Electronics Engineers (IEEE) |
Pages | 1-5 |
Number of pages | 5 |
ISBN (Electronic) | 9798350369106 |
ISBN (Print) | 979-8-3503-6911-3 |
DOIs | |
Publication status | Published - 17 Jan 2025 |
Event | 2024 International Conference on Decision Aid Sciences and Applications (DASA) - Manama, Bahrain. Duration: 11 Dec 2024 → 12 Dec 2024 |
Conference
Conference | 2024 International Conference on Decision Aid Sciences and Applications (DASA) |
---|---|
Country/Territory | Bahrain |
City | Manama |
Period | 11/12/24 → 12/12/24 |
Keywords
- American Sign Language (ASL)
- Artificial Intelligence (AI)
- Deep Learning
- Feature Extraction