NUST Institutional Repository

Computer Vision & EMG Based Prosthetic Hand with Haptic Feedback


dc.contributor.author Supervisor Dr. Usman Akram, Imaan Shahid, Syed Ali John Naqvi, Hassan Ashraf, Bilal Yousaf
dc.date.accessioned 2024-05-10T12:10:27Z
dc.date.available 2024-05-10T12:10:27Z
dc.date.issued 2023
dc.identifier.other DE-COMP-41
dc.identifier.uri http://10.250.8.41:8080/xmlui/handle/123456789/43294
dc.description Supervisor Dr. Usman Akram en_US
dc.description.abstract The rise in upper limb amputations has driven major growth in the use of prosthetic hands worldwide. Amputations can result from a variety of disorders, including cancer, diabetes, congenital deformities, infections, and vascular diseases, as well as from severe injuries sustained in workplace or road accidents; in some regions, violent disputes are a further cause. Prosthetic hands are prescribed to address these needs, yet many current models have frustratingly complex control schemes, scant sensory feedback, and unnatural movement patterns that drive users away. In this capstone project, we present a comprehensive solution that offers multiple ways to operate a prosthetic hand. In the first control approach, a camera and computer vision algorithms interpret hand movements to identify the appropriate grip pattern, enabling gesture-based control. The second control method uses electromyography (EMG) signals produced by the remaining muscles of the amputated limb; surface electrodes record these signals, which are then analyzed to select grip patterns precisely. As a third control option, a user-friendly mobile application lets users manually select and adjust grip patterns according to their preferences. By integrating camera-based control, EMG signal analysis, and a mobile app interface, the project offers a flexible, user-centric approach to prosthetic hand control that lowers the learning curve and improves usability for people with upper limb amputations, enabling them to carry out daily tasks with greater ease and efficiency. The mechanical design of the prosthetic hand, which has six degrees of freedom, was also substantially improved. The thumb had previously been a problem because its size made it difficult to fit the hand inside a glove, so a new thumb mechanism was designed to minimize its size while preserving functionality. This redesign allows the six-degree-of-freedom hand to be worn seamlessly under a glove, improving the user's comfort and the hand's usefulness. A haptic feedback system was a further key addition. By giving users a sense of touch, it aims to improve both the overall experience and the prosthetic hand's functionality: the hand relays tactile sensations to the user, so texture, pressure, and other tactile information can be felt while interacting with objects. With haptic feedback, the prosthetic hand offers a more intuitive and realistic experience, enabling users to carry out a wider range of activities and navigate their surroundings more effectively. en_US
dc.language.iso en en_US
dc.publisher College of Electrical and Mechanical Engineering (CEME), NUST en_US
dc.title Computer Vision & EMG Based Prosthetic Hand with Haptic Feedback en_US
dc.type Project Report en_US
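
The abstract's first control path, camera-based grip selection, could be prototyped roughly as in the sketch below. This is only an illustrative outline, assuming the opencv-python and mediapipe Python packages; the finger-counting heuristic, the grip names, and the camera index are hypothetical stand-ins, since the report's actual vision pipeline is not described in this record.

# Illustrative only: gesture-based grip selection from a webcam.
import cv2
import mediapipe as mp

# Hypothetical mapping from the number of extended fingers to a grip pattern;
# the report's actual grip set is not listed in this record.
GRIP_BY_FINGER_COUNT = {0: "power", 2: "pinch", 3: "tripod", 5: "open"}

def count_extended_fingers(landmarks) -> int:
    """Crude heuristic: a finger counts as extended if its tip lies above its PIP joint."""
    tips, pips = (8, 12, 16, 20), (6, 10, 14, 18)
    count = sum(landmarks[t].y < landmarks[p].y for t, p in zip(tips, pips))
    count += landmarks[4].x < landmarks[3].x  # thumb: compare tip and IP joint along x
    return count

cap = cv2.VideoCapture(0)  # camera index 0 is an assumption
with mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.6) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_hand_landmarks:
            lm = result.multi_hand_landmarks[0].landmark
            grip = GRIP_BY_FINGER_COUNT.get(count_extended_fingers(lm), "open")
            print("selected grip:", grip)  # placeholder for sending the command to the hand
        cv2.imshow("camera", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
            break
cap.release()
cv2.destroyAllWindows()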
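
For the EMG control path, a minimal signal-processing chain might look like the following sketch, assuming NumPy and SciPy. The sampling rate, filter band, and open/close threshold are illustrative assumptions rather than values from the report, and a real controller would classify several grip patterns instead of a single open/close decision.

# Illustrative only: envelope-based open/close decision from surface EMG.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000.0  # assumed sampling rate in Hz

def bandpass(x, lo=20.0, hi=450.0, fs=FS, order=4):
    """Keep the typical surface-EMG band; reject motion artifact and high-frequency noise."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def envelope(x, cutoff=5.0, fs=FS, order=2):
    """Full-wave rectify, then low-pass filter to obtain a slowly varying activation envelope."""
    b, a = butter(order, cutoff / (fs / 2), btype="low")
    return filtfilt(b, a, np.abs(x))

def grip_command(window, close_threshold=0.15):
    """Map the mean envelope of one signal window to a simple open/close command."""
    env = envelope(bandpass(window))
    return "close" if env.mean() > close_threshold else "open"

# Usage with a synthetic one-second window standing in for electrode samples.
t = np.arange(0, 1, 1 / FS)
fake_emg = 0.5 * np.random.randn(t.size) * (t > 0.5)  # "contraction" in the second half
print(grip_command(fake_emg))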
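
The haptic feedback described in the abstract relays sensed fingertip pressure to the user as a tactile cue. The small sketch below assumes a force reading in newtons and a vibration motor driven by a duty cycle in [0, 1]; the dead-band and saturation values are hypothetical, not taken from the report.

# Illustrative only: mapping fingertip force to vibrotactile feedback intensity.
def force_to_vibration(force_n: float, max_force_n: float = 20.0) -> float:
    """Return a vibration-motor duty cycle in [0, 1] for a fingertip force in newtons.

    A small dead-band hides sensor noise; above it, intensity grows linearly and
    saturates so that firm grasps do not produce an uncomfortably strong cue.
    """
    DEAD_BAND_N = 0.5  # assumed noise floor of the force sensor
    if force_n <= DEAD_BAND_N:
        return 0.0
    return min((force_n - DEAD_BAND_N) / (max_force_n - DEAD_BAND_N), 1.0)

# Example: a light touch, a moderate grasp, and a saturating squeeze.
for f in (0.2, 5.0, 40.0):
    print(f"{f:5.1f} N -> duty cycle {force_to_vibration(f):.2f}")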

