Smart Presentation Management Using Gesture Recognition and Machine Learning


Ammar Aldallal, Qassim Abdali

Abstract

With the growing reliance on visual information, computer vision offers solutions to complex problems and enhances user experiences across multiple domains. This paper focuses on integrating hand gestures into software applications to improve human-computer interaction. Gesture recognition is vital for a range of applications, including gaming, virtual and augmented reality, assisted living, cognitive development assessment, and industrial uses such as human-robot interaction and autonomous vehicle control. This research presents a system for controlling PowerPoint presentations using hand gestures detected through computer vision. Using OpenCV and MediaPipe, the system analyzes webcam video to track hand movements in real time and converts specific gestures into commands. The proposed system offers an intuitive and engaging alternative to traditional input devices, allowing presenters to navigate slides, annotate content, control a pointer, and zoom using natural hand gestures. The implementation accurately recognizes hand gestures and triggers the corresponding actions, as validated through practical examples and high recognition accuracy across its functions. This approach addresses the high cost of smart boards and the need for more interactive presentation tools, particularly in educational settings, enhancing engagement and accessibility.
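To illustrate the pipeline the abstract describes (webcam capture with OpenCV, hand-landmark tracking with MediaPipe, gestures mapped to presentation commands), the following is a minimal sketch. It assumes a Python implementation and uses pyautogui to send arrow-key presses to the presentation software; pyautogui and the specific gesture-to-command mapping (one raised finger for next slide, two for previous) are illustrative assumptions, not the paper's exact design.

```python
# Minimal sketch: MediaPipe hand tracking mapped to slide-navigation keystrokes.
# Assumptions: Python, pyautogui for keystrokes, illustrative gesture mapping.
import time

import cv2
import mediapipe as mp
import pyautogui  # assumption: sends arrow-key presses to the active presentation

mp_hands = mp.solutions.hands


def count_raised_fingers(hand_landmarks):
    """Count non-thumb fingers whose tip lies above its PIP joint (image y grows downward)."""
    tips, pips = [8, 12, 16, 20], [6, 10, 14, 18]
    lm = hand_landmarks.landmark
    return sum(1 for tip, pip in zip(tips, pips) if lm[tip].y < lm[pip].y)


def main():
    cap = cv2.VideoCapture(0)   # webcam feed
    last_trigger = 0.0          # cooldown so a held gesture fires only once per second
    with mp_hands.Hands(max_num_hands=1,
                        min_detection_confidence=0.7,
                        min_tracking_confidence=0.7) as hands:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            frame = cv2.flip(frame, 1)  # mirror view is more natural for the presenter
            results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))

            if results.multi_hand_landmarks and time.time() - last_trigger > 1.0:
                fingers = count_raised_fingers(results.multi_hand_landmarks[0])
                if fingers == 1:
                    pyautogui.press("right")  # next slide
                    last_trigger = time.time()
                elif fingers == 2:
                    pyautogui.press("left")   # previous slide
                    last_trigger = time.time()

            cv2.imshow("Gesture control (press q to quit)", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
    cap.release()
    cv2.destroyAllWindows()


if __name__ == "__main__":
    main()
```

The paper's further functions (annotation, pointer control, zoom) would follow the same pattern: classify a landmark configuration, then dispatch the matching command.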
