Controlling Computer Features Through Hand Gesture

C. V. Suresh Babu, J. Sivaneshwaran, Gokul Krishnan, Keerthi Varshaan, D. Anirudhan
DOI: 10.4018/978-1-6684-8938-3.ch005

Abstract

This chapter introduces an AI-driven hand gesture recognition system designed to enhance computer settings control, prioritizing improved accessibility and user experiences. Leveraging machine learning algorithms trained on a dataset of relevant hand gestures (e.g., volume and brightness control), this project emphasizes data analysis for trend identification and system refinement. Successful outcomes could stimulate further research and innovation, potentially revolutionizing accessibility and user experience solutions. Ultimately, this endeavor aims to empower computer users with a more intuitive and accessible means of adjusting settings, contributing significantly to human-computer interaction advancement.

2. Rationale and Background

The rationale behind this project is rooted in the growing demand for user-friendly and intuitive interfaces to interact with technology. As technology becomes more pervasive, people increasingly expect devices to be intuitive and easy to use. Traditional interfaces like keyboards and touchpads can be cumbersome and slow, especially for simple tasks such as adjusting a laptop's brightness and volume settings.

Hand gesture recognition offers a promising solution to this issue, enabling users to interact with technology in a more natural and intuitive manner. While hand gesture recognition technology has been around for years and has found applications in gaming, healthcare, and the automotive industry, its application to control laptop functions, specifically brightness and volume settings, remains relatively unexplored. This project aims to fill that gap by developing a software-based solution capable of accurately and swiftly recognizing hand gestures to control a laptop's brightness and volume settings.

In this project, we have built a gesture-based interaction layer between users and computers, allowing users to control computer features through simple hand motions. When a user performs a hand gesture, we use the Python libraries MediaPipe and OpenCV to track the hand's posture, landmarks, segments, and individual fingers. This tracking data enables precise control of computer features in line with the user's intent.
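The core idea above can be sketched in a few lines of Python. MediaPipe's 21-point hand model assigns fixed indices to each landmark (index 4 is the thumb tip, index 8 is the index-finger tip); a common pattern is to map the "pinch" distance between those two tips to a 0–100 control level for volume or brightness. The distance thresholds below are illustrative assumptions, not values from the chapter, and the `pinch_distance`/`distance_to_level` helper names are ours:

```python
import math

# Landmark indices in MediaPipe's 21-point hand model.
THUMB_TIP, INDEX_TIP = 4, 8

def pinch_distance(landmarks):
    """Euclidean distance between the thumb tip and index-finger tip.

    `landmarks` is a sequence of (x, y) pairs in normalised image
    coordinates, as MediaPipe Hands reports them.
    """
    (x1, y1), (x2, y2) = landmarks[THUMB_TIP], landmarks[INDEX_TIP]
    return math.hypot(x2 - x1, y2 - y1)

def distance_to_level(dist, d_min=0.03, d_max=0.30):
    """Map a pinch distance to a 0-100 level (e.g. volume or brightness).

    d_min/d_max are illustrative calibration bounds: distances at or
    below d_min map to 0, at or above d_max map to 100.
    """
    dist = min(max(dist, d_min), d_max)
    return round(100 * (dist - d_min) / (d_max - d_min))
```

In a full pipeline, OpenCV's `cv2.VideoCapture` supplies webcam frames to MediaPipe Hands, and the resulting level drives the OS volume or brightness API; keeping the geometry-to-level mapping in a pure function like this makes it easy to test and recalibrate independently of the camera loop.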

The development of this technology offers significant potential benefits, including improved user productivity, reduced physical strain, and an enhanced user experience. By harnessing the capabilities of hand gesture recognition technology, this project seeks to provide an innovative solution that enhances the way users interact with their laptops. Moreover, this project holds implications for the broader field of human-computer interaction, showcasing how technology can adapt to meet users' needs and preferences.
