
Hand gestures recognised by a webcam-based system could expand computer input at the keyboard

Researchers are working on a new technique that allows computers to respond to commands given through hand gestures.

The "Typealike" prototype utilises a standard laptop webcam with a simple attached mirror. The programme detects the user's hands near or beside the keyboard and prompts operations based on their location.

For example, if a user places their right hand near the keyboard with the thumb pointing up, the application interprets this as a signal to increase the volume. Different gestures, and combinations of gestures, can be mapped to different tasks.
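To make the idea concrete, here is a minimal sketch of how recognised gesture labels could be dispatched to commands such as volume control. It is purely illustrative: the gesture names and the adjust_volume and toggle_mute helpers are assumptions for this example, not part of the published Typealike system.

```python
# Hypothetical sketch: dispatch recognised gesture labels to actions.
# None of these names come from the Typealike system; they are placeholders.

def adjust_volume(step: int) -> None:
    # Stand-in for a real OS volume call (platform-specific in practice).
    print(f"Volume changed by {step}")

def toggle_mute() -> None:
    print("Mute toggled")

# Map a (hand, pose) pair to a callable, mirroring the article's example of
# "right hand near the keyboard, thumb pointing up" meaning "volume up".
GESTURE_ACTIONS = {
    ("right", "thumb_up"): lambda: adjust_volume(+5),
    ("right", "thumb_down"): lambda: adjust_volume(-5),
    ("left", "flat_palm"): toggle_mute,
}

def handle_gesture(hand: str, pose: str) -> None:
    action = GESTURE_ACTIONS.get((hand, pose))
    if action is not None:
        action()

if __name__ == "__main__":
    # Simulate a recogniser reporting "right hand, thumb up".
    handle_gesture("right", "thumb_up")
```

In the real prototype the hand-and-pose labels would come from the gesture recogniser described below rather than being passed in by hand.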

Human-computer interaction (HCI) innovation strives to make user experiences faster and smoother, with less reliance on keyboard shortcuts or mouse and trackpad operations.

"It began with a simple notion about new methods to use a webcam," said Nalin Chhibber, a recent master's graduate from the Cheriton School of Computer Science at the University of Waterloo. "The webcam is focused at your face, but the majority of computer interaction revolves around your hands. So we wondered what we could do if the webcam could recognise hand motions."

That initial insight led to the creation of a small mechanical attachment that redirects the camera downwards, towards the hands. The team then developed a software program that can recognise distinct hand gestures in a variety of conditions and for different users, training the Typealike algorithm with machine-learning techniques.

"It's a neural network, so you need to provide the algorithm samples of what you're trying to identify," explained Fabrice Matulic, a senior researcher at Preferred Networks Inc. and a former postdoctoral researcher at Waterloo. "Because people make gestures differently and hands vary in size, you need to collect a lot of data from a variety of people in a variety of lighting conditions."
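To illustrate what "providing the algorithm samples" can look like in practice, the sketch below trains a small convolutional classifier on synthetic labelled image tensors using PyTorch. The architecture, number of gesture classes, image size, and data are all placeholders chosen for the example; the actual Typealike model and dataset are not described here.

```python
# Illustrative only: a tiny convolutional classifier trained on synthetic
# "gesture image" tensors. Every detail below is an assumption, not a
# description of the real Typealike model.
import torch
import torch.nn as nn

NUM_GESTURES = 8      # hypothetical number of gesture classes
IMAGE_SIZE = 64       # hypothetical size of the cropped hand region

model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(2),                      # 64 -> 32
    nn.Conv2d(16, 32, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(2),                      # 32 -> 16
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, NUM_GESTURES),
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Synthetic stand-in for the many labelled samples the researchers describe
# collecting from different people under different lighting conditions.
images = torch.randn(256, 1, IMAGE_SIZE, IMAGE_SIZE)
labels = torch.randint(0, NUM_GESTURES, (256,))

for epoch in range(5):
    optimizer.zero_grad()
    logits = model(images)
    loss = loss_fn(logits, labels)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss={loss.item():.3f}")
```

In a real setup the random tensors would be replaced by cropped hand images labelled with their gesture, gathered from many users and lighting conditions as described above.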

With the help of dozens of research volunteers, the researchers created a database of hand motions. They also asked the volunteers to participate in tests and questionnaires to help the team figure out how to make the app more functional and versatile.

Daniel Vogel, an associate professor of computer science at Waterloo, said, "We're always trying to develop technology that people can use easily. When people see Typealike or other breakthroughs in human-computer interaction, they say, 'It just makes sense.' That's exactly what we're aiming for. We want to create technology that is simple and intuitive, yet this often requires extensive research and complicated software."

The Typealike program might also be used in virtual reality to eliminate the need for hand-held controllers, according to the researchers.
