A Gesture-Based System Puts AI in the Classroom

This article is part of our exclusive IEEE Journal Watch series in partnership with IEEE Xplore.

For as long as classrooms have existed, students have dutifully crossed their “t’s” and dotted their “i’s” on paper using pens, pencils, or paint. But a group of researchers in Taiwan envision a very different approach to learning via their AI edge-computing program.

The platform allows students to draw their handwriting and artwork in midair with their fingers, while motion-tracking technology projects their writing onto a computer screen at the front of the classroom. The approach, described in a study published on 10 April in the IEEE Canadian Journal of Electrical and Computer Engineering, could help teachers manage and teach large groups of students more efficiently, for example by letting them see the real-time results of multiple students’ work on a large screen simultaneously.

Liang-Bi Chen is an associate professor and chair of the department of computer science and information engineering at the National Penghu University of Science and Technology, in Magong City, Taiwan. He notes there are several different challenges that can hinder teaching efficiency in classrooms, especially as individual teachers become responsible for larger numbers of students.

“These [challenges] include limited interaction between teachers and students, and students struggling to understand lesson content in real time,” Chen says. Also, classroom supplies can be expensive, and sharing of classroom supplies can be unhygienic, he notes.

AI Tools for Education

The new system proposed by Chen and his colleagues could help address all of these issues. Each student is provided with a device that has a screen and a webcam; the webcam tracks 21 joint points of the hand in detail via the MediaPipe hand-tracking library. An AI model identifies specific hand gestures and recognizes when users want to switch modes (for example, switching from writing to selecting a color). An individual student’s work appears on the screen in front of them, but it can also be transmitted to the larger screen at the front of the classroom.
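To give a sense of what the webcam feed yields, here is a minimal sketch of how 21 MediaPipe-style hand landmarks might be interpreted to tell which fingers are raised. It assumes MediaPipe’s standard landmark layout (0 = wrist, 8 = index fingertip, and so on, in normalized image coordinates where y grows downward); the helper name and tip-above-joint heuristic are illustrative assumptions, not the authors’ actual model.

```python
# Fingertip and middle-joint (PIP) landmark indices for the four fingers,
# following MediaPipe's standard 21-point hand-landmark layout.
FINGERS = {
    "index": (8, 6),
    "middle": (12, 10),
    "ring": (16, 14),
    "pinky": (20, 18),
}

def extended_fingers(landmarks):
    """Return the names of fingers whose tip sits above its PIP joint.

    `landmarks` is a list of 21 (x, y) pairs in normalized image
    coordinates, where y increases downward (as in MediaPipe).
    """
    raised = []
    for name, (tip, pip) in FINGERS.items():
        if landmarks[tip][1] < landmarks[pip][1]:  # tip is higher on screen
            raised.append(name)
    return raised
```

A real pipeline would feed this from each webcam frame, but the same geometric test works on any source of landmark coordinates.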

Students raise a finger to enter “drawing mode” and clench their fist when they want to select a different color. If the system detects finger movement toward the virtual menu area at the edge of the canvas, it triggers menu operations. As each student makes hand motions in midair, their virtual writing or art is projected onto a large screen at the front of the classroom.
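The mode-switching rules described above could be sketched as a simple classifier. The menu-strip bounds, threshold value, and function name below are assumptions for illustration, not the study’s implementation.

```python
# Assumed layout: the rightmost 10% of the normalized canvas is the menu strip.
MENU_X_MIN = 0.9

def classify_mode(extended, index_tip):
    """Map a detected hand pose to an interaction mode.

    `extended` is the list of raised-finger names (e.g. ["index"]);
    `index_tip` is the (x, y) of the index fingertip in normalized
    coordinates.
    """
    if index_tip[0] > MENU_X_MIN:
        return "menu"            # fingertip moved into the menu strip
    if extended == ["index"]:
        return "draw"            # one raised finger enters drawing mode
    if not extended:
        return "select_color"    # a clenched fist opens the color picker
    return "idle"                # any other pose is ignored
```

In practice a system like this would also smooth the gesture signal over several frames to avoid flickering between modes.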

“The teacher receives real-time video and drawing results from each student, which are integrated and displayed on a large screen to facilitate synchronized teaching and interaction,” explains Chen, adding that the students’ writing results are automatically uploaded to a cloud platform, allowing their handwriting to be evaluated after class as well.

In…

The post “A Gesture-Based System Puts AI in the Classroom” by Michelle Hampson was published on 06/05/2025 by spectrum.ieee.org