TY - GEN
T1 - An Application of Machine Learning and Image Processing to Automatically Detect Teachers’ Gestures
AU - Hernández Correa, Josefina
AU - Farsani, Danyal
AU - Araya, Roberto
N1 - Publisher Copyright:
© 2020, Springer Nature Switzerland AG.
PY - 2020/1/1
Y1 - 2020/1/1
N2 - Providing teachers with detailed feedback about their gesticulation in class requires either one-on-one expert coaching or highly trained observers to hand-code classroom recordings. These methods are time-consuming, expensive, and require considerable human expertise, making them very difficult to scale to large numbers of teachers. Applying machine learning and image processing, we develop a non-invasive detector of teachers’ gestures. We use a multi-stage approach for the spotting task. Lessons recorded with a standard camera are processed offline with the OpenPose software. Next, using a gesture classifier trained with machine learning on a previous training set, we found that on new lessons the precision rate is between 54% and 78%. The accuracy depends on the training and testing datasets used. Thus, we found that with an accessible, non-invasive, and inexpensive automatic gesture recognition methodology, an automatic lesson observation tool can be implemented to detect possible teachers’ gestures. Combined with other technologies, such as speech recognition and text mining of the teacher discourse, a powerful and practical tool can be offered to provide private and timely feedback to teachers about communication features of their teaching practices.
KW - Automatic teacher’s gesture detection
KW - Intelligent image processing
KW - Machine learning
KW - OpenPose
KW - Pattern recognition
UR - http://www.scopus.com/inward/record.url?scp=85097075474&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-63119-2_42
DO - 10.1007/978-3-030-63119-2_42
M3 - Conference contribution
AN - SCOPUS:85097075474
SN - 9783030631185
T3 - Communications in Computer and Information Science
SP - 516
EP - 528
BT - Advances in Computational Collective Intelligence - 12th International Conference, ICCCI 2020, Proceedings
A2 - Hernes, Marcin
A2 - Wojtkiewicz, Krystian
A2 - Szczerbicki, Edward
PB - Springer Science and Business Media Deutschland GmbH
T2 - 12th International Conference on Computational Collective Intelligence, ICCCI 2020
Y2 - 30 November 2020 through 3 December 2020
ER -