Open Access

Analysis of Spatiotemporal Characteristics of Student Concentration Based on Emotion Evolution


DOI: 10.23977/acss.2023.070711

Author(s)

Tieming Xiang 1, Hongjian Ji 1, Jing Sheng 1

Affiliation(s)

1 Xiamen University of Technology, Xiamen, China

Corresponding Author

Hongjian Ji

ABSTRACT

Detecting students' concentration in the classroom can help teachers quickly gauge student participation and engagement. However, student concentration exhibits complex spatiotemporal distribution and evolution patterns, which are challenging to identify and quantify. This paper proposes a novel student concentration evaluation method based on emotion evolution and virus transmission, which analyzes the spatiotemporal characteristics of concentration. The research contents are as follows: (1) A visual emotion classification method based on a deep learning algorithm is developed to identify and quantify the emotional changes of each student. (2) Based on the quantified emotion results, a concentration index model incorporating virus transmission theory is established and used to explore how student concentration spreads across the spatial and temporal dimensions. (3) The Wilcoxon rank sum test (RST) is used to verify the difference between the results calculated by the concentration index model in different semesters, and the reliability of the model is reflected by the Pearson correlation coefficient between the centroid of the spatiotemporal distribution of concentration and final exam results. Experiments covering 64 offline course sessions were carried out with the same class over two semesters. The results show that, in the spatial dimension, a student's concentration can be affected by negative and positive emotions from different regions, while in the temporal dimension, high concentration levels decrease as course time increases, and this decline is further exacerbated once spatial factors are coupled in.
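The abstract does not give the exact form of the concentration index model, but the virus-transmission analogy it describes can be illustrated with a minimal, SIR-inspired sketch: each seat holds a concentration value in [0, 1], low-concentration ("distracted") neighbours exert an infection-like pressure, and a small recovery term pulls students back toward full attention. All names and parameters here (`beta`, `gamma`, the 4-connected seating grid) are illustrative assumptions, not the authors' actual model.

```python
# Toy SIR-inspired spread of distraction across a classroom seating grid.
# Parameters and update rule are illustrative assumptions only; the paper's
# actual concentration index model is not specified in the abstract.

def neighbors(r, c, rows, cols):
    """Yield the 4-connected neighbouring seats of (r, c)."""
    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < rows and 0 <= nc < cols:
            yield nr, nc

def step(conc, beta=0.3, gamma=0.05):
    """One time step: distracted neighbours drag a student's concentration
    down (infection term, rate beta), while a recovery term (rate gamma)
    pulls it back toward full attention (1.0)."""
    rows, cols = len(conc), len(conc[0])
    nxt = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            nbrs = list(neighbors(r, c, rows, cols))
            # Mean "infection pressure" from low-concentration neighbours.
            pressure = sum(1.0 - conc[nr][nc] for nr, nc in nbrs) / len(nbrs)
            value = (conc[r][c]
                     - beta * pressure * conc[r][c]
                     + gamma * (1.0 - conc[r][c]))
            nxt[r][c] = min(1.0, max(0.0, value))
    return nxt

# A 3x3 classroom: everyone attentive except one distracted corner student.
grid = [[1.0, 1.0, 1.0],
        [1.0, 1.0, 1.0],
        [1.0, 1.0, 0.1]]
for _ in range(10):
    grid = step(grid)

mean_conc = sum(sum(row) for row in grid) / 9
```

In this toy model a fully attentive class is a fixed point, but a single distracted student gradually drags neighbouring seats (and eventually the far corner) below full concentration, loosely mirroring the abstract's finding that concentration declines over time and that spatial coupling between regions accelerates the decline.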

KEYWORDS

Emotion recognition; spatiotemporal characteristics; virus transmission theory; teaching; computer vision

CITE THIS PAPER

Tieming Xiang, Hongjian Ji, Jing Sheng, Analysis of Spatiotemporal Characteristics of Student Concentration Based on Emotion Evolution. Advances in Computer, Signals and Systems (2023) Vol. 7: 89-102. DOI: http://dx.doi.org/10.23977/acss.2023.070711.




All published work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2016 - 2031 Clausius Scientific Press Inc. All Rights Reserved.