Abstract:
Facial expression is one of the metrics used in teacher assessment because it is a non-verbal aspect of communication, and communication is an essential part of teaching. However, teacher assessment has not previously used a mobile application with facial expression recognition. Our research aims to develop a mobile facial expression recognition application for teacher assessment with optimal inference time. The first step of our research was to obtain the Jonathan Oheix face
expression recognition dataset from Kaggle, which has seven labels: ‘angry,’ ‘disgust,’ ‘fear,’ ‘happy,’ ‘neutral,’ ‘sad,’ and
‘surprise.’ This dataset is used to train a ResNet-50 model for facial expression recognition. We compare ResNet-50 against two shallow learning methods, k-Nearest Neighbor (KNN) and Support Vector Machine (SVM), and against two pre-trained deep learning methods, MobileNetV2 and SE-ResNet-50. The metrics we compare are accuracy, inference
time, and frame rate. The test results show that ‘fear’ has the best recall and ‘neutral’ the worst, while ‘disgust’ has the best precision and ‘fear’ the worst. ‘Happy’ is the label with the best F1-score, at 0.56. Compared with SVM, KNN, SE-ResNet-50, and MobileNetV2, ResNet-50 achieves the best accuracy, 0.5314. ResNet-50 has a worse inference time and frame rate than MobileNetV2; however, its frame rate of 946 fps is still well above the 15 fps generally considered good. Our research is the first facial expression recognition study for teacher assessment that uses the ResNet-50 model on the Jonathan Oheix dataset and provides a mobile application.
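
The abstract reports accuracy, inference time, and frame rate for a ResNet-50 classifier over the seven expression labels. The sketch below illustrates, under assumptions not stated here (PyTorch/torchvision as the framework, 224x224 RGB input, batch size 1, CPU execution), how such an inference-time and frame-rate measurement could be set up; measure_fps is a hypothetical helper for illustration, not the authors' implementation.

# Minimal sketch (assumed toolchain: PyTorch/torchvision) of measuring inference
# time and frame rate for a ResNet-50 adapted to the seven expression labels.
import time

import torch
from torchvision import models

NUM_CLASSES = 7  # angry, disgust, fear, happy, neutral, sad, surprise

# ResNet-50 backbone with its final fully connected layer replaced so it
# outputs seven expression scores instead of the 1000 ImageNet classes.
model = models.resnet50()
model.fc = torch.nn.Linear(model.fc.in_features, NUM_CLASSES)
model.eval()

def measure_fps(model, n_frames=100):
    """Return (mean inference time in ms, frames per second) on random 224x224 RGB input."""
    dummy = torch.randn(1, 3, 224, 224)  # assumed input size; not specified in the abstract
    with torch.no_grad():
        model(dummy)  # warm-up pass
        start = time.perf_counter()
        for _ in range(n_frames):
            model(dummy)
        elapsed = time.perf_counter() - start
    return (elapsed / n_frames) * 1000.0, n_frames / elapsed

ms_per_frame, fps = measure_fps(model)
print(f"inference time: {ms_per_frame:.2f} ms/frame, frame rate: {fps:.1f} fps")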