Abstract:
This paper presents an empirical study of advanced Deep Neural Network (DNN) models, with a focus on identifying
potential baseline models for efficient deployment in resource-constrained environments (RCE). The systematic evaluation
covers ten state-of-the-art pre-trained DNN models (ResNet50, InceptionResNetV2, InceptionV3, MobileNet, MobileNetV2,
EfficientNetB0, EfficientNetB1, EfficientNetB2, DenseNet121, and Xception) in an RCE setting. Evaluation
criteria, namely parameter count (indicating model complexity), storage footprint (storage requirements), CPU time (critical for real-time
applications), and accuracy (prediction correctness), are assessed through systematic experimental procedures. The results
highlight MobileNet's excellent trade-off between accuracy and resource requirements, particularly CPU time and storage
consumption, in experiments where image predictions are performed on an RCE device. Consequently, MobileNet emerges
as a suitable baseline model for future DNNs developed specifically for RCE image classification. The study further endorses
MobileNet as a starting point for transfer learning in DNN design, providing valuable insights for optimizing DNN models in
resource-constrained scenarios and supporting the creation of efficient, lightweight DNN architectures. Future research will
leverage MobileNet as a foundation for a new DNN model tailored to efficiency-driven image classification on RCE devices.