Improving Convolutional Neural Network (CNN) architecture (miniVGGNet) with Batch Normalization and Learning Rate Decay Factor for Image Classification

  • Asmida Ismail Universiti Putra Malaysia
  • Siti Anom Ahmad Universiti Putra Malaysia
  • Azura Che Soh Universiti Putra Malaysia
  • Khair Hassan Universiti Putra Malaysia
  • Hazreen Haizi Harith Universiti Putra Malaysia
Keywords: Convolutional Neural Network, deep learning, MiniVGGNet, hyperparameter

Abstract

Image classification is a classical problem in image processing, computer vision, and machine learning. This paper analyses the performance of a Convolutional Neural Network (CNN) for image classification using deep learning. MiniVGGNet is the CNN architecture used in this paper to train a network for image classification, and CIFAR-10 is the dataset selected for this purpose. The performance of the network was improved through hyperparameter tuning, using batch normalization and a learning rate decay factor. This paper compares the performance of the trained network when a batch normalization layer is added and when the value of the learning rate decay factor is adjusted in the network architecture. Based on the experimental results, adding a batch normalization layer improves classification accuracy from 80% to 82%. Applying a learning rate decay factor improves classification accuracy to 83% and reduces the effects of overfitting visible in the learning plot. The performance analysis shows that hyperparameter tuning can improve the performance of the network and increase the model's ability to generalize.
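The two tuning techniques named in the abstract can be illustrated outside any framework. The sketch below, in plain Python, shows the standard batch-normalization forward pass (normalize a mini-batch to zero mean and unit variance, then scale and shift with learnable parameters gamma and beta) and a Keras-style time-based learning rate decay schedule, lr = initial_lr / (1 + decay * step). All parameter values here are illustrative assumptions, not the settings used in the paper.

```python
def batch_norm(xs, gamma=1.0, beta=0.0, eps=1e-5):
    """Batch-normalization forward pass for one mini-batch of activations:
    normalize to zero mean / unit variance, then scale by gamma and shift
    by beta (both learnable in a real network; fixed here for illustration)."""
    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    return [gamma * (x - mean) / (var + eps) ** 0.5 + beta for x in xs]


def decayed_lr(initial_lr, decay, step):
    """Time-based learning rate decay: lr = initial_lr / (1 + decay * step)."""
    return initial_lr / (1.0 + decay * step)


if __name__ == "__main__":
    # Normalizing a toy batch centers it around zero with unit spread.
    print(batch_norm([2.0, 4.0, 6.0, 8.0]))

    # The learning rate shrinks smoothly as training progresses.
    initial_lr = 0.01           # illustrative value
    decay = initial_lr / 40     # common heuristic: initial_lr / num_epochs
    for epoch in (0, 10, 40):
        print(f"epoch {epoch:2d}: lr = {decayed_lr(initial_lr, decay, epoch):.5f}")
```

Shrinking the learning rate over time lets the optimizer take large steps early and smaller, more careful steps near convergence, which is consistent with the reduced overfitting the paper reports.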

Published
05-09-2019
How to Cite
Ismail, A., Ahmad, S. A., Che Soh, A., Hassan, K., & Harith, H. H. (2019). Improving Convolutional Neural Network (CNN) architecture (miniVGGNet) with Batch Normalization and Learning Rate Decay Factor for Image Classification. International Journal of Integrated Engineering, 11(4). Retrieved from https://publisher.uthm.edu.my/ojs/index.php/ijie/article/view/4558