Investigating the Impact of ReLU and Sigmoid Activation Functions on Animal Classification Using CNN Models
Dublin Core
Title
Investigating the Impact of ReLU and Sigmoid Activation Functions on Animal Classification Using CNN Models
Subject
convolutional neural network; activation function; sigmoid; relu; classification; images
Description
VGG16 is a convolutional neural network model used for image recognition. It is unique in that it has only 16 weighted layers, rather than relying on a large number of hyperparameters, and it is considered one of the best vision model architectures. However, several aspects need to be improved to increase image recognition accuracy. In this context, this work proposes and investigates two ensemble CNNs using transfer learning and compares them with state-of-the-art CNN architectures. This study compares the performance of the rectified linear unit (ReLU) and sigmoid activation functions in CNN models for animal classification. To choose which model to use, we tested two state-of-the-art CNN architectures: the default VGG16 and the VGG16 with the proposed method. A dataset consisting of 2,000 images of five different animals was used. The results show that ReLU achieves higher classification accuracy than sigmoid. The model with ReLU in the fully connected and convolutional layers achieved the highest precision of 97.56% on the test dataset. The research aims to find better activation functions and identify factors that influence model performance. The dataset consists of animal images collected from Kaggle, including cats, cows, elephants, horses, and sheep, and is divided into training and test sets (ratio 80:20). The CNN model has two convolutional layers and two fully connected layers. ReLU and sigmoid activation functions with different learning rates are used. Evaluation metrics include accuracy, precision, recall, F1 score, and test cost. ReLU outperforms sigmoid in accuracy, precision, recall, and F1 score. This study emphasizes the importance of choosing the right activation function for better classification accuracy. ReLU is identified as effective in addressing the vanishing gradient problem. These findings can guide future research to improve CNN models for animal classification.
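For illustration, the architecture described above (two convolutional layers followed by two fully connected layers, with ReLU or sigmoid as the activation under comparison) could be sketched in Keras as follows. This is a minimal sketch, not the authors' code: the filter counts, kernel sizes, input resolution, optimizer, and learning rate are assumptions for the example, since the abstract does not specify them.

# Minimal sketch of the kind of CNN described in the abstract:
# two convolutional layers and two fully connected layers, with the
# activation function (ReLU or sigmoid) as the variable under study.
# Filter counts, kernel sizes, input size, optimizer, and learning rate
# are illustrative assumptions, not values taken from the paper.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_cnn(activation="relu", num_classes=5, input_shape=(128, 128, 3)):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, (3, 3), activation=activation),   # convolutional layer 1
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation=activation),   # convolutional layer 2
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(128, activation=activation),           # fully connected layer 1
        layers.Dense(num_classes, activation="softmax"),    # fully connected output layer
    ])
    model.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),  # learning rate varied in the study
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model

# The study compares the same architecture with ReLU vs. sigmoid activations:
relu_model = build_cnn(activation="relu")
sigmoid_model = build_cnn(activation="sigmoid")

In such a comparison, both models would be trained on the 80% training split and evaluated on the 20% test split using accuracy, precision, recall, and F1 score, as the abstract describes.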
Creator
Mesran; Sitti Rachmawati Yahya; Fifto Nugroho; Agus Perdana Windarto
Source
https://jurnal.iaii.or.id/index.php/RESTI/article/view/5367/898
Publisher
Universitas Budi Darma, Medan, Indonesia; Universitas Siber Asia (UNSIA), Jakarta, Indonesia; Universitas Bung Karno, Jakarta, Indonesia; STIKOM Tunas Bangsa, Pematangsiantar, Indonesia
Date
18-02-2024
Contributor
FAJAR BAGUS W
Format
PDF
Language
ENGLISH
Type
TEXT
Citation
Mesran, Sitti Rachmawati Yahya, Fifto Nugroho, Agus Perdana Windarto, “Investigating the Impact of ReLU and Sigmoid Activation Functions on Animal Classification Using CNN Models,” Repository Horizon University Indonesia, accessed January 12, 2026, https://repository.horizon.ac.id/items/show/10227.