BoW-based neural networks vs. cutting-edge models for single-label text classification

Document Type

Article

Publication Date

9-1-2023

Abstract

To classify complex "big" datasets reliably and accurately, machine learning models must be continually improved. Although graph neural networks (GNNs) have reignited interest in graph-based text classification, this research proposes straightforward yet competitive neural networks for the task: convolutional neural networks (CNN), artificial neural networks (ANN), and their fine-tuned variants (denoted FT-CNN and FT-ANN). The experiments demonstrate that these simple models (CNN, ANN, FT-CNN, and FT-ANN) can outperform more complex GNN-based models such as SGC, SSGC, and TextGCN, and are comparable to others such as HyperGAT and BERT. Fine-tuning is also highly recommended, as it improves both the performance and the reliability of the models. The proposed models are evaluated on five benchmark datasets: R8 (a Reuters subset), R52, 20NewsGroup, Ohsumed, and MR. According to the experimental findings, on the majority of the target datasets these models, especially the fine-tuned ones, perform surprisingly better than state-of-the-art (SOTA) approaches, including GNN-based models.
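To illustrate what a bag-of-words (BoW) neural classifier of the kind described above looks like, the sketch below builds one with scikit-learn. It is not the paper's implementation: the toy corpus, hidden-layer size, and other hyperparameters are placeholders chosen only to make the example self-contained and runnable.

# Minimal sketch of a BoW-based neural text classifier (illustrative only;
# not the paper's code). Assumes scikit-learn; all hyperparameters are guesses.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

# Toy corpus standing in for a benchmark such as R8; real experiments would
# load the actual dataset and use a proper train/test split.
train_docs = [
    "grain shipments rise as wheat exports grow",
    "oil prices fall on weak crude demand",
    "wheat harvest beats grain forecasts",
    "crude oil output cut lifts prices",
]
train_labels = ["grain", "oil", "grain", "oil"]

# Bag-of-words term counts feeding a small feed-forward (ANN-style) network.
model = make_pipeline(
    CountVectorizer(),                           # document -> term-count vector
    MLPClassifier(hidden_layer_sizes=(64,),      # one hidden layer; size is illustrative
                  max_iter=500, random_state=0),
)
model.fit(train_docs, train_labels)

print(model.predict(["grain exports increase"]))  # expected: ['grain']

In this setup, "fine-tuning" as recommended in the abstract would correspond to searching over such hyperparameters (hidden-layer sizes, iteration budget, vectorizer settings) rather than accepting the defaults used here.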

Keywords

Data mining, Text classification, Neural networks, Machine learning

Divisions

infosystem

Funders

Zayed University

Publication Title

Neural Computing and Applications

Volume

35

Issue

27

Publisher

Springer Nature

Publisher Location

236 Grays Inn Rd, 6th Floor, London WC1X 8HL, England
