Posted: Oct 28, 2025
This paper addresses the critical challenge of class imbalance in machine learning classification tasks, which often leads to biased models that favor majority classes. We propose a novel adaptive ensemble framework that integrates hybrid sampling techniques with cost-sensitive learning to improve classification performance on imbalanced datasets. Our methodology combines the Synthetic Minority Oversampling Technique (SMOTE) with edited nearest-neighbor (ENN) undersampling, dynamically adjusting sampling ratios based on dataset characteristics. The framework incorporates cost-sensitive decision trees and gradient boosting with class-weighted loss functions. Experimental evaluation on 15 real-world imbalanced datasets demonstrates that our approach achieves an average improvement of 18.7% in F1-score and 22.3% in G-mean compared to traditional ensemble methods. The proposed method shows particular effectiveness in high-dimensional settings and maintains robust performance across imbalance ratios ranging from 1:10 to 1:100.
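The abstract does not give the framework's implementation details, but its two core ingredients can be illustrated in isolation. The sketch below, using scikit-learn, shows (a) the interpolation idea behind SMOTE-style minority oversampling, written out by hand, and (b) a cost-sensitive decision tree via `class_weight="balanced"`. The adaptive ratio adjustment and ENN cleaning step described in the paper are omitted; this is a minimal illustration, not the authors' method.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import NearestNeighbors
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import f1_score

def smote_oversample(X_min, n_new, k=5, rng=None):
    """Generate n_new synthetic minority samples by interpolating between
    each sample and a random one of its k nearest minority neighbors
    (the core SMOTE idea; simplified, not the paper's exact procedure)."""
    rng = rng or np.random.default_rng(0)
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X_min)
    _, idx = nn.kneighbors(X_min)          # idx[:, 0] is the point itself
    synth = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        j = idx[i, rng.integers(1, k + 1)]  # a random true neighbor
        lam = rng.random()                  # interpolation coefficient in [0, 1)
        synth.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.array(synth)

# Toy problem with roughly a 1:20 class ratio
X, y = make_classification(n_samples=2000, weights=[0.95, 0.05],
                           n_informative=5, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=42)

# Oversample the minority class to roughly balance the training set
X_min = X_tr[y_tr == 1]
n_new = int((y_tr == 0).sum() - (y_tr == 1).sum())
X_bal = np.vstack([X_tr, smote_oversample(X_min, n_new)])
y_bal = np.concatenate([y_tr, np.ones(n_new, dtype=int)])

# Cost-sensitive tree: errors on the (originally rare) class are penalized more
clf = DecisionTreeClassifier(class_weight="balanced", random_state=42)
clf.fit(X_bal, y_bal)
print("minority-class F1:", f1_score(y_te, clf.predict(X_te)))
```

In a full hybrid-sampling pipeline, an ENN cleaning pass would follow the oversampling step to remove synthetic or borderline points whose neighbors disagree with their label, and the oversampling ratio would be tuned per dataset rather than fixed at full balance.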