Neural Architecture Search for Efficient Convolutional Networks: A Multi-Objective Optimization Approach

Posted: Oct 28, 2025

Abstract

This paper presents a novel neural architecture search (NAS) framework that optimizes convolutional neural networks for both accuracy and computational efficiency. Traditional NAS methods often prioritize accuracy at the expense of computational efficiency, making them impractical for resource-constrained environments. Our approach employs a multi-objective optimization strategy that simultaneously maximizes classification accuracy while minimizing computational complexity. We introduce a hierarchical search space that incorporates depthwise separable convolutions, bottleneck structures, and attention mechanisms. Experimental results on CIFAR-10 and ImageNet demonstrate that our method discovers architectures whose accuracy is competitive with state-of-the-art models while requiring 3.2× fewer floating-point operations and 2.8× less memory. The proposed framework provides a systematic approach to designing efficient deep learning models suitable for deployment on edge devices and mobile platforms.
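The abstract describes selecting architectures by trading off accuracy against FLOPs and memory, but does not specify how candidates are compared. The sketch below is one common way to realize such a multi-objective comparison, via Pareto dominance over the three objectives named in the abstract; the Candidate fields, units, and the dominance-based filtering are illustrative assumptions, not the authors' published method.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class Candidate:
    """A searched architecture with its measured objectives (illustrative)."""
    name: str
    accuracy: float  # validation accuracy in [0, 1] (higher is better)
    mflops: float    # floating-point operations, in millions (lower is better)
    mem_mb: float    # peak memory usage, in MB (lower is better)


def dominates(a: Candidate, b: Candidate) -> bool:
    """True if `a` Pareto-dominates `b`: no worse on every objective
    and strictly better on at least one."""
    no_worse = (a.accuracy >= b.accuracy
                and a.mflops <= b.mflops
                and a.mem_mb <= b.mem_mb)
    strictly_better = (a.accuracy > b.accuracy
                       or a.mflops < b.mflops
                       or a.mem_mb < b.mem_mb)
    return no_worse and strictly_better


def pareto_front(population: List[Candidate]) -> List[Candidate]:
    """Keep only candidates that no other candidate dominates."""
    return [c for c in population
            if not any(dominates(o, c) for o in population if o is not c)]


if __name__ == "__main__":
    pop = [
        Candidate("A", accuracy=0.94, mflops=600, mem_mb=48),
        Candidate("B", accuracy=0.93, mflops=190, mem_mb=17),
        Candidate("C", accuracy=0.92, mflops=210, mem_mb=20),  # dominated by B
    ]
    for c in pareto_front(pop):
        print(c.name, c.accuracy, c.mflops, c.mem_mb)
```

In this toy example, candidate C is removed because B is at least as accurate while using fewer operations and less memory; the surviving Pareto front represents the kind of accuracy-efficiency trade-off curve the paper targets.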
