Posted: Jul 03, 2024
This paper presents a novel neural architecture search (NAS) framework that optimizes convolutional neural networks for both accuracy and computational efficiency. Traditional NAS methods often prioritize accuracy at the expense of computational cost, making them impractical for resource-constrained environments. Our approach employs a multi-objective evolutionary algorithm that simultaneously optimizes network performance and computational complexity. We introduce a hierarchical search space that enables efficient exploration of architectural variations while maintaining structural coherence. Experimental results on the CIFAR-10 and ImageNet datasets demonstrate that our method achieves competitive accuracy with significantly reduced computational requirements compared to hand-designed architectures and existing NAS approaches. The proposed framework reduces floating-point operations by 35-45% while staying within 1-2% of the accuracy of state-of-the-art models, making it particularly suitable for deployment on edge devices and mobile platforms.
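The abstract describes a multi-objective evolutionary search that trades off accuracy against computational cost without giving implementation details. The sketch below shows the generic Pareto-dominance machinery such a search typically builds on; the architecture encoding (depth, width), the toy surrogate objectives, and all function names are illustrative assumptions, not the paper's actual method.

```python
import random


def dominates(a, b):
    """True if candidate a is Pareto-better than b: no worse on both
    objectives (maximize accuracy, minimize FLOPs) and strictly better
    on at least one."""
    return (a["acc"] >= b["acc"] and a["flops"] <= b["flops"]
            and (a["acc"] > b["acc"] or a["flops"] < b["flops"]))


def pareto_front(population):
    """Candidates not dominated by any other member of the population."""
    return [p for p in population
            if not any(dominates(q, p) for q in population if q is not p)]


def evaluate(depth, width):
    """Toy surrogate objectives standing in for trained accuracy and
    measured FLOPs (a real search would train or estimate both)."""
    return {"depth": depth, "width": width,
            "acc": 1.0 - 1.0 / (depth * width),   # more capacity -> higher proxy accuracy
            "flops": depth * width * width}       # more capacity -> more compute


def evolve(generations=20, pop_size=12, seed=0):
    """Minimal evolutionary loop: mutate a Pareto-optimal parent,
    then keep the front plus the best of the rest."""
    rng = random.Random(seed)
    pop = [evaluate(rng.randint(1, 8), rng.randint(4, 64))
           for _ in range(pop_size)]
    for _ in range(generations):
        parent = rng.choice(pareto_front(pop))
        child = evaluate(max(1, parent["depth"] + rng.choice([-1, 0, 1])),
                         max(4, parent["width"] + rng.choice([-8, 0, 8])))
        pop.append(child)
        front = pareto_front(pop)
        rest = [p for p in pop if p not in front]
        pop = (front + rest)[:pop_size]
    return pareto_front(pop)
```

Returning the whole Pareto front, rather than a single winner, is what lets a deployment pick the accuracy/FLOPs trade-off appropriate for a given edge device.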