• Publication of the China Computer Federation (CCF)
  • Chinese Science and Technology Core Journal
  • Chinese Core Journal

Computer Engineering & Science ›› 2023, Vol. 45 ›› Issue (02): 191-203.

• High Performance Computing •

Review of one-shot neural architecture search

DONG Pei-jie,NIU Xin,WEI Zi-mian,CHEN Xue-hui   

  1. College of Computer Science and Technology, National University of Defense Technology, Changsha 410073, China
  • Received: 2021-11-29  Revised: 2022-05-26  Accepted: 2023-02-25  Online: 2023-02-25  Published: 2023-02-15

Abstract: The rapid development of deep learning is closely tied to innovation in neural network architectures. To improve the efficiency of network architecture design, neural architecture search (NAS), an automated approach to designing network architectures, has become a research hotspot in recent years. Early NAS algorithms typically had to train and evaluate a large number of sampled candidate networks during the iterative search, incurring enormous computational overhead. Transfer learning can accelerate the convergence of candidate networks and thereby improve search efficiency. One-shot NAS, which builds on weight-transfer techniques, constructs a supernet and shares weights among its subnets; this greatly improves search efficiency, but it also faces challenging problems such as weight co-adaptation and ranking disorder. This paper first introduces research related to weight-sharing one-shot NAS, then analyzes its key techniques from three aspects: sampling strategy, process decoupling, and search phase. It also compares and analyzes the search performance of representative one-shot NAS algorithms, and offers an outlook on future research directions.
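The weight-sharing mechanism described above can be illustrated with a minimal sketch (an assumption for illustration only, not an implementation from the surveyed paper): a supernet keeps one shared weight entry per (layer, candidate-operation) pair, and every sampled subnet inherits its weights from that shared table rather than training its own copy. The class and names below (`ToySupernet`, `sample_subnet`, `subnet_weights`) are hypothetical.

```python
import random

class ToySupernet:
    """Toy weight-sharing supernet: one shared weight per (layer, op)."""

    def __init__(self, num_layers=4, candidate_ops=("conv3x3", "conv5x5", "skip")):
        self.num_layers = num_layers
        self.candidate_ops = candidate_ops
        # Shared weight table: all subnets read from this single dict.
        self.weights = {
            (layer, op): random.random()
            for layer in range(num_layers)
            for op in candidate_ops
        }

    def sample_subnet(self):
        """Single-path sampling: choose one candidate op per layer."""
        return [random.choice(self.candidate_ops) for _ in range(self.num_layers)]

    def subnet_weights(self, subnet):
        """A sampled subnet inherits weights from the shared table."""
        return [self.weights[(layer, op)] for layer, op in enumerate(subnet)]

random.seed(0)
net = ToySupernet()
a, b = net.sample_subnet(), net.sample_subnet()
# Wherever two subnets pick the same op at the same layer, they reuse
# the exact same weight entry -- the source of one-shot NAS efficiency,
# and also of the co-adaptation and ranking-disorder problems.
for layer, (op_a, op_b) in enumerate(zip(a, b)):
    if op_a == op_b:
        assert net.subnet_weights(a)[layer] == net.subnet_weights(b)[layer]
```

In a real one-shot NAS system the shared entries would be full operation parameter tensors and subnets would be trained jointly through the supernet, but the sharing structure is the same: evaluation of a new candidate requires no training from scratch.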


Key words: neural architecture search (NAS); one-shot NAS; weight-sharing; transfer learning; deep learning