Computer Engineering & Science ›› 2023, Vol. 45 ›› Issue (02): 191-203.
• High Performance Computing •
Review of one-shot neural architecture search
DONG Pei-jie, NIU Xin, WEI Zi-mian, CHEN Xue-hui
Abstract: The rapid development of deep learning is closely tied to innovations in neural network architecture. To improve the efficiency of network architecture design, Neural Architecture Search (NAS), an automated approach to architecture design, has become a research hotspot in recent years. Early NAS algorithms had to train and evaluate a large number of sampled candidate networks during iterative search, incurring enormous computational overhead. Transfer learning can accelerate the convergence of candidate networks and thus improve the efficiency of neural architecture search. One-shot NAS, built on weight-transfer techniques, constructs a supernet and shares weights among its subnets, which greatly improves search efficiency but also raises challenging problems such as weight co-adaptation and ranking disorder. This paper first introduces research related to weight-sharing one-shot NAS, then analyzes the key techniques from three aspects: sampling strategy, process decoupling, and search phases. It compares and analyzes the search performance of representative one-shot NAS algorithms and offers an outlook on future research directions.
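The weight-sharing mechanism described in the abstract can be illustrated with a minimal sketch: a toy supernet keeps one shared weight table, each sampled subnet picks one candidate operation per layer, and updating any subnet's weights updates the shared table. All class names, operations, and values below are illustrative assumptions, not details from the paper.

```python
import random

class Supernet:
    """Toy one-shot supernet: every candidate op's weights live in one
    shared table, so every sampled subnet reuses (shares) them."""

    def __init__(self, num_layers, candidate_ops):
        self.num_layers = num_layers
        self.candidate_ops = candidate_ops
        # Shared weights: one scalar "weight" per (layer, op) pair.
        self.weights = {(l, op): 1.0
                        for l in range(num_layers)
                        for op in candidate_ops}

    def sample_subnet(self, rng):
        # Single-path uniform sampling: choose one op per layer.
        return [rng.choice(self.candidate_ops) for _ in range(self.num_layers)]

    def forward(self, x, subnet):
        # Each layer applies only its chosen op, using the shared weight.
        for l, op in enumerate(subnet):
            w = self.weights[(l, op)]
            x = w * (x + 1.0) if op == "conv" else w * x  # toy ops
        return x

    def train_step(self, subnet, lr=0.1):
        # Updating one subnet mutates the shared table, so any other
        # subnet using the same (layer, op) pair is affected too --
        # the source of the co-adaptation problem the abstract mentions.
        for l, op in enumerate(subnet):
            self.weights[(l, op)] += lr

rng = random.Random(0)
net = Supernet(num_layers=3, candidate_ops=["conv", "skip"])
a = net.sample_subnet(rng)
b = net.sample_subnet(rng)
net.train_step(a)
# (layer, op) choices common to both subnets now carry a's update.
shared = [(l, op) for l, op in enumerate(a) if b[l] == op]
```

Because evaluation reuses the shared weights instead of retraining each candidate from scratch, ranking candidates becomes cheap, but the rankings can become unreliable (the "ranking disorder" problem) precisely because subnets interfere with one another through the shared table.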
Key words: neural architecture search (NAS); one-shot NAS; weight-sharing; transfer learning; deep learning
DONG Pei-jie, NIU Xin, WEI Zi-mian, CHEN Xue-hui. Review of one-shot neural architecture search[J]. Computer Engineering & Science, 2023, 45(02): 191-203.
URL: http://joces.nudt.edu.cn/EN/Y2023/V45/I02/191