Computer Engineering & Science, 2024, Vol. 46, Issue (08): 1361-1371.
• High Performance Computing •
Block-grained domain adaptation for neural networks at edge
XIN Gao-feng, LIU Yu-xiao, ZHANG Qing-long, HAN Rui, LIU Chi
Abstract: Running deep neural networks (DNNs) on edge devices faces two challenges: model scaling and domain adaptation. Existing model scaling techniques and unsupervised online domain adaptation techniques suffer from coarse scaling granularity, limited scaling spaces, and long online adaptation times. To address these challenges, this paper proposes EdgeScaler, a block-grained model scaling and domain adaptation training method consisting of an offline phase and an online phase. For the model scaling challenge, the offline phase detects and extracts blocks from various DNNs and converts them into multiple derived blocks; the online phase then combines these blocks and the connections between them to provide a large-scale scaling space. For the domain adaptation challenge, a block-specific residual adapter is designed and inserted into the blocks during the offline phase; in the online phase, when a new target domain arrives, all adapters are trained so that every option in the block-grained scaling space is adapted to the new domain. Tests on a real edge device, the Jetson TX2, show that EdgeScaler reduces domain adaptation training time by an average of 85.14% and training energy consumption by an average of 84.1%, while providing a large space of scaling options.
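The abstract does not specify the adapter's internal design, so the following is a minimal PyTorch sketch of the general idea only: a small bottleneck adapter added residually after each extracted block, with the original block's weights frozen so that only the adapter parameters are trained when a new target domain arrives. The names ResidualAdapter and AdaptedBlock, and the bottleneck structure, are illustrative assumptions, not the paper's implementation.

import torch
import torch.nn as nn

class ResidualAdapter(nn.Module):
    # Hypothetical block-specific residual adapter: a 1x1 bottleneck
    # whose output is added back to the block's output.
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.down = nn.Conv2d(channels, channels // reduction, kernel_size=1)
        self.relu = nn.ReLU(inplace=True)
        self.up = nn.Conv2d(channels // reduction, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(self.relu(self.down(x)))

class AdaptedBlock(nn.Module):
    # Wraps one extracted (derived) block with its adapter. The block is
    # frozen; only the adapter is trained for online domain adaptation.
    # Assumes the block preserves its channel count.
    def __init__(self, block: nn.Module, channels: int):
        super().__init__()
        self.block = block
        self.adapter = ResidualAdapter(channels)
        for p in self.block.parameters():
            p.requires_grad = False

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.adapter(self.block(x))

# One scaling option is one choice of derived block per position, e.g.:
blocks = [AdaptedBlock(nn.Conv2d(16, 16, 3, padding=1), channels=16)
          for _ in range(3)]
model = nn.Sequential(*blocks)
y = model(torch.randn(1, 16, 32, 32))

Because the adapter sits on a residual path and the backbone is frozen, every scaling option assembled from these blocks shares the same small set of trainable parameters per block, which is consistent with the abstract's claim of adapting all options in the scaling space by training the adapters once.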
Key words: deep neural network, edge device, elastic scaling, block, domain adaptation
XIN Gao-feng, LIU Yu-xiao, ZHANG Qing-long, HAN Rui, LIU Chi. Block-grained domain adaptation for neural networks at edge[J]. Computer Engineering & Science, 2024, 46(08): 1361-1371.
URL: http://joces.nudt.edu.cn/EN/Y2024/V46/I08/1361