TY - JOUR
T1 - Lightweight Automatic Modulation Classification Based on Decentralized Learning
AU - Fu, Xue
AU - Gui, Guan
AU - Wang, Yu
AU - Ohtsuki, Tomoaki
AU - Adebisi, Bamidele
AU - Gacanin, Haris
AU - Adachi, Fumiyuki
N1 - Publisher Copyright:
© 2015 IEEE.
PY - 2022/3/1
Y1 - 2022/3/1
N2 - Due to the implementation and performance limitations of the centralized learning automatic modulation classification (CentAMC) method, this paper proposes a decentralized learning AMC (DecentAMC) method using model consolidation and a lightweight design. Specifically, the model consolidation is realized by a central device (CD) for edge device (ED) model averaging (MA) and multiple EDs for ED model training. The lightweight design is realized by a separable convolutional neural network (S-CNN), in which separable convolutional layers replace the standard convolutional layers and most of the fully connected layers are cut off. Simulation results show that the proposed method substantially reduces the storage and computational capacity requirements of the EDs as well as the communication overhead. The training efficiency also shows remarkable improvement. Compared with the convolutional neural network (CNN), the space complexity (i.e., model parameters and output feature map) of S-CNN is decreased by about 94% and its time complexity (i.e., floating point operations) is decreased by about 96%, while the average correct classification probability is degraded by less than 1%. Compared with S-CNN-based CentAMC, without considering model weight uploading and downloading, the training efficiency of the proposed method is about N times that of CentAMC, where N is the number of EDs. When model weight uploading and downloading are considered, the training efficiency of the proposed method remains high (e.g., when the number of EDs is 12, the training efficiency of the proposed AMC method is about 4 times that of S-CNN-based CentAMC on dataset D1 = {2FSK, 4FSK, 8FSK, BPSK, QPSK, 8PSK, 16QAM} and about 5 times that of S-CNN-based CentAMC on dataset D2 = {2FSK, 4FSK, 8FSK, BPSK, QPSK, 8PSK, PAM2, PAM4, PAM8, 16QAM}), while the communication overhead is reduced by more than 35%.
AB - Due to the implementation and performance limitations of the centralized learning automatic modulation classification (CentAMC) method, this paper proposes a decentralized learning AMC (DecentAMC) method using model consolidation and a lightweight design. Specifically, the model consolidation is realized by a central device (CD) for edge device (ED) model averaging (MA) and multiple EDs for ED model training. The lightweight design is realized by a separable convolutional neural network (S-CNN), in which separable convolutional layers replace the standard convolutional layers and most of the fully connected layers are cut off. Simulation results show that the proposed method substantially reduces the storage and computational capacity requirements of the EDs as well as the communication overhead. The training efficiency also shows remarkable improvement. Compared with the convolutional neural network (CNN), the space complexity (i.e., model parameters and output feature map) of S-CNN is decreased by about 94% and its time complexity (i.e., floating point operations) is decreased by about 96%, while the average correct classification probability is degraded by less than 1%. Compared with S-CNN-based CentAMC, without considering model weight uploading and downloading, the training efficiency of the proposed method is about N times that of CentAMC, where N is the number of EDs. When model weight uploading and downloading are considered, the training efficiency of the proposed method remains high (e.g., when the number of EDs is 12, the training efficiency of the proposed AMC method is about 4 times that of S-CNN-based CentAMC on dataset D1 = {2FSK, 4FSK, 8FSK, BPSK, QPSK, 8PSK, 16QAM} and about 5 times that of S-CNN-based CentAMC on dataset D2 = {2FSK, 4FSK, 8FSK, BPSK, QPSK, 8PSK, PAM2, PAM4, PAM8, 16QAM}), while the communication overhead is reduced by more than 35%.
KW - Automatic modulation classification (AMC)
KW - centralized learning
KW - convolutional neural network (CNN)
KW - decentralized learning
UR - http://www.scopus.com/inward/record.url?scp=85112168947&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85112168947&partnerID=8YFLogxK
U2 - 10.1109/TCCN.2021.3089178
DO - 10.1109/TCCN.2021.3089178
M3 - Article
AN - SCOPUS:85112168947
VL - 8
SP - 57
EP - 70
JO - IEEE Transactions on Cognitive Communications and Networking
JF - IEEE Transactions on Cognitive Communications and Networking
SN - 2332-7731
IS - 1
ER -