An Adaptive Sigmoidal Activation Function for Training Feed Forward Neural Network Equalizer

Citation:

Zerdoumi Z, Benmeddour F, Abdou L, Benatia D. An Adaptive Sigmoidal Activation Function for Training Feed Forward Neural Network Equalizer. The Eurasia Proceedings of Science Technology Engineering and Mathematics [Internet]. 2021;14:1-7.

Date Published:

2021

Abstract:

Feedforward neural networks (FFNN) have attracted great attention in the digital communication area. In particular, they are investigated as nonlinear equalizers at the receiver to mitigate channel distortions and additive noise. The major drawback of FFNN is their extensive training. We present a new approach that enhances training efficiency by adapting the activation function. The adaptation procedure substantially increases the flexibility and nonlinear approximation capability of the FFNN. Consequently, the learning process performs better, offers more flexibility, and enhances the nonlinear capability of the NN structure, keeping the final state away from undesired saturation regions. The effectiveness of the proposed method is demonstrated on several challenging channel models; it performs well even for severe nonlinear channels that are hard to equalize. Performance is measured through convergence properties and the minimum bit error rate achieved. The proposed algorithm was found to converge rapidly and to reach the minimum steady-state value. All simulations show that the proposed method significantly improves the training efficiency of the FFNN-based equalizer compared with standard training.
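The abstract does not give the exact adaptation rule, but a common way to make a sigmoidal activation adaptive is to attach a trainable slope (gain) parameter and update it by gradient descent alongside the weights. The sketch below is a minimal single-neuron illustration of that general idea, not the paper's specific algorithm; the function names and the slope parameter `a` are assumptions for illustration only.

```python
import numpy as np

# Hypothetical adaptive sigmoid: sigma(x) = 1 / (1 + exp(-a * x)),
# where the slope 'a' is trained together with the network weights.
def adaptive_sigmoid(x, a):
    return 1.0 / (1.0 + np.exp(-a * x))

def dsig_dx(x, a):
    s = adaptive_sigmoid(x, a)
    return a * s * (1.0 - s)      # derivative w.r.t. the input x

def dsig_da(x, a):
    s = adaptive_sigmoid(x, a)
    return x * s * (1.0 - s)      # derivative w.r.t. the slope a

# Toy example: one neuron fitting a single target by gradient descent,
# updating both the weight w and the activation slope a.
rng = np.random.default_rng(0)
x, y = 1.5, 0.8                   # illustrative input/target pair
w, a, lr = rng.normal(), 1.0, 0.1

err0 = adaptive_sigmoid(w * x, a) - y   # initial error, for comparison
for _ in range(2000):
    z = w * x
    err = adaptive_sigmoid(z, a) - y
    w -= lr * err * dsig_dx(z, a) * x   # usual weight update
    a -= lr * err * dsig_da(z, a)       # the slope adapts too
err_final = adaptive_sigmoid(w * x, a) - y
```

Letting the slope adapt changes the effective steepness of the activation during training, which is one mechanism for keeping neurons out of the flat saturation regions the abstract mentions.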

Publisher's Version

Last updated on 09/18/2023