Institutional Repository
Technical University of Crete

On the training of DS-CDMA neural-network receivers

Matyjas, J. D., Karystinos, G. N., Batalama, S. N.

Full Record


URI: http://purl.tuc.gr/dl/dias/57EB27DF-1E42-4CCF-8B01-0F02F8701AB1
Year: 2002
Type: Conference Full Paper
Bibliographic Citation: J. D. Matyjas, G. N. Karystinos, and S. N. Batalama, "On the training of DS-CDMA neural-network receivers," in Proc. IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP '02), 2002, pp. 1017-1020, doi: 10.1109/ICASSP.2002.5743967.

Abstract

In this paper we prove formally that the optimum (nonlinear) DS-CDMA single-user decision boundary exhibits the following properties: (i) it is symmetric with respect to the origin and (ii) as it is traversed away from the origin, it converges to a hyperplane parallel to the matched-filter (MF) decision boundary. Then, we translate properties (i) and (ii) to a set of constraints that can be used by any optimization algorithm for the selection (training) of the parameters of a general multi-layer-perceptron neural-network receiver. Using these constraints, the number of parameters to be optimized is reduced by nearly 50% for large-size networks, which effectively doubles the speed of any training procedure. Furthermore, we utilize properties (i) and (ii) to develop a new initialization scheme that provides additional improvements in the convergence rate and can be used by any recursive optimization algorithm. As a representative case study we consider the back-propagation (BP) algorithm and develop a constrained version of it that incorporates both the proposed constraints and the proposed initialization. The convergence rate enhancement achieved by constrained BP is illustrated by simulations.
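
Property (i) can be made concrete with a small sketch in Python (NumPy only): a multi-layer perceptron whose activations are odd (tanh) and whose bias terms are fixed to zero computes an odd function of the received vector, so f(-r) = -f(r) and the zero-level decision boundary is automatically symmetric with respect to the origin. This is only an illustration of the constraint idea under that assumption; the class, layer sizes, and test vector below are hypothetical and do not reproduce the paper's actual constraint set, initialization scheme, or constrained-BP algorithm.

import numpy as np

rng = np.random.default_rng(0)

class OddMLP:
    """Perceptron with tanh activations and all bias terms fixed to zero.

    tanh is an odd function and every layer applies a purely linear map
    before it, so the composed output satisfies f(-r) = -f(r): the
    zero-level decision boundary is symmetric with respect to the origin
    (property (i) in the abstract).
    """

    def __init__(self, dims=(8, 16, 8, 1)):      # hypothetical layer sizes
        # Only weight matrices are trainable; no bias vectors exist at all.
        self.W = [rng.standard_normal((d_out, d_in)) * 0.1
                  for d_in, d_out in zip(dims[:-1], dims[1:])]

    def forward(self, r):
        h = np.asarray(r, dtype=float)
        for W in self.W[:-1]:
            h = np.tanh(W @ h)                   # odd activation, no bias
        return (self.W[-1] @ h).item()           # scalar soft decision

    def detect(self, r):
        # Sign detector for the single user's antipodal (+1/-1) symbol.
        return 1 if self.forward(r) >= 0.0 else -1

net = OddMLP()
r = rng.standard_normal(8)                       # stand-in for a received chip vector
print(net.forward(r), net.forward(-r))           # exact negatives, up to round-off
print(net.detect(r), net.detect(-r))             # opposite hard decisions

Note that dropping the biases alone removes only a small fraction of the trainable parameters; the roughly 50% reduction reported in the abstract comes from the paper's full constraint set, which this sketch does not attempt to reproduce.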
