Training size optimization with reduced complexity in cell-free massive MIMO system

Document Type

Article

Publication Date

1-1-2019

Abstract

A training sequence is used in multiple-antenna systems to estimate channel state information and mitigate channel distortion between the transmitter and receiver. However, the training sequence, or pilot, must be limited to a certain size to reduce the overhead loss caused by the limited channel coherence length of mobile users. In this paper, we propose training size optimization for the cell-free massive MIMO system. In addition, we propose and compare the performance of different training size optimization algorithms, namely exhaustive search optimization, bisection optimization, and min–max optimization, each with a different level of computational complexity. The results show that, in general, all three training length optimization methods improve the downlink rate compared to the conventional pilot length method. We also show that the training optimization methods are more effective when the coherence length is small or the number of users is very large. In those cases, exhaustive search has the best median downlink rate, followed closely by min–max optimization and finally the bisection method. Although exhaustive search optimization achieves the best downlink rate, we show that the proposed reduced-complexity optimization methods require significantly less computation. Moreover, the median downlink rate of the min–max optimization method is only slightly below that of the exhaustive search method across various numbers of users and coherence lengths. © 2018, Springer Science+Business Media, LLC, part of Springer Nature.
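As a rough illustration of the trade-off the abstract describes, the sketch below picks an integer pilot length tau that maximizes a generic net downlink rate of the form (1 - tau/T) * log2(1 + snr_eff(tau)), comparing exhaustive search with a bisection-style search. This is a minimal sketch, not the paper's actual formulation: the coherence length T, the nominal SNR, and the snr_eff estimation-quality model are all assumptions introduced here for illustration.

import math

T = 200      # coherence length in symbols (assumed value)
SNR = 10.0   # nominal SNR (assumed value)

def snr_eff(tau):
    # Assumed estimation-quality model: effective SNR improves with more
    # training symbols and saturates toward the nominal SNR.
    return SNR * tau / (tau + 1.0)

def rate(tau):
    # Net rate after paying the training overhead tau / T.
    return (1.0 - tau / T) * math.log2(1.0 + snr_eff(tau))

# Exhaustive search: evaluate every feasible integer pilot length.
tau_exh = max(range(1, T), key=rate)

# Bisection-style search: exploits the (approximately) unimodal shape of
# rate(tau) by bisecting on the sign of its discrete slope.
lo, hi = 1, T - 1
while lo < hi:
    mid = (lo + hi) // 2
    if rate(mid + 1) > rate(mid):
        lo = mid + 1   # rate still increasing: optimum lies to the right
    else:
        hi = mid       # rate decreasing: optimum is at mid or to the left
tau_bis = lo

print(tau_exh, rate(tau_exh), tau_bis, rate(tau_bis))

Under this toy model both searches return the same pilot length, but the bisection variant needs only O(log T) rate evaluations versus O(T) for exhaustive search, which mirrors the complexity gap the abstract reports.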

Keywords

Channel training, Massive MIMO, Multiple-input multiple-output

Divisions

fac_eng

Funders

University Malaya Research Fund Assistance (BKP) (Grant No. BK051-2016)

Publication Title

Wireless Networks

Volume

25

Issue

4

Publisher

Springer Verlag
