TY - JOUR
T1 - Capacity scaling in MIMO systems with general unitarily invariant random matrices
AU - Cakmak, Burak
AU - Müller, Ralf R.
AU - Fleury, Bernard H.
PY - 2018/5/1
Y1 - 2018/5/1
N2 - We investigate the capacity scaling of multiple-input-multiple-output systems with the system dimensions. To that end, we quantify how the mutual information varies when the number of antennas (at either the receiver or transmitter side) is altered. For a system comprising R receive and T transmit antennas with R > T, we find the following: by removing as many receive antennas as needed to obtain a square system (provided the channel matrices before and after the removal have full rank), the maximum resulting loss of mutual information over all signal-to-noise ratios (SNRs) depends only on R, T, and the matrix of left-singular vectors of the initial channel matrix, but not on its singular values. In particular, if the latter matrix is Haar distributed, the ergodic rate loss is given by \sum_{t=1}^{T}\sum_{r=T+1}^{R}\frac{1}{r-t} nats. Under the same assumption, if T,R\to\infty with the ratio \phi\triangleq T/R fixed, the rate loss normalized by R converges almost surely to H(\phi) bits, with H(\cdot) denoting the binary entropy function. We also quantify and study how the mutual information as a function of the system dimensions deviates from the traditionally assumed linear growth in the minimum of the system dimensions at high SNR.
AB - We investigate the capacity scaling of multiple-input-multiple-output systems with the system dimensions. To that end, we quantify how the mutual information varies when the number of antennas (at either the receiver or transmitter side) is altered. For a system comprising R receive and T transmit antennas with R > T, we find the following: by removing as many receive antennas as needed to obtain a square system (provided the channel matrices before and after the removal have full rank), the maximum resulting loss of mutual information over all signal-to-noise ratios (SNRs) depends only on R, T, and the matrix of left-singular vectors of the initial channel matrix, but not on its singular values. In particular, if the latter matrix is Haar distributed, the ergodic rate loss is given by \sum_{t=1}^{T}\sum_{r=T+1}^{R}\frac{1}{r-t} nats. Under the same assumption, if T,R\to\infty with the ratio \phi\triangleq T/R fixed, the rate loss normalized by R converges almost surely to H(\phi) bits, with H(\cdot) denoting the binary entropy function. We also quantify and study how the mutual information as a function of the system dimensions deviates from the traditionally assumed linear growth in the minimum of the system dimensions at high SNR.
KW - Binary entropy function
KW - Haar random matrix
KW - High SNR
KW - Multiple-input-multiple-output
KW - Multiplexing gain
KW - Mutual information
KW - S-Transform
KW - Unitary invariance
UR - http://www.scopus.com/inward/record.url?scp=85043368197&partnerID=8YFLogxK
U2 - 10.1109/TIT.2018.2812885
DO - 10.1109/TIT.2018.2812885
M3 - Journal article
AN - SCOPUS:85043368197
SN - 0018-9448
VL - 64
SP - 3825
EP - 3841
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
IS - 5
ER -