Research on Channel Estimation Techniques for LTE Uplink MU-MIMO
MU-MIMO Channel Estimation for LTE Uplink
【Author】 Yang Lei;
【Author Information】 Shanghai Jiao Tong University, Electronics and Communication Engineering (Professional Degree), 2015, Master's
【摘要】 Long Term Evolution (LTE), built on orthogonal frequency division multiplexing (OFDM) and multiple-input multiple-output (MIMO), is regarded as the mainstream technology in the evolution from 3G to 4G. The LTE uplink adopts the single-carrier frequency division multiple access (SC-FDMA) scheme and supports multi-user MIMO (MU-MIMO), so channel estimation under MU-MIMO is particularly important. Starting from channel estimation in the SISO case, this thesis studies channel estimation for the LTE uplink MU-MIMO scenario. First, an LTE uplink physical-layer simulation platform is built in MATLAB. The platform consists of transmitter, receiver, and channel modules, and supports MU-MIMO operation and multiple channel models. Subsequent work uses this platform for BLER performance simulations to verify the computational complexity and estimation accuracy of the channel estimation techniques. Second, existing reference-signal designs and channel estimation techniques for the SISO case are analyzed. For the reference signal, the key concepts analyzed are the Zadoff-Chu sequence, cyclic time shift, cyclic shift hopping, and the DMRS insertion spacing. For channel estimation, the common methods analyzed are interpolation estimation, least-squares (LS) estimation, transform-domain estimation, and minimum mean square error (MMSE) estimation. LS estimation often serves as the first step of a joint estimation algorithm; its principle and implementation are simple, but it is strongly affected by noise. Interpolation fits the CFR values of the data REs from the CFR values already estimated at the reference-signal REs, requiring no prior channel information; common interpolation algorithms include linear, Gaussian, and cubic-spline interpolation. Two transform-domain estimation methods are used in LTE, DFT-based and DCT-based; the DFT-based method is more widely used because it can directly reuse the FFT module already present in an OFDM system. MMSE estimation exploits the second-order statistics of the channel and outperforms the methods above, but at the highest computational complexity; its main variants are the MMSE algorithm, the simplified MMSE algorithm, and the SVD algorithm. Building on the SISO algorithms, improvements to the reference-signal design and the channel estimation algorithm for uplink MU-MIMO are proposed. The reference-signal design inherits the existing design with small modifications: a maximum-spacing cyclic-shift combination minimizes the crosstalk between different cyclic shifts during channel estimation, remaining compatible with earlier LTE releases while supporting MU-MIMO. The channel estimation algorithm is a joint estimation built on DFT-based transform-domain estimation; to reduce the MSE increase caused by CIR energy leakage and the frequency-domain Gibbs phenomenon, a dynamic noise threshold and a frequency-domain window function are applied. Using the simulation platform above, the DFT-based transform-domain estimation algorithm with dynamic noise threshold and frequency-domain window function is simulated in the MU-MIMO case. The simulation curves show that the improved algorithm achieves its expected function, and that the dynamic noise threshold and frequency-domain window function effectively improve the accuracy of channel estimation.
【Abstract】 Long Term Evolution (LTE) technology, based on orthogonal frequency division multiplexing (OFDM) and multiple-input multiple-output (MIMO), is regarded as the mainstream technology in the evolution from 3G to 4G. Since the LTE uplink, which uses the single-carrier frequency division multiple access (SC-FDMA) scheme, supports the multi-user MIMO (MU-MIMO) scenario, channel estimation for MU-MIMO is very important. In this thesis, starting from channel estimation in the SISO scenario, channel estimation in the MU-MIMO scenario for the LTE uplink is studied. Firstly, an LTE uplink physical-layer simulation platform based on MATLAB is designed. The platform consists of a transmitter part, a receiver part, and a channel part, and supports the MU-MIMO scenario and multiple channel models. This platform is used to verify the computational complexity and estimation accuracy of the channel estimation techniques through BLER performance simulation. Secondly, reference-signal design and channel estimation techniques for the SISO scenario are analyzed. The analysis of the reference signal covers the Zadoff-Chu sequence, cyclic time shift, cyclic shift hopping, and the DMRS pattern. The study of channel estimation focuses on interpolation estimation, least-squares (LS) estimation, transform-domain estimation, and minimum mean square error (MMSE) estimation. LS estimation is often used as the first step of a joint estimation algorithm, as its principle and realization are relatively simple, but it is strongly affected by noise. Interpolation estimation, which uses the CFR of the reference-signal REs to calculate the CFR of the data REs and needs no other channel information, includes linear interpolation, Gaussian interpolation, and so on. Two basic transform-domain estimation methods are used in LTE: DFT-based and DCT-based transform-domain estimation.
Since DFT-based transform-domain estimation can directly reuse the FFT module of the OFDM system, it is more widely used. Because MMSE estimation uses the second-order statistics of the channel, its performance is better than that of the other estimation methods, whereas its computational complexity is the highest; there are three common variants, namely the MMSE algorithm, the simplified MMSE algorithm, and the SVD algorithm. Building on the channel estimation algorithms for the SISO scenario, reference-signal design and channel estimation algorithms for the LTE uplink MU-MIMO scenario are then investigated. An improvement of the existing reference signal is introduced, which uses cyclic shifts with maximum spacing to reduce the interference between different cyclic shifts; it meets the requirements of MU-MIMO while remaining compatible with earlier releases. The channel estimation algorithm is a joint estimation algorithm based on DFT-based transform-domain estimation, using a dynamic noise threshold and a frequency-domain window function to decrease the MSE introduced by CIR energy leakage and the frequency-domain Gibbs phenomenon. Using the MATLAB-based LTE uplink physical-layer simulation platform, the simulation results of the improved estimation algorithm are compared with those of the basic DFT-based transform-domain estimation algorithm. The simulation curves show that the improved DFT-based transform-domain estimation algorithm realizes its expected function, and that the use of the dynamic noise threshold and frequency-domain window function improves the accuracy of channel estimation.
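The two-step procedure the abstract describes for the SISO case, LS estimation at the pilot REs followed by interpolation onto the data REs, can be sketched as below. This is a minimal illustration, not the thesis's MATLAB implementation: the function names, the pilot layout, and the flat extrapolation at the band edges are assumptions of the example.

```python
def ls_estimate(y, x):
    """Least-squares CFR estimate at a pilot RE: H_LS = Y / X."""
    return y / x

def linear_interp(pilot_idx, pilot_cfr, n_sc):
    """Linearly interpolate pilot CFR values onto all n_sc subcarriers.

    Outside the first/last pilot the nearest pilot value is held
    (flat extrapolation, an assumption of this sketch).
    """
    cfr = [0j] * n_sc
    for k in range(n_sc):
        if k <= pilot_idx[0]:
            cfr[k] = pilot_cfr[0]
        elif k >= pilot_idx[-1]:
            cfr[k] = pilot_cfr[-1]
        else:
            # find the bracketing pilot pair and blend linearly
            for p in range(len(pilot_idx) - 1):
                k0, k1 = pilot_idx[p], pilot_idx[p + 1]
                if k0 <= k <= k1:
                    t = (k - k0) / (k1 - k0)
                    cfr[k] = (1 - t) * pilot_cfr[p] + t * pilot_cfr[p + 1]
                    break
    return cfr

# LS at two pilots, then fill the data REs in between:
pilots = [0, 4]
h_pilot = [ls_estimate(2 + 2j, 1 + 1j), ls_estimate(3 + 0j, 1 + 0j)]
h_full = linear_interp(pilots, h_pilot, 5)
```

The cubic-spline or Gaussian interpolation mentioned in the abstract would replace the linear blend in the inner loop; the overall two-step structure stays the same.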
【Key words】 LTE; Channel Estimation; MU-MIMO; Reference Signal; Least Squares (LS) Estimation; Interpolation Estimation; Transform Domain Estimation;
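The core of the DFT-based transform-domain estimation with a noise threshold, as summarized in the abstract, can be sketched in a few lines: IDFT the raw LS estimate to the time domain (CIR), zero the taps whose power falls below a threshold, and DFT back. A naive DFT stands in for the receiver's FFT module; the threshold value and tap layout are assumptions of the example, and the thesis's dynamic threshold selection and frequency-domain windowing are omitted.

```python
import cmath

def dft(x, inverse=False):
    """Naive O(N^2) DFT/IDFT, standing in for the OFDM FFT module."""
    n = len(x)
    s = 1 if inverse else -1
    out = [sum(x[m] * cmath.exp(s * 2j * cmath.pi * m * k / n)
               for m in range(n)) for k in range(n)]
    return [v / n for v in out] if inverse else out

def dft_denoise(h_ls, threshold):
    """DFT-based transform-domain estimation with a fixed noise threshold.

    1. IDFT the raw LS CFR estimate to the time domain (CIR).
    2. Zero every tap whose power is below the threshold (treated as noise).
    3. DFT back to the frequency domain.
    """
    cir = dft(h_ls, inverse=True)
    cir = [t if abs(t) ** 2 >= threshold else 0j for t in cir]
    return dft(cir)
```

With a dynamic threshold, the fixed `threshold` argument would instead be derived from an estimate of the noise floor in the tail taps of the CIR, which is where the improvement described in the abstract comes in.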