Cell Search Procedure in WCDMA

The cell search process in WCDMA can be described as follows (for a detailed understanding, I recommend studying each of the physical channels involved in the description below).

i) Every cell transmits its scrambling code (Primary Scrambling Code) via CPICH.
ii) UE detects the P-SCH (Primary Synchronization Code) and figures out the slot boundary (start and end of each slot).
iii) UE detects the S-SCH (Secondary Synchronization Code) and figures out the frame boundary (start and end of each frame) and the scrambling code group.
iv) UE identifies the primary scrambling code from CPICH, along with the cell power and some additional information used in the demodulation process.
v) UE decodes P-CCPCH and reads the MIB. Through this MIB, UE can figure out the SFN.
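The slot and frame boundaries found in steps (ii) and (iii) correspond to fixed WCDMA numerology: at the 3.84 Mcps chip rate, one slot is 2560 chips and one frame is 15 slots. A few lines of Python confirm how these constants yield the 10 ms radio frame:

```python
# WCDMA timing constants behind the slot/frame boundaries in the steps above
CHIP_RATE_HZ = 3.84e6      # WCDMA chip rate
CHIPS_PER_SLOT = 2560
SLOTS_PER_FRAME = 15

chips_per_frame = CHIPS_PER_SLOT * SLOTS_PER_FRAME   # 38400 chips
frame_ms = chips_per_frame / CHIP_RATE_HZ * 1e3      # frame duration in ms
slot_us = CHIPS_PER_SLOT / CHIP_RATE_HZ * 1e6        # slot duration in us

print(f"chips per frame: {chips_per_frame}")
print(f"frame duration : {frame_ms:.1f} ms")       # 10.0 ms
print(f"slot duration  : {slot_us:.1f} us")
```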


CQI

CQI stands for Channel Quality Indicator. As the name implies, it is an indicator carrying information on how good or bad the communication channel quality is. This CQI is used in HSDPA. (LTE also has a CQI for its own purpose.)

CQI is information that the UE sends to the network, and in practice it implies the following two things:
i) the current communication channel quality is this-and-that;
ii) I (the UE) want to receive data with this-and-that transport block size, which in turn can be directly converted into throughput.

In HSDPA, the CQI value ranges from 0 to 30: 30 indicates the best channel quality, and 0 or 1 indicates the poorest. Depending on which value the UE reports, the network transmits data with a different transport block size. If the network gets a high CQI value from the UE, it transmits data with a larger transport block size, and vice versa.

What if the network sends a large transport block even though the UE reports a low CQI? It is highly probable that the UE will fail to decode it (causing a CRC error on the UE side) and send a NACK, forcing the network to retransmit, which wastes radio resources.

What if the UE reports a high CQI even when the real channel quality is poor? In this case, the network would send a large transport block according to the reported CQI, with the same result: the UE would likely fail to decode it, NACK it, and trigger a retransmission that wastes radio resources.

How does the UE measure CQI? This is the least clear topic to me. As far as I know, no standard explicitly describes the mechanism by which CQI is calculated, but it is fairly obvious that the following factors play important roles in the measurement:
  • signal-to-noise ratio (SNR)
  • signal-to-interference plus noise ratio (SINR)
  • signal-to-noise plus distortion ratio (SNDR)
It is unclear exactly how these factors are used and whether any other factors are involved. I was told that the detailed CQI measurement algorithm is up to the UE (chipset) implementation.
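Since the mapping is implementation-specific, the best we can do is illustrate the idea. The sketch below maps a measured SINR to a CQI index in the 0..30 range; the -6 dB floor and the 1-dB-per-step slope are made-up numbers for illustration only, not values from any standard or chipset:

```python
# Toy illustration only: the real SINR-to-CQI mapping is chipset-specific
# and not standardized. The threshold and slope below are invented.
def sinr_to_cqi(sinr_db: float) -> int:
    """Map a measured SINR (dB) to an HSDPA CQI in the range 0..30."""
    if sinr_db < -6.0:
        return 0                      # channel too poor to use
    # hypothetical rule: ~1 CQI step per 1 dB above -6 dB
    cqi = int(round(sinr_db + 6.0))
    return min(cqi, 30)               # clamp to the maximum CQI

for sinr in (-10, 0, 12, 30):
    print(f"SINR {sinr:4} dB -> CQI {sinr_to_cqi(sinr)}")
```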


FDD in LTE



Overview - FDD

The highest-level view of FDD LTE from 36.211 is as follows. It shows only the structure of one frame in the time domain; it does not show any structure in the frequency domain.
Some of the high-level descriptions you can get from this figure are:
i) The time duration of one frame (one radio frame, one system frame) is 10 ms. This means we have 100 radio frames per second.
ii) The number of samples in one frame (10 ms) is 307,200. This means the number of samples per second is 307,200 x 100 = 30.72 M samples.
iii) The number of subframes in one frame is 10.
iv) The number of slots in one subframe is 2. This means we have 20 slots within one frame.
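The numbers in the list above all follow from the 30.72 Msps sample rate and the 10 ms frame, as a quick calculation shows:

```python
# FDD LTE frame numerology from 36.211 (30.72 Msps numerology)
SAMPLE_RATE = 30.72e6          # samples per second
FRAME_MS = 10
SUBFRAMES_PER_FRAME = 10
SLOTS_PER_SUBFRAME = 2

samples_per_frame = int(SAMPLE_RATE * FRAME_MS / 1000)       # 307200
slots_per_frame = SUBFRAMES_PER_FRAME * SLOTS_PER_SUBFRAME   # 20
frames_per_second = 1000 // FRAME_MS                         # 100

print(f"samples per frame : {samples_per_frame}")
print(f"slots per frame   : {slots_per_frame}")
print(f"frames per second : {frames_per_second}")
```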


So is one slot the smallest structure in the time domain? No. If you magnify this frame structure one step further, you get the following figure.
Now you see that one slot is made up of 7 small blocks called 'symbols'. (One symbol is a certain time span of signal that carries one point in the I/Q constellation.)
And you see even smaller structures within a symbol. At the beginning of the symbol there is a very small span called the 'Cyclic Prefix', and the remaining part is the real symbol data.
There are two different types of Cyclic Prefix: the normal Cyclic Prefix, and the 'Extended Cyclic Prefix', which is longer than the normal one. (Since the length of one slot is fixed and cannot be changed, if we use the Extended Cyclic Prefix, the number of symbols that can be accommodated within a slot has to decrease. So we can have only 6 symbols per slot with the Extended Cyclic Prefix.)
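The trade-off between CP length and symbol count can be checked with the standard sample counts at 30.72 Msps (2048-sample symbol; normal CP of 160 samples for the first symbol and 144 for the rest; extended CP of 512 samples). Both configurations fill exactly one 15,360-sample slot:

```python
# One slot = 0.5 ms at 30.72 Msps = 15360 samples
SAMPLES_PER_SLOT = 15360
SYMBOL_SAMPLES = 2048          # useful symbol length at this sample rate

# Normal CP: 7 symbols; the first CP is slightly longer (160 vs 144 samples)
normal = (160 + SYMBOL_SAMPLES) + 6 * (144 + SYMBOL_SAMPLES)

# Extended CP: only 6 symbols fit, each with a 512-sample CP
extended = 6 * (512 + SYMBOL_SAMPLES)

assert normal == SAMPLES_PER_SLOT
assert extended == SAMPLES_PER_SLOT
print("both CP configurations fill exactly one slot of", SAMPLES_PER_SLOT, "samples")
```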


Automatic Neighbor Relation (ANR) in LTE


Manually adding neighbor cells is a very tedious process in GSM and WCDMA networks. As networks become more and more complex, an automatic and more optimized way of adding neighbor cells is required.

ANR comes under the umbrella of Self-Organizing Networks (SON) features. ANR relies on the UE to detect unknown cells and report them to the eNB. There are two major types:

i) UE based ANR
ii) ANR with OAM Support





UE based ANR
· No OAM support is required.
· The UE detects the PCI of an unknown cell when it performs measurements (as configured by the network).
· For inter-frequency or inter-RAT measurements, the eNB needs to configure measurement gaps and/or DRX so the UE can detect PCIs on other frequencies as well.
· The UE reports the unknown PCI to the eNB via an RRC Measurement Report.
· The eNB requests the UE to report the E-UTRAN Cell Global ID (ECGI).
· The UE reads the ECGI from the cell's broadcast channel (BCCH) and reports it.
· The eNB retrieves the target eNB's IP address from the MME to then set up the X2 interface.





ANR with OAM Support
· OAM support is required.
· Every new eNB registers with OAM and downloads a table with the PCI/ECGI/IP information of its neighbors.
· The neighbors also update their own tables with the new eNB's information.
· As in UE-based ANR, the UE detects an unknown PCI and reports it to the eNB.
· The eNB does not request the ECGI and needs no support from the MME.
· The eNB sets up the X2 interface with the help of the mapping table created in the second step above.
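The core of the OAM-supported variant is that mapping table: given a UE-reported PCI, the eNB can resolve the neighbor's ECGI and IP address locally. A minimal sketch of such a lookup (the PCI, ECGI, and IP values below are entirely made up for illustration):

```python
# Hypothetical neighbor relation table as downloaded from OAM.
# With this table the eNB resolves PCI -> (ECGI, IP) locally,
# so no ECGI report from the UE and no MME lookup is needed.
neighbor_table = {
    301: {"ecgi": "311-480-0000301", "ip": "10.0.0.31"},   # invented values
    302: {"ecgi": "311-480-0000302", "ip": "10.0.0.32"},
}

def resolve_unknown_pci(pci: int):
    """Return (ecgi, ip) for a UE-reported PCI, or None if not in the table."""
    entry = neighbor_table.get(pci)
    if entry is None:
        return None        # would fall back to UE-based ANR (ECGI report)
    return entry["ecgi"], entry["ip"]

print(resolve_unknown_pci(301))
print(resolve_unknown_pci(999))
```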





SC-FDMA


The LTE uplink transmission scheme for both FDD and TDD modes is based on SC-FDMA (Single Carrier Frequency Division Multiple Access).
This compensates for a drawback of normal OFDM, which has a very high Peak to Average Power Ratio (PAPR). High PAPR requires expensive and inefficient power amplifiers with high linearity requirements, which increases the cost of the terminal and drains the battery faster.
SC-FDMA solves this problem by grouping the resource blocks together in a way that reduces the linearity requirements on the power amplifier, and thus its power consumption. A low PAPR also improves coverage and cell-edge performance.

Still, SC-FDMA signal processing has some similarities with OFDMA signal processing, so parameterization of downlink and uplink can be harmonized.
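The PAPR difference is easy to see numerically. The sketch below generates one OFDM symbol and one SC-FDMA symbol from the same QPSK data (SC-FDMA adds a DFT-precoding stage before the IFFT, with localized subcarrier mapping) and compares their PAPR. The FFT size, number of occupied subcarriers, and random seed are illustrative choices, not values from any standard:

```python
import numpy as np

rng = np.random.default_rng(0)

N_FFT = 512     # IFFT size (illustrative)
M = 120         # occupied subcarriers (illustrative, ~10 RBs worth)

# Random QPSK symbols
bits = rng.integers(0, 2, size=(2, M))
qpsk = ((1 - 2 * bits[0]) + 1j * (1 - 2 * bits[1])) / np.sqrt(2)

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

# Plain OFDM: map the QPSK symbols straight onto subcarriers, then IFFT
ofdm_freq = np.zeros(N_FFT, dtype=complex)
ofdm_freq[:M] = qpsk
ofdm_time = np.fft.ifft(ofdm_freq)

# SC-FDMA: DFT-precode first, localized mapping, then IFFT
sc_freq = np.zeros(N_FFT, dtype=complex)
sc_freq[:M] = np.fft.fft(qpsk)
sc_time = np.fft.ifft(sc_freq)

print(f"OFDM PAPR   : {papr_db(ofdm_time):.1f} dB")
print(f"SC-FDMA PAPR: {papr_db(sc_time):.1f} dB")  # noticeably lower
```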

OFDMA


OFDMA: Orthogonal Frequency Division Multiple Access.
LTE uses OFDM for the downlink –that is, from the base station to the terminal. OFDM meets the LTE requirement for spectrum flexibility and enables cost-efficient solutions for very wide carriers with high peak rates. OFDM uses a large number of narrow sub-carriers for multi-carrier transmission.
The basic LTE downlink physical resource can be seen as a time-frequency grid. In the frequency domain, the spacing between the subcarriers, Δf, is 15kHz. In addition, the OFDM symbol duration time is 1/Δf + cyclic prefix. The cyclic prefix is used to maintain orthogonality between the sub-carriers even for a time-dispersive radio channel.
One resource element carries QPSK, 16QAM or 64QAM. With 64QAM, each resource element carries six bits.
The OFDM symbols are grouped into resource blocks. The resource blocks have a total size of 180kHz in the frequency domain and 0.5ms in the time domain. Each 1ms Transmission Time Interval (TTI) consists of two slots (Tslot).
In E-UTRA, the downlink modulation schemes QPSK, 16QAM, and 64QAM are available.
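Putting these numbers together: 180 kHz per resource block at 15 kHz subcarrier spacing gives 12 subcarriers per RB, and with 7 symbols per 0.5 ms slot (normal cyclic prefix) the raw bit capacity of one RB follows directly from the modulation order:

```python
# Downlink resource grid numbers from the paragraphs above
SUBCARRIER_SPACING_KHZ = 15
RB_BANDWIDTH_KHZ = 180
SYMBOLS_PER_SLOT = 7          # normal cyclic prefix

subcarriers_per_rb = RB_BANDWIDTH_KHZ // SUBCARRIER_SPACING_KHZ   # 12
symbol_time_us = 1e6 / (SUBCARRIER_SPACING_KHZ * 1e3)             # ~66.7 us (before CP)

bits_per_re = {"QPSK": 2, "16QAM": 4, "64QAM": 6}
for mod, bits in bits_per_re.items():
    per_slot = subcarriers_per_rb * SYMBOLS_PER_SLOT * bits
    print(f"{mod:>6}: {per_slot} raw bits per RB per 0.5 ms slot")
```

These are raw (pre-coding) bit counts per resource block, ignoring reference signals and control overhead.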

System Architecture Evolution(SAE)



System Architecture Evolution (SAE) is the core network architecture of 3GPP's LTE wireless communication standard.
SAE is the evolution of the GPRS Core Network, with some differences.
The main principles and objectives of the LTE-SAE architecture include:
  • A common anchor point and gateway (GW) node for all access technologies
  • IP-based protocols on all interfaces
  • A simplified network architecture
  • An all-IP network
  • All services delivered via the Packet Switched domain
  • Support for mobility between heterogeneous RATs, including legacy systems such as GPRS, but also non-3GPP systems (e.g., WiMAX)
