The explosive growth of the internet and associated services over
the last decade is fueling the need for ever-increasing bandwidth. The number
of intelligent handheld devices is growing exponentially, and with it the
demand for high-speed data services on the move.
Current 3rd Generation (3G) mobile technology copes with this surge in demand
to some extent but cannot satisfy it completely. Long Term Evolution (LTE), a
new “4th Generation” mobile radio access network (RAN) technology, promises
higher data rates (100 Mbps in the downlink and 50 Mbps in the uplink in its
first phase) and reduced data- and control-plane latency, with the aim of
quenching the insatiable thirst for high-speed mobile data access.
Additionally, LTE is designed to support
interoperability with existing mobile network technologies such as GSM, GPRS and
UMTS. LTE also supports scalable bandwidth, from 1.25 MHz to 20 MHz, giving
operators significant deployment flexibility and enabling more rapid roll-outs
across diverse spectrum allocations.
All of these features make LTE a very attractive technology for operators as well as subscribers, and many dozens of operators worldwide have committed to LTE roll-outs in the next two to five years.
All is not rosy, however: the performance demands of LTE are driving up
signaling and data traffic, which places additional load on the network. This
paper discusses the need for, and methods of, optimizing the Stream Control
Transmission Protocol (SCTP) to handle the increased signaling loads in LTE
and 3G networks.
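To make the protocol feature at issue concrete, the short C sketch below
requests SCTP multi-streaming through the standard sockets API on a Linux
host with kernel SCTP support. It is a minimal illustration, not the
optimization method developed in this paper: the stream counts are assumed
values, and port 36412 (the IANA-registered S1AP port) is used only as an
example of an LTE signaling endpoint.

/* Minimal sketch: request multiple SCTP streams per association so that
 * loss on one signaling flow does not head-of-line-block the others.
 * Assumed values: 16 streams, loopback peer, S1AP port 36412. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/socket.h>
#include <netinet/in.h>
#include <arpa/inet.h>
#include <netinet/sctp.h>

int main(void)
{
    /* One-to-one style SCTP socket, analogous to a TCP SOCK_STREAM socket. */
    int sd = socket(AF_INET, SOCK_STREAM, IPPROTO_SCTP);
    if (sd < 0) {
        perror("socket");
        return EXIT_FAILURE;
    }

    /* Negotiate the number of streams during association setup. */
    struct sctp_initmsg init;
    memset(&init, 0, sizeof(init));
    init.sinit_num_ostreams  = 16;  /* illustrative, not from the paper */
    init.sinit_max_instreams = 16;  /* illustrative, not from the paper */
    if (setsockopt(sd, IPPROTO_SCTP, SCTP_INITMSG, &init, sizeof(init)) < 0) {
        perror("setsockopt(SCTP_INITMSG)");
        return EXIT_FAILURE;
    }

    struct sockaddr_in peer;
    memset(&peer, 0, sizeof(peer));
    peer.sin_family      = AF_INET;
    peer.sin_port        = htons(36412);          /* S1AP port, example only */
    peer.sin_addr.s_addr = htonl(INADDR_LOOPBACK);

    /* connect() triggers the SCTP four-way handshake (INIT/INIT-ACK/
     * COOKIE-ECHO/COOKIE-ACK) carrying the stream counts requested above. */
    if (connect(sd, (struct sockaddr *)&peer, sizeof(peer)) < 0) {
        perror("connect");
        return EXIT_FAILURE;
    }
    printf("SCTP association established with multi-streaming requested\n");
    return 0;
}

Because only standard socket calls are used, no SCTP userspace library is
required; the kernel negotiates the stream counts during association setup.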