
Webinar: Understanding 5G NB-IoT NTN System Performance

Matching satellite configurations to 5G NB-IoT capacity needs

Every satellite network has a different system set-up, and it can be a challenge to figure out how to pair it with 5G NB-IoT NTN while achieving proper network performance and link conditions.

In this webinar, we shed light on common questions that satellite operators and service providers need answered regarding link-level and system-level performance: obtainable capacity, throughput, latency, number of devices, antenna design, and coverage. We discuss how to answer these questions while accounting for topics such as individual hardware platform details and fading models.

Enjoy this free webinar in which Rene Brandborg Sørensen and Juline Hilsch discuss: 

  • How 5G NB-IoT performs in individual satellite systems 
  • How to investigate trade-offs in NB-IoT performance and satellite configurations 
  • Concrete simulation examples comparing a non-GEO and a GEO system 

Sign up now and watch at your convenience:

Watch now

Dive into the webinar Q&A

How do you know that your models are simulating “real-life” scenarios?

In essence, the realism of the feasibility study depends on the configuration of the scenario (input parameters), and the results are generally approximations or worst/best-case results; where applicable they have been compared to similar state-of-the-art (SoTA) results. All modelling is an attempt to deconstruct or approximate reality in a way that we can more easily deal with. In our feasibility study we have divided the RAN (radio access network) into three major parts: the fading channel, the link level, and the system level. We can develop fading channels based on ray tracing, which is very realistic, or use a more abstract/generalised model, or 3GPP-standardized models, depending on choice. On the link level we run extensive Monte Carlo simulations to find the link performance under the chosen fading model. On the system level we have rigorous analytical models which account for many protocol aspects and signalling overheads (e.g. the various message sequences); this level relies on the realism of the two layers below.
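The link-level Monte Carlo idea mentioned above can be illustrated with a minimal sketch: estimating the uncoded bit error rate of a BPSK link over a flat Rayleigh fading channel. This is not the actual GateHouse simulator (which models NB-IoT waveforms and standardized NTN channels); it is a toy example showing the methodology of averaging link performance over random channel realizations.

```python
import numpy as np

rng = np.random.default_rng(0)

def ber_rayleigh_bpsk(snr_db, n_bits=200_000):
    """Monte Carlo estimate of BPSK BER over a flat Rayleigh fading channel."""
    snr = 10 ** (snr_db / 10)
    bits = rng.integers(0, 2, n_bits)
    symbols = 1 - 2 * bits  # BPSK mapping: 0 -> +1, 1 -> -1
    # Unit-power Rayleigh channel coefficients and complex Gaussian noise
    h = (rng.standard_normal(n_bits) + 1j * rng.standard_normal(n_bits)) / np.sqrt(2)
    noise = (rng.standard_normal(n_bits) + 1j * rng.standard_normal(n_bits)) / np.sqrt(2 * snr)
    y = h * symbols + noise
    # Coherent detection: project received sample onto the channel estimate
    detected = (np.real(y * np.conj(h)) < 0).astype(int)
    return np.mean(detected != bits)

for snr_db in (0, 10, 20):
    print(f"SNR {snr_db:2d} dB -> BER ~ {ber_rayleigh_bpsk(snr_db):.4f}")
```

The estimated BER can be checked against the closed-form Rayleigh BPSK result, 0.5·(1 − √(snr/(1+snr))); the same averaging approach scales up to full NB-IoT waveforms and NTN fading profiles.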


Which use cases for 5G NB-IoT would work best for GEO vs non-GEO sats?

The use case would be delay-tolerant applications for both LEO and GEO. GEO has the advantage of providing terrestrial-like cells, while LEO has the advantage of providing global (discontinuous) coverage and a lower propagation delay. It is cheaper to launch satellites into LEO than into GEO, so a GEO payload can typically be more expensive and justify an increased power budget compared to LEO satellite payloads. The new space race with cubesats is especially allowing low-cost LEO payloads to be launched.


On what frequency band is this model derived?

The carrier frequency (band of operation) is a parameter in the configuration of the feasibility study. In general, the frequency will change the link budget and the Doppler characteristics.


Can this simulator be used by cellular operators to find out which satellite gives good coverage and capacity in a given geography?

The feasibility study allows for ascertaining system-level KPIs (system capacity, UE QoS (throughput, latency), and UE energy consumption). This is done on the basis of the scenario definition, so it is indeed possible to define a specific geographic area, say the Himalayas, and ascertain the performance of a cell or a UE in that location.


What bandwidth can be reached (in bits per second) ?

The peak throughput is a bit less than for terrestrial NB-IoT: around 258 kbit/s in PDSCH (DL) and the same in PUSCH (UL) at the link level, without accounting for propagation time. In reality, the obtainable throughput depends heavily on the link budget throughout the cell, which is a function of the satellite payload. In our feasibility study we can take this evaluation one step further to account for overhead in terms of static signalling and dynamic message exchanges (an application payload is embedded in a larger message exchange, e.g. RA+).
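The gap between peak link rate and application-level throughput can be sketched with some back-of-the-envelope arithmetic. All figures below except the 258 kbit/s peak rate are assumed, illustrative values (payload size, per-exchange signalling overhead, effective round-trip delay), not outputs of the feasibility study.

```python
# Assumed, illustrative numbers; only the peak rate comes from the webinar.
peak_link_rate_kbps = 258      # approximate NB-IoT peak rate at the link level
app_payload_bytes = 50         # example application payload
exchange_overhead_bytes = 120  # assumed RA + RRC + header bytes per exchange
rtt_s = 0.5                    # assumed effective round-trip / scheduling delay

# Protocol efficiency: useful bytes over total bytes on the air
goodput = app_payload_bytes / (app_payload_bytes + exchange_overhead_bytes)

# Time on air for the whole exchange, plus waiting time
total_bits = (app_payload_bytes + exchange_overhead_bytes) * 8
tx_time_s = total_bits / (peak_link_rate_kbps * 1000)
effective_kbps = app_payload_bytes * 8 / (tx_time_s + rtt_s) / 1000

print(f"protocol efficiency: {goodput:.1%}")
print(f"effective application throughput: {effective_kbps:.2f} kbit/s")
```

Even with a 258 kbit/s peak, the delay and signalling around a small payload dominate, which is why the feasibility study models the full message exchange rather than quoting the link-level peak alone.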


Does the beam center move with the satellite movement in NGSO or does it “track” the location of the NB-IOT devices in FOV?

There are two scenarios defined by 3GPP in the NGSO case: 1) Earth-fixed cells, where an NGSO satellite steers its beams such that the cell projected on the ground does not move and 2) Earth-moving cells, where an NGSO satellite has a fixed beam direction, such that the cell moves around with the satellite.


Why is this based on 5G ? Is there a technical limitation that prevented this NB-IoT to work with 4G standards ?

5G is a set of requirements for networks, as was 4G. In 5G, one of the targeted use cases is massive machine-type communications (mMTC). The requirement for a 5G mMTC technology is that it must be able to service 1 million devices per km², each sending 32 bytes of L2 data every 2 hours. After the requirements had been set, the development of new technologies for 5G started. It was quickly found that NB-IoT and eMTC were sufficient for this requirement (terrestrially) given enough channels. Therefore these radio access networks are 5G compliant and hence now called 5G. In the backbone of the network there is a core network; the 5G variant is called 5GC (5G core) and the 4G variant is called EPC (evolved packet core). Even though the RAN remains largely the same (but has developed over the 3GPP releases), there are some differences in the base station depending on whether it interfaces with 5GC or EPC.
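The mMTC requirement quoted above translates into a surprisingly modest aggregate data rate, which is a quick check on why NB-IoT and eMTC were found sufficient given enough channels:

```python
# 5G mMTC requirement: 1 million devices per km^2, 32 bytes of L2 data every 2 h
devices_per_km2 = 1_000_000
payload_bits = 32 * 8          # 32 bytes of L2 data per report
report_interval_s = 2 * 3600   # one report every 2 hours

offered_load_bps = devices_per_km2 * payload_bits / report_interval_s
print(f"offered L2 load: {offered_load_bps / 1e3:.1f} kbit/s per km^2")
```

At roughly 36 kbit/s per km² of offered L2 payload, a few 180 kHz NB-IoT channels can carry the raw load; the real engineering challenge lies in random access contention and signalling overhead, which the feasibility study models explicitly.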


It was mentioned of 15+ waveforms from Gatehouse. I wonder if there are examples and more details. Thanks.

We have developed waveforms ranging from GMR-1 to DAMA protocols, to Inmarsat BGAN, to 5G NB-IoT – for military purposes as well as commercial services. If you'd like more details on a specific waveform, just let us know.


Are you recruiting folks from Telecom domain or only from SatCom?

Yes, we recruit from telecom as well. We currently have several open job listings on our website. The engineering and management teams at GateHouse are very diverse, not only having extensive experience in the domains of telecom and satcom, but also being industry leaders in specific technical areas, such as eNodeB, gNodeB, waveforms and system architecture within the non-terrestrial network technology area. The majority of the team holds a Master's degree, while other colleagues have a PhD background specialized in telecom or satcom.


Thanks for the presentation. Can I ask whether you have done any study concerning potential interference between terrestrial component and NTN component within the same network?

We have not studied interference between TN and NTN. The networks should be separated in frequency, with appropriate guard bands handling Doppler shift in the NGSO case. The bands and channels allocated for NTN and TN are being determined by standardization organisations like ITU, 3GPP and ETSI. As a general rule, you can count on interference not being allowed.


Is it possible to emulate 5G NB-IoT network links?

Yes, real-life testing is possible with an in-orbit emulator.


How does the system cope with finding satellites when both devices and satellites are moving?

3GPP has defined functionality with respect to the channel raster such that UEs will always be able to look for, find and appropriately identify any available channel. The trick is to find an available cell by searching for that particular channel while in coverage of a serving satellite. This can be helped by satellite assistance information, a feature that is expected to be settled and included in Rel-17.


What has GHS / 3GPP done to minimize signalling overhead?

NB-IoT is an LPWAN (low-power wide-area network) protocol; such protocols are optimized for long-range transmission of small data packets. Thus NB-IoT already has comparatively little signalling overhead compared to other protocols (which is why the feature set is also minimized). Further, GH is implementing DoNAS in its waveform, and it is already accounted for in the analysis.


How are your simulations handling dynamics of moving sats?

In the case of a GEO satellite, the cell has a static link budget, and the elevation angle toward the satellite does not vary over time. In the case of an NGSO earth-fixed cell, the cell has a fixed position, and so would a stationary UE within it, but the link budget and elevation angles are dynamic and change over time, so we compute these for a satellite pass. In the case of an NGSO earth-moving cell, the link budget and elevation angles are static within the cell, but the cell moves over the UE. This is equivalent to a UE travelling within a GEO cell (at approximately 7.3 km/s or so 🙂 )
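The time-varying geometry of an NGSO pass can be sketched from first principles. The snippet below computes elevation angle and slant range for a UE directly under the ground track of an assumed 600 km LEO satellite; the altitude and orbital speed are illustrative, not taken from the webinar's scenarios.

```python
import numpy as np

R_E = 6371e3   # mean Earth radius (m)
h = 600e3      # assumed LEO altitude (m)
v = 7.56e3     # approx. orbital speed at 600 km altitude (m/s)

# Central angle between UE and sub-satellite point during an overhead pass
t = np.linspace(-300, 300, 7)     # seconds relative to zenith
psi = v * t / (R_E + h)           # central angle (rad)

# Slant range from the law of cosines, elevation from the triangle geometry
slant = np.sqrt(R_E**2 + (R_E + h)**2 - 2 * R_E * (R_E + h) * np.cos(psi))
elev = np.degrees(np.arcsin(((R_E + h) * np.cos(psi) - R_E) / slant))

for ti, e, d in zip(t, elev, slant / 1e3):
    print(f"t = {ti:+6.0f} s  elevation = {e:5.1f} deg  slant range = {d:6.0f} km")
```

The slant range (and thus the path loss) changes by several dB within minutes, which is why the feasibility study evaluates the link budget across a full satellite pass rather than at a single geometry.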


Does 5G NB-IoT work on Ka/Ku band?

Rel-17 will work on the S-band, but preliminary work has already started on the Ka-band. It is likely that higher bands will be supported in future releases. The higher frequencies are a source of wider spectrum/bandwidth for NTN networks, but there are major challenges involved, in particular the increased signal propagation loss. It could very well be infeasible to launch Ka/Ku-band on cubesat payloads due to the limited power budget.
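The propagation penalty of moving up in frequency can be quantified with the free-space path loss formula. The 600 km slant range below is an assumed zenith-pass LEO figure for illustration:

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 3e8  # speed of light (m/s)
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

d = 600e3  # assumed LEO slant range at zenith (m)
for label, f in (("S-band (2 GHz)", 2e9), ("Ka-band (20 GHz)", 20e9)):
    print(f"{label}: FSPL = {fspl_db(d, f):.1f} dB")
```

The tenfold frequency increase costs exactly 20·log10(10) = 20 dB of path loss (before rain and atmospheric losses, which also grow at Ka-band), which a power-limited cubesat payload may not be able to recover.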


Apart from UEs and satlinks, is there any need for ground infrastructure to establish 5G IoT communication?

Indeed, the radio access networks (RAN) NB-IoT, LTE, LoRaWAN, etc. are just the communication link between UEs and satellites. To make this link useful, a link to the core network on Earth must be established. This latter link is known as the feeder link in satcom terminology and is established between the satellite and large ground stations. The feeder link must provide sufficient capacity for the cumulative RAN information (plus other telemetry) to be exchanged, which is why ground stations typically have large steerable antennas and a high transmission power.


What are considerations for latency for IoT use case?

The latency in NTN is larger than in terrestrial networks due to the larger propagation delay. In some satellite constellations, coverage cannot be provided continuously on the ground either. So IoT devices for NTN must be delay tolerant.
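A lower bound on the extra latency comes directly from the service-link geometry. The snippet below computes the minimum one-way propagation delay (UE at nadir) for an assumed 600 km LEO altitude and for GEO; real delays are larger at lower elevation angles and once the feeder link and processing are included.

```python
c = 299_792_458  # speed of light in vacuum (m/s)

def one_way_delay_ms(altitude_m):
    """Minimum one-way service-link propagation delay, UE directly under the satellite."""
    return altitude_m / c * 1e3

for name, alt in (("LEO (600 km)", 600e3), ("GEO (35 786 km)", 35_786e3)):
    print(f"{name}: one-way delay >= {one_way_delay_ms(alt):.1f} ms")
```

A couple of milliseconds for LEO versus well over 100 ms for GEO, per direction; for delay-tolerant IoT traffic either is acceptable, but timers and retransmission windows in the protocol must be dimensioned for it.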


Has GateHouse completed any live OTA trials of NTN NB-IoT in LEO/GEO?

GateHouse is developing a waveform for NB-IoT, and back in 2021 we successfully sent NB-IoT synchronisation signals on an 800 MHz carrier to a GEO satellite from the ground.


Are there any satellite crosslink capabilities providing global Satcom coverage vice just connectivity within one satellite footprint?

Yes, inter-satellite links (ISL) can be used for networking and routing between satellites. However, in Rel-17 the focus has been on bent-pipe satellite payloads, i.e. satellites that act as relays, where the ground station is the actual base station. So the focus in a future release first needs to switch to regenerative payloads, i.e. base stations onboard the satellite, and then to ISL later. Nothing hinders ISL at the moment – it is just not standardized.


What are the typical messages lengths (in kilobytes) that can be sent and received via satellite NB-IoT? Does it compare with cellular NB-IoT?

The transport block sizes in NTN NB-IoT are the same as in NB-IoT, so the difference lies in the fading model and the link budget. Provided that the link budget of a satellite payload is comparable to that of a terrestrial cell, the typical message lengths will be comparable between TN and NTN. Basically, you should in most cases be able to expect TN-like performance if the satellite payload is well designed.


How does the signaling overhead compare for the satellite assistance SIB in LEO vs GEO configurations?

In short, GEO will have little overhead, while NGSO, and especially LEO, will see more overhead, but we expect at most a few percent overhead on the anchor channel. Two SIBs are defined for NTN IoT: the first is for uplink synchronisation, and the second (to be defined in May) is for helping UEs predict coverage in discontinuous-coverage scenarios, to better enable mobile-originated (MO) traffic. The first SIB has a fixed size regardless of the use case, but in LEO it may be necessary to transmit it, for example, once per second (depending on the orbit, satellite payload, GNSS and the band of interest), whereas in GEO a UE need only receive it once. Overall this SIB should take up at most a few percent of the anchor channel. The SIB for satellite assistance information (SAI) is not defined yet, but we expect it to be of variable size with plenty of optional parameters. This SIB-SAI is optional and should not be an overhead in GEO; it should be expected as overhead in discontinuous NGSO only. The SIB-SAI need only be received by UEs once, but the overhead will again be larger for LEO, where the satellite moves faster – a rate of once per 5 or 10 seconds should be feasible.
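The "at most a few percent" estimate follows from simple rate arithmetic. The SIB size, broadcast period and anchor-channel rate below are illustrative assumptions (the SIBs are not fully specified), purely to show the shape of the calculation:

```python
def sib_overhead(sib_bits, period_s, anchor_kbps):
    """Fraction of the anchor channel consumed by one periodically broadcast SIB."""
    return sib_bits / period_s / (anchor_kbps * 1000)

# Illustrative assumptions, not standardized values:
# 200-bit SIB, 25 kbit/s usable downlink rate on the anchor channel.
for name, period in (("LEO, 1 s broadcast period", 1.0),
                     ("GEO, 60 s broadcast period", 60.0)):
    print(f"{name}: ~{sib_overhead(200, period, 25):.3%} of anchor channel")
```

Under these assumptions the frequently repeated LEO SIB stays below one percent of the anchor channel, while the GEO case is negligible, consistent with the qualitative answer above.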


Do you apply beamforming in reception? If yes, how do you keep beams in the right direction (elevation and azimuth)?

In NTN IoT the goal is to reuse the hardware platforms of terrestrial cellular. So the UEs are essentially similar to handheld devices with an omnidirectional antenna. Beamforming may be applied from the satellite site to orient the beam towards a specific geolocation for the “earth-fixed cell” scenario.


What are the currently considered strategies for dealing with Doppler drift in the NGSO setting? Are UEs supposed to precompensate the Doppler?

Yes, in Rel-17 the UE will handle the compensation. In the downlink, the UE will synchronize to the Doppler-shifted NPSS/NSSS signals as usual; it will then decode an ephemeris (a description of the serving satellite's orbit, accurate for a moment, say 1 second), which allows the UE to precompensate for the Doppler effect when it transmits in the uplink direction (RACH/PUSCH). This will be the way for NTN IoT (NB-IoT and eMTC) and also NTN NR.
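The size of the shift the UE has to precompensate is easy to estimate. The carrier frequency and worst-case radial velocity below are assumed, round-number values for an S-band LEO scenario:

```python
c = 299_792_458  # speed of light in vacuum (m/s)
f_c = 2e9        # assumed S-band carrier frequency (Hz)
v_radial = 7.0e3 # assumed worst-case radial velocity of a LEO satellite toward the UE (m/s)

# First-order Doppler shift: f_d = f_c * v_radial / c
doppler_hz = f_c * v_radial / c
print(f"max Doppler shift ~ {doppler_hz / 1e3:.1f} kHz "
      f"({v_radial / c * 1e6:.1f} ppm of the carrier)")
```

Tens of kilohertz is far larger than an NB-IoT subcarrier spacing (15 kHz or 3.75 kHz), which is why ephemeris-based precompensation in the UE, rather than receiver tolerance alone, is the chosen strategy.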


Is there a difference between cell size and beam size?

Yes. A “beam” refers to the RF or ‘physical’ power from the TX side, which is a continuous function. A “cell” is a logical entity on the RX side in a cellular network and is determined as the area within the “beam” where certain criteria are met: synchronisation, and SNR above threshold.


Can Ray-tracing be done for different areas, mountains, ocean, desert?

Yes, the example given in the presentation is arbitrary. Ray tracing can be done for different terrain and geographic locations. We can also “make up” specific scenarios or use 3GPP fading models.


Are there standardized models for fading simulation made by 3GPP as well?

Yes, 3GPP has standardised CDL and TDL fading models for NTN based on the “IST WINNER II” model.