In essence, the realism of the feasibility study depends on the configuration of the scenario (input parameters), and the results are generally approximations or worst/best-case results; where applicable they have been compared to similar SoTA results. All modelling is an attempt to deconstruct or approximate reality in a way that we can more easily deal with. In our feasibility study we have divided the RAN (radio access network) into three major parts: the fading channel, the link level and the system level. We can develop fading channels based on ray tracing, which is very realistic, or use a more abstract/generalised model, or 3GPP standardised models, depending on choice. On the link level we do extensive Monte Carlo simulations to find the link performance given the chosen fading model. On the system level we have rigorous analytical models, which account for many protocol aspects and signalling overheads (e.g. the various message sequences); this level relies on the realism of the two layers below.
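To give a flavour of what a link-level Monte Carlo simulation looks like, here is a minimal sketch (not our actual simulator): it estimates the bit-error rate of BPSK over a flat Rayleigh fading channel with AWGN. The modulation, channel model and trial count are illustrative assumptions only.

```python
import math
import random

def ber_rayleigh_bpsk(snr_db, n_trials=100_000, seed=1):
    """Monte Carlo BER estimate for BPSK over flat Rayleigh fading."""
    rng = random.Random(seed)
    snr = 10 ** (snr_db / 10)              # linear average SNR
    noise_std = math.sqrt(1 / (2 * snr))   # per-dimension noise std
    errors = 0
    for _ in range(n_trials):
        # Rayleigh fading amplitude |h|, with h ~ CN(0, 1)
        h = math.hypot(rng.gauss(0, math.sqrt(0.5)),
                       rng.gauss(0, math.sqrt(0.5)))
        rx = h * 1.0 + rng.gauss(0, noise_std)   # transmit BPSK symbol +1
        if rx < 0:                               # coherent sign decision
            errors += 1
    return errors / n_trials

# Can be sanity-checked against the closed form 0.5*(1 - sqrt(snr/(1+snr)))
```

A real link-level study sweeps SNR and uses the standardised fading models and actual NB-IoT waveform processing, but the structure (draw channel, add noise, detect, count errors) is the same.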
The use case would be delay-tolerant applications for both LEO and GEO. GEO has the advantage of providing terrestrial-like cells, while LEO has the advantage of providing global (discontinuous) coverage and a lower propagation delay. It is cheaper to launch satellites into LEO than into GEO, so a GEO payload can typically be more expensive and justify an increased power budget compared to LEO satellite payloads. The new space race with CubeSats is especially allowing low-cost LEO payloads to be launched.
The carrier frequency (band of operation) is a parameter in the configuration of the feasibility study. In general, the frequency changes the link budget and the Doppler characteristics.
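As an illustrative sketch of how the carrier frequency enters both effects: free-space path loss grows with frequency, and so does the worst-case Doppler shift for a LEO pass. The slant range and radial velocity below are example assumptions, not study results.

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def fspl_db(freq_hz, dist_m):
    """Free-space path loss in dB."""
    return 20 * math.log10(4 * math.pi * dist_m * freq_hz / C)

def max_doppler_hz(freq_hz, radial_v_ms):
    """Worst-case Doppler shift for a given radial velocity."""
    return freq_hz * radial_v_ms / C

slant = 600e3          # assumed LEO slant range at zenith [m]
v_rad = 7_000.0        # assumed worst-case radial velocity [m/s]
for f in (800e6, 2e9):  # example carriers
    print(f"{f/1e9:.1f} GHz: FSPL {fspl_db(f, slant):.1f} dB, "
          f"max Doppler {max_doppler_hz(f, v_rad)/1e3:.1f} kHz")
```

Doubling the carrier frequency adds 6 dB of free-space loss and doubles the Doppler shift, which is why the band choice propagates through the whole study.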
The feasibility study allows for ascertaining system-level KPIs (system capacity, UE QoS (throughput, latency) and UE energy consumption). This is done on the basis of the scenario definition, so it is indeed possible to define a specific geographic area, say the Himalayas, and ascertain the performance of a cell or a UE in that location.
The peak throughput is a bit less than for terrestrial NB-IoT: around 258 kbit/s in PDSCH (DL) and the same in PUSCH (UL) at the link level, without accounting for propagation time. In reality the obtainable throughput will depend heavily on the link budget throughout the cell, which is a function of the satellite payload. In our feasibility study we can take this evaluation one step further and account for overhead in terms of static signalling and dynamic message exchanges (an application payload is embedded in a larger message exchange, e.g. RA+).
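A back-of-envelope sketch of why this overhead accounting matters: once a small application payload is wrapped in a full message exchange (random access, signalling, headers), the effective application-level throughput is far below the link-level peak. All byte counts and timings here are illustrative assumptions.

```python
app_bytes = 32      # application payload [bytes]
ovh_bytes = 96      # assumed signalling/header bytes around it
t_exchange = 2.5    # assumed duration of the full RA + data exchange [s]

goodput = 8 * app_bytes / t_exchange              # application-level bit/s
efficiency = app_bytes / (app_bytes + ovh_bytes)  # payload share of bytes sent

print(f"goodput: {goodput:.1f} bit/s, efficiency: {efficiency:.0%}")
```

With these assumed numbers the device sees ~100 bit/s of useful throughput, orders of magnitude below the 258 kbit/s link-level peak, which is exactly the gap the system-level analysis quantifies.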
There are two scenarios defined by 3GPP in the NGSO case: 1) Earth-fixed cells, where an NGSO satellite steers its beams such that the cell projected on the ground does not move and 2) Earth-moving cells, where an NGSO satellite has a fixed beam direction, such that the cell moves around with the satellite.
5G is a set of requirements for networks, as was 4G. In 5G, one of the targeted use cases is massive machine-type communications (mMTC). The requirement for a 5G mMTC technology is that it must be able to service 1 million devices per km2 sending 32 bytes of L2 data every 2 hours. After the requirements had been set, the development of new technologies for 5G started. It was quickly found that NB-IoT and eMTC were sufficient for this requirement (terrestrially) given enough channels. Therefore these radio access networks are 5G compliant and hence now called 5G. In the backbone of the network there is a core network; here the 5G variant is called 5GC (5G core) and the 4G variant is called EPC (evolved packet core). Even though the RAN remains largely the same (though it has developed over the 3GPP releases), there are some differences in the base station depending on whether it interfaces with 5GC or EPC.
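The mMTC requirement can be restated as an offered load, which makes clear why NB-IoT and eMTC can meet it given enough channels. This sketch just converts the 3GPP figures into messages per second and bits per second per km2:

```python
devices_per_km2 = 1_000_000
payload_bits = 32 * 8        # 32 bytes of L2 data
period_s = 2 * 3600          # one message every 2 hours

arrival_rate = devices_per_km2 / period_s                  # msg/s per km2
offered_load_bps = devices_per_km2 * payload_bits / period_s

print(f"{arrival_rate:.0f} msg/s/km2, {offered_load_bps/1e3:.1f} kbit/s/km2")
```

The aggregate offered load is only around 36 kbit/s per km2; the challenge is not raw capacity but handling roughly 139 random-access attempts per second per km2 from deep-coverage devices.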
We have developed waveforms ranging from GMR-1 to DAMA protocols, to Inmarsat BGAN, to 5G NB-IoT, for military purposes as well as commercial services. If you'd like more details on a specific waveform, just let us know.
Yes, we recruit from telecom as well. We currently have several open job listings on our website. The engineering and management teams at GateHouse are very diverse, not only having extensive experience in the domains of telecom and satcom, but also being industry leaders in specific technical areas, such as eNodeB, gNodeB, waveforms and system architecture within the non-terrestrial network technology area. The majority of the team holds a Master's degree, while other colleagues have a PhD background specialised in telecom or satcom.
We have not studied interference between TN and NTN. The networks should be separated in frequency, with appropriate guard bands handling the Doppler shift in the NGSO case. The bands and channels allocated for NTN and TN are being determined by standardisation organisations like ITU, 3GPP and ETSI. As a general rule, you can count on interference not being allowed.
Yes, real-life testing with an in-orbit emulator.
3GPP has defined functionality wrt. the channel raster such that UEs will always be able to look for, find and appropriately identify any available channel. The trick is to find an available cell by searching for that particular channel while in coverage of a serving satellite. This can be helped by satellite assistance information, which is a feature that is expected to be settled and included in Rel-17.
NB-IoT is an LPWAN, that is, a low-power wide-area network; such protocols are optimised for long-range transmissions of small data packets. Thus NB-IoT already has comparatively little signalling overhead compared to other protocols (which is why the feature set is also minimised). Further, GH is implementing DoNAS in its waveform, and it is already implemented in the analysis.
In the case of a GEO satellite, the cell has a static link budget and the elevation angle towards the satellite does not vary throughout the cell. In the case of an NGSO earth-fixed cell, the cell has a fixed position and so would a stationary UE within it, but the link budget and elevation angles are dynamic and change over time, so we compute these for a satellite pass. In the case of an NGSO earth-moving cell, the link budget and elevation angles are static within the cell, but the cell moves over the UE. This is equivalent to a UE travelling within a GEO cell (at approximately 7.3 km/s or so 🙂).
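The speed at which an earth-moving cell sweeps over a stationary UE can be sketched from circular-orbit mechanics, projecting the orbital speed onto the ground. The altitude is an example assumption, and the exact figure also depends on Earth's rotation and the pass geometry:

```python
import math

MU = 3.986004418e14   # Earth's gravitational parameter [m^3/s^2]
R_E = 6_371_000.0     # mean Earth radius [m]

def cell_ground_speed_ms(altitude_m):
    """Ground-track speed of a LEO satellite in a circular orbit."""
    r = R_E + altitude_m
    v_orbit = math.sqrt(MU / r)   # circular orbital speed at radius r
    return v_orbit * R_E / r      # projected onto Earth's surface

# e.g. cell_ground_speed_ms(600e3) is roughly 6.9 km/s; Earth's own
# rotation adds or subtracts up to ~0.46 km/s depending on direction.
```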
Rel-17 will work on the S-band, but preliminary work has already started on the Ka-band. It is likely that higher bands will be supported in future releases. The higher frequencies are a source of wider spectrum/bandwidth for NTN networks, but there are major challenges involved with higher frequencies, in particular dealing with the increased signal propagation loss. It could very well be unfeasible to launch Ka/Ku-band payloads on CubeSats due to the limited power budget.
Indeed, the radio access networks (RANs) NB-IoT, LTE, LoRaWAN, etc. are just the communication link between UEs and satellites. To make this link useful, a link to the core network on Earth must be established. This latter link is known as the feeder link in SatCom terminology and is established between the satellite and large ground stations. The feeder link must provide sufficient capacity for the cumulative RAN information (plus other telemetry) to be exchanged, which is why ground stations typically have large steerable antennas and a large transmission power.
The latency in NTN is larger than in terrestrial networks due to the larger propagation delay. In some satellite constellations, coverage cannot be provided continuously on the ground either. So IoT devices for NTN must be delay tolerant.
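A quick sketch of the one-way propagation delay for typical orbits illustrates the point; the slant ranges below are zenith (best-case) distances:

```python
C = 299_792_458.0  # speed of light [m/s]

def one_way_delay_ms(slant_range_m):
    """One-way propagation delay over a given slant range."""
    return 1e3 * slant_range_m / C

for name, d in [("LEO (600 km, zenith)", 600e3),
                ("GEO (35,786 km, zenith)", 35_786e3)]:
    print(f"{name}: {one_way_delay_ms(d):.1f} ms one-way")
```

LEO adds a few milliseconds each way, while GEO adds roughly 120 ms each way (about 240 ms round trip before any processing), versus well under a millisecond in a terrestrial cell.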
GateHouse is developing a waveform for NB-IoT, and back in 2021 we successfully sent NB-IoT synchronisation signals from the ground to a GEO satellite on an 800 MHz carrier.
Yes, inter-satellite links (ISL) can be used for networking and routing between satellites. However, in Rel-17 the focus has been on bent-pipe satellite payloads, i.e. satellites that act as relays where the ground station is the actual base station. So the focus in a future release first needs to switch to regenerative payloads, i.e. base stations onboard the satellite, and then to ISL later. Nothing hinders ISL at the moment; it is just not standardised.
The transport block sizes in NTN NB-IoT are the same as in NB-IoT, so the difference is in the fading model and the link budget. Provided that the link budget of a satellite cell is comparable to that of a terrestrial cell, the typical message lengths will be comparable between TN and NTN. Basically, in most cases you should be able to expect TN-like performance if the satellite payload is well designed.
In short, GEO will have little overhead while NGSO, and especially LEO, will see more overhead, but we expect at most a few percent overhead on the anchor channel. Two SIBs are defined for NTN IoT: the first is for uplink synchronisation and the second (to be defined in May) is for helping UEs predict coverage in discontinuous-coverage scenarios, to better enable mobile-originated (MO) traffic. The first SIB has a fixed size regardless of the use case, but in LEO it may be necessary to transmit it, for example, once per second (this will depend on the orbit, the satellite payload, GNSS and the band of interest), whereas in GEO a UE need only receive it once. Overall this SIB should take up at most a few percent of the anchor channel. The SIB for satellite assistance information (SAI) is not defined yet, but we expect it to be of variable size with plenty of optional parameters. The SIB-SAI is optional and should not be an overhead in GEO; it should be expected as overhead in discontinuous NGSO only. The SIB-SAI need only be received by UEs once, but the overhead will again be larger for LEO, where the satellite moves faster; a rate of once per 5 or 10 seconds should be feasible.
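The anchor-channel overhead of a periodic SIB is simply its on-air time divided by its repetition period. In this sketch the 40 ms on-air time and the repetition periods are assumed example values, not standardised figures:

```python
def sib_overhead_fraction(sib_duration_ms, period_s):
    """Fraction of anchor-channel time consumed by a periodic SIB."""
    return (sib_duration_ms / 1e3) / period_s

# Assumed 40 ms on-air time per SIB transmission:
leo = sib_overhead_fraction(40, 1.0)    # LEO: repeated once per second
geo = sib_overhead_fraction(40, 60.0)   # GEO-like slow repetition rate

print(f"LEO: {leo:.1%}, GEO: {geo:.2%} of anchor channel")
```

With these assumptions a once-per-second SIB costs 4% of the anchor channel in LEO, while a slowly repeated SIB in GEO is well below a tenth of a percent, consistent with the "at most a few percent" expectation above.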
In NTN IoT the goal is to reuse the hardware platforms of terrestrial cellular, so the UEs are essentially similar to handheld devices with an omnidirectional antenna. Beamforming may be applied on the satellite side to orient the beam towards a specific geolocation in the "earth-fixed cell" scenario.
Yes, in Rel-17 the UE handles the compensation. In the downlink, the UE synchronises to the Doppler-shifted NPSS/NSSS signals as usual; it then decodes an ephemeris (a description of the serving satellite's orbit, accurate for a moment, say 1 s), which allows the UE to pre-compensate for the Doppler effect when it transmits in the uplink direction (RACH/PUSCH). This is the approach for NTN IoT (NB-IoT and eMTC) and also for NTN NR.
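The pre-compensation step can be sketched as follows: from the decoded ephemeris the UE derives the satellite's radial velocity, predicts the uplink Doppler shift, and offsets its transmit carrier in the opposite direction so the satellite receives on-frequency. The carrier and velocity values are illustrative assumptions:

```python
C = 299_792_458.0  # speed of light [m/s]

def doppler_hz(f_carrier_hz, radial_v_ms):
    """Doppler shift; positive radial velocity = satellite approaching."""
    return f_carrier_hz * radial_v_ms / C

def precompensated_uplink_hz(f_uplink_hz, radial_v_ms):
    """Shift the uplink carrier opposite to the predicted Doppler."""
    return f_uplink_hz - doppler_hz(f_uplink_hz, radial_v_ms)

f_ul = 2e9        # example S-band uplink carrier [Hz]
v_rad = 6_000.0   # example radial velocity from the ephemeris [m/s]
tx = precompensated_uplink_hz(f_ul, v_rad)
print(f"shift carrier by {f_ul - tx:.0f} Hz")
```

In practice the UE also pre-compensates the timing (propagation delay) from the same ephemeris plus its own GNSS position, so the transmission arrives aligned to the uplink frame.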
Yes, a "beam" refers to the RF or 'physical' power from the TX side, which is a continuous function. A "cell" is a logical entity on the RX side of a cellular network, determined as an area within the "beam" where certain criteria are met: synchronisation, and SNR above a threshold.
Yes, the example given in the presentation is arbitrary. Ray tracing can be done for different terrain and geographic locations. We can also “make-up” specific scenarios or use 3GPP fading models.
Yes, 3GPP has standardised CDL and TDL fading models for NTN based on the IST WINNER II model.