If we tested throughput for an LTE connection vs an NSA connection with identical bandwidths (20 MHz each, or 3CA 60 MHz vs 60 MHz in NSA, etc.) and kept all other factors constant (similar radio conditions, traffic, modulation, etc.):
Will NSA show a higher throughput value, or will it be identical to LTE?
It’ll be higher on NSA, as NR is more spectrally efficient.
Simply put: each LTE carrier has 18 MHz of useful bandwidth (the remaining 2 MHz is guard band), so 54 MHz useful in total for LTE, while for 5G you’ll have around 58 MHz.
That’s a great point @georgest. Is there any reference for the guard band as a function of bandwidth and SCS selection for NSA?
Also, allow me to deviate a little: does that mean we’re assuming data is transferred 100% on the SCG rather than being split to the MCG as well?
Assuming B1 band:
3CA 60 MHz (3 x 20 MHz), 2x2 MIMO, 256QAM will have a max of ~640 Mbps
60 MHz in NSA (30 kHz SCS) will have a max of ~1.04 Gbps
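For the NR side, the peak rate can be approximated with the 3GPP TS 38.306 (Sec. 4.1.2) formula. A sketch under assumed parameters (2x2 MIMO, 256QAM, 14% FR1 DL overhead; the thread doesn't state the NR MIMO configuration, so the result will differ from the ~1.04 Gbps figure above depending on the layer and overhead assumptions used):

```python
# Approximate NR DL peak data rate per 3GPP TS 38.306 Sec. 4.1.2.
# Assumed parameters (not stated in the thread for NR): 2x2 MIMO,
# 256QAM (q_m=8), R_max = 948/1024, FR1 DL overhead = 0.14.

def nr_peak_rate_bps(n_prb, mu, layers, q_m, r_max=948 / 1024, overhead=0.14):
    symbols_per_sec = 14 * (2 ** mu) * 1000    # OFDM symbols/s at numerology mu
    re_per_sec = n_prb * 12 * symbols_per_sec  # resource elements per second
    return layers * q_m * r_max * re_per_sec * (1 - overhead)

# 60 MHz at 30 kHz SCS (mu=1) -> 162 PRBs (TS 38.101-1 Table 5.3.2-1)
rate = nr_peak_rate_bps(n_prb=162, mu=1, layers=2, q_m=8)
print(f"{rate / 1e9:.2f} Gbps")  # ~0.69 Gbps with 2 layers, doubles with 4
```

Doubling `layers` to 4 roughly doubles the result, which is why stating the MIMO assumption matters when quoting peak figures.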
You can calculate the guard band on your own, no document needed. In LTE the PRB size is fixed at 180 kHz, so multiplying by the number of PRBs for each bandwidth gives you the useful bandwidth, and the rest is guard band. Take my example of BW = 20 MHz, which has 100 PRBs: 100 PRBs x 180 kHz = 18,000 kHz (18 MHz), leaving 2 MHz of guard band. The same approach applies to other bandwidth sizes and also to 5G. But in 5G you have to consider the SCS: if SCS = 30 kHz, you multiply the number of PRBs for the particular bandwidth and SCS by 360 kHz (12 subcarriers x 30 kHz per PRB).
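The calculation above can be sketched in a few lines. The NR PRB count (162 PRBs for 60 MHz at 30 kHz SCS) is taken from 3GPP TS 38.101-1 Table 5.3.2-1; check the spec for other bandwidth/SCS combinations:

```python
# Useful bandwidth vs guard band, as described above.
# LTE PRB width is fixed at 180 kHz; an NR PRB is 12 x SCS wide.

def lte_useful_mhz(n_prb):
    return n_prb * 0.180  # 180 kHz per PRB

def nr_useful_mhz(n_prb, scs_khz):
    return n_prb * 12 * scs_khz / 1000.0

# LTE: 3 x 20 MHz carriers, 100 PRBs each -> 54.00 MHz useful out of 60
lte_total = 3 * lte_useful_mhz(100)
# NR: one 60 MHz carrier at 30 kHz SCS, 162 PRBs -> 58.32 MHz useful
nr_total = nr_useful_mhz(162, 30)

print(f"LTE 3x20 MHz useful: {lte_total:.2f} MHz (guard: {60 - lte_total:.2f} MHz)")
print(f"NR 60 MHz useful:    {nr_total:.2f} MHz (guard: {60 - nr_total:.2f} MHz)")
```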
For the second question, the assumption is peak DL spectral efficiency performance for both the LTE and NR cases.