Do you think RTT should be considered as a factor affecting DL throughput or not?
Especially from the Core perspective (not RAN)?
Some vendors start the throughput timer at the first downloaded byte rather than at the client request, i.e. they measure from first data to last data instead of from request to last data.
What do you think?
Throughput <= RWND / RTT
where RWND is the TCP receive window, advertised (offered) by the receiver.
So the RTT, or experienced latency, plays an important role in the achievable throughput.
High bandwidth alone does not guarantee high throughput rates!
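To make the point concrete, here is a minimal sketch (function name and values are my own illustration) of the RWND/RTT bound above, showing how the same receive window yields very different ceilings as RTT grows:

```python
def max_tcp_throughput_bps(rwnd_bytes: int, rtt_seconds: float) -> float:
    """Upper bound on TCP throughput: at most one full receive
    window can be delivered per round trip, regardless of link bandwidth."""
    return rwnd_bytes * 8 / rtt_seconds  # bytes -> bits, per second

# Classic 64 KiB receive window (no window scaling):
print(max_tcp_throughput_bps(65536, 0.050))  # 50 ms RTT  -> 10485760.0 bps (~10.5 Mbit/s)
print(max_tcp_throughput_bps(65536, 0.200))  # 200 ms RTT ->  2621440.0 bps (~2.6 Mbit/s)
```

Even on a 1 Gbit/s link, a 64 KiB window at 200 ms RTT caps a single TCP flow at roughly 2.6 Mbit/s, which is why latency introduced on the Core side still shows up directly in measured DL throughput.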