5G mmWave loss compared to LTE low/mid band?

Hello Experts,

I know we have capacity benefits when using 26 GHz mmWave in 5G NR.
But how much is the loss compared to LTE for example in Band 5 (850 MHz) or Band 3 (1800 MHz)?
What is the workaround or rationale here?
Pros, Cons.

For comparison purposes you can use the well-known Friis transmission equation. Basically, the ratio of received power to transmitted power is proportional to the directivity of the TX antenna, the directivity of the RX antenna, and the square of the wavelength.

Using prediction tools you will find a difference of roughly 23 dB versus Band 3 (and close to 30 dB versus Band 5). This is the reason beamforming with narrow beams needs to be used: the array gain compensates for the extra path loss.
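You can check those numbers yourself from the Friis relation mentioned above. A minimal sketch (assuming pure free-space propagation, isotropic antennas, and an arbitrary distance, which cancels out when comparing two bands):

```python
import math

def fspl_db(freq_hz, dist_m):
    """Free-space path loss (from the Friis equation) in dB:
    FSPL = 20*log10(4*pi*d*f / c)."""
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * dist_m * freq_hz / c)

# The distance cancels when taking the difference between two bands,
# so only the frequency ratio matters: delta = 20*log10(f2/f1).
d = 100.0  # any distance works for the comparison
delta_b3 = fspl_db(26e9, d) - fspl_db(1.8e9, d)   # vs LTE Band 3 (1800 MHz)
delta_b5 = fspl_db(26e9, d) - fspl_db(850e6, d)   # vs LTE Band 5 (850 MHz)

print(f"26 GHz vs 1800 MHz: {delta_b3:.1f} dB")   # ~23.2 dB
print(f"26 GHz vs 850 MHz:  {delta_b5:.1f} dB")   # ~29.7 dB
```

Note this is only the free-space term; real mmWave links also suffer extra penetration and foliage losses on top of this, which beamforming gain has to cover as well.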

Hello Experts,
Why do we need to take phase noise into account at mmWave in 5G, whereas in LTE we didn't care much about that factor?

Phase noise impact increases with carrier frequency, which is why it is more harmful to the signal at mmWave, while at sub-6 GHz or LTE bands its impact is much smaller.
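The frequency dependence can be quantified: when a reference oscillator is multiplied up by a factor N to reach the carrier frequency, its phase noise degrades by 20*log10(N) dB. A quick sketch of that penalty going from an assumed 2 GHz LTE-like carrier up to 26 GHz:

```python
import math

def pn_scaling_db(f_target_hz, f_ref_hz):
    """Phase-noise penalty from multiplying an oscillator up to a
    higher carrier frequency: 20*log10(N), with N = f_target/f_ref."""
    return 20 * math.log10(f_target_hz / f_ref_hz)

penalty = pn_scaling_db(26e9, 2e9)
print(f"2 GHz -> 26 GHz raises phase noise by ~{penalty:.1f} dB")  # ~22.3 dB
```

Roughly 22 dB worse phase noise at 26 GHz than at 2 GHz, all else equal, which is why 5G NR adds phase-tracking reference signals (PT-RS) for mmWave operation.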

I understand, you mean that when I have high phase noise at mmWave, my SINR is poor.
So do we need to monitor phase noise?
Phase noise affects the link quality (poor link or good link), no?

Thanks a lot for your explanation.

Yes, phase noise behaves like any other interference or noise added on top of the clean signal.
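You can see that "behaves like noise" effect in a small simulation: rotate ideal QPSK symbols by a random phase error and measure the error vector magnitude (EVM). This is a sketch with an assumed RMS phase jitter of 0.05 rad, not a calibrated oscillator model:

```python
import cmath
import math
import random

random.seed(0)
sigma_phi = 0.05  # assumed RMS phase error in radians

# Ideal QPSK constellation, then a random symbol stream
constellation = [cmath.exp(1j * (math.pi / 4 + k * math.pi / 2)) for k in range(4)]
tx = [random.choice(constellation) for _ in range(100_000)]

# Phase noise = random rotation of each received symbol
rx = [s * cmath.exp(1j * random.gauss(0.0, sigma_phi)) for s in tx]

# EVM of the rotated symbols: the phase error shows up exactly like
# additive noise around each constellation point
evm = math.sqrt(sum(abs(r - s) ** 2 for r, s in zip(rx, tx)) / len(tx))
snr_db = -20 * math.log10(evm)
print(f"EVM = {evm:.3f}, effective SNR ~ {snr_db:.1f} dB")
```

For small jitter, the effective SNR floor is about -20*log10(sigma_phi) (~26 dB here) no matter how strong the received signal is, which is exactly why high-order modulation at mmWave is limited by phase noise rather than by thermal noise.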

I understand. In other words, phase noise is a measure of how much noise your signal carries over the link.