After learning from this video and spreadsheet, I decided to try to see how the reference signal values I see on my 4G connection compare. This required a little bit of reverse engineering, but I think I figured out a few interesting things.
1. I don't know how much power the tower transmits with, but I saw in several places around the internet that some towers transmit at 44 dBm per antenna. I used this value, and the results are very similar to what I see in my real-life scenarios (see results below).
2. I don't know the path loss, so I used an online RF signal estimator (TowerCoverage, in this case) to estimate the path loss for various frequencies. There's a quick sanity check of both assumptions right after this list.
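To sanity-check those two numbers, here's a rough link-budget sketch in Python. This is my own approximation, not the spreadsheet's exact formulas: I'm assuming a 10 MHz carrier (600 subcarriers), since RSRP is measured on a single resource element, and the free-space formula is just a cross-check on the estimator's output.

```python
import math

def rsrp_dbm(tx_dbm, tower_gain_dbi, rx_gain_dbi, path_loss_db, n_subcarriers=600):
    """Estimate RSRP: spread the carrier power evenly across all subcarriers,
    since RSRP is measured on a single 15 kHz resource element."""
    per_re_dbm = tx_dbm - 10 * math.log10(n_subcarriers)  # 600 subcarriers assumes 10 MHz
    return per_re_dbm + tower_gain_dbi + rx_gain_dbi - path_loss_db

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss, as a cross-check on the online estimator."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

print(rsrp_dbm(44, 17, 15, 110))  # ~ -61.8 dBm at the modem
print(fspl_db(4.0, 1900))         # ~110 dB, i.e. 110 dB implies roughly 4 km of clear LOS at 1900 MHz
```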
In this example, this was my band 2 test with my 15 dBi flat panel antennas. It was at the top of the mountain behind my house (for those of you who saw my review post on these antennas) and nearly line-of-sight except for about 3-5 trees.
[Screenshot: Screen Shot 2018-10-03 at 11.16.59 AM.png]
The at!lteinfo command is very useful, as it turns out. Below, you can see I have some interference from intra-frequency neighbor cells. This high up on my mountain, I was picking up another cell tower somewhere in the area (not sure where) that is broadcasting on the same frequency and interfering with the signal from the tower I'm connecting to. That's why you see an RSRQ worse (more negative) than -10.8 dB: the serving cell was fully loaded at the time the command was run, and on top of that I was getting interference from another cell tower.
[Screenshot: at!lteinfo output (Screen Shot 2018-10-03 at 11.17.12 AM.png)]
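For anyone wondering where that -10.8 dB figure comes from, it falls out of how RSRQ is defined: RSRQ = 10*log10(N × RSRP / RSSI), where RSSI is summed over N resource blocks of 12 subcarriers each. Here's a toy calculation, assuming a single transmit antenna port and equal power on every resource element:

```python
import math

def rsrq_db(rsrp, rssi_per_rb, n_rb=50):
    """RSRQ = 10*log10(N * RSRP / RSSI), with RSSI summed over N resource
    blocks. N cancels out when RSSI is expressed per resource block."""
    return 10 * math.log10(n_rb * rsrp / (n_rb * rssi_per_rb))

p_re = 1.0  # per-resource-element power, arbitrary linear units

# Fully loaded cell, no outside interference: all 12 subcarriers in each
# resource block carry power, so RSRQ bottoms out at 10*log10(1/12).
print(rsrq_db(p_re, 12 * p_re))             # -10.8 dB

# A neighbor cell on the same frequency at half the serving cell's power
# raises RSSI and pushes RSRQ past the -10.8 dB floor.
print(rsrq_db(p_re, 12 * p_re + 6 * p_re))  # ~ -12.6 dB
```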
And lastly, I modeled this state in the RSRQ spreadsheet and found it to be fairly close to reality.
[Screenshot: RSRQ spreadsheet model (Screen Shot 2018-10-11 at 1.32.34 PM.png)]
This is with a path loss of 110 dB (for 1900 MHz only), a tower antenna gain of 17 dBi, and a transmit power of 44 dBm (note: in the spreadsheet this was converted to 25 watts).
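That dBm-to-watts conversion is easy to verify, since dBm is just decibels referenced to 1 milliwatt:

```python
def dbm_to_watts(dbm):
    return 10 ** (dbm / 10) / 1000  # dBm is referenced to 1 mW, so divide by 1000 for watts

print(dbm_to_watts(44))  # 25.118..., i.e. ~25 watts, matching the spreadsheet
```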