NB-IoT vs. LTE-M in practice
TL;DR
You might get the same or even better deep-building penetration with Cat-M as you get with Cat-NB.
Intro
The internet has a plethora of good resources about LTE-M (or Cat-M) and NB-IoT (or Cat-NB), or whatever you want to call them. They describe the differences and performance of these two popular IoT LPWAN technologies along with their pros and cons. Unfortunately, most (if not all) of these resources contain the magical word “simulated”. It is very hard to find any real comparison of how these radios perform in real-life IoT use cases, so I decided to give it a try.
Background
We are a startup, which essentially means that we don’t have any money and even less time. Everything needs to be ready yesterday and with minimum effort. In this Cat-M/Cat-NB case it meant that we basically picked Cat-NB as our radio protocol, because the internet and 3GPP say that its penetration inside buildings is better. Five minutes and a good decision made. Or was it?
Now that the device is working and doing what it is supposed to do (measuring air quality), we started to optimize the power consumption. While doing that, we noticed that in our case NB-IoT actually consumes two times more power than Cat-M. Hmm.
This happens because of the protocol stacks we use, encryption, DTLS session resumption, retransmission timeouts and other stuff. The details are another story. For this article, let’s just say that we send too much data to let Cat-NB show its true power. In practice, this means that the typical power consumption curves while communicating with the cloud server look like this.

Yellow is Cat-NB and green is Cat-M. You can see that the energy consumed during this one data transfer burst is roughly 240 µWh vs. 140 µWh in favor of Cat-M. That is a big difference. It could mean something like 8 months versus 12 months without charging in our case.
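Just to put those bursts into perspective, here is the back-of-the-envelope math as a tiny C snippet. The battery capacity, sync interval and sleep floor below are made-up placeholder numbers, not our real configuration, but they happen to land in the same ~8 vs ~12 month ballpark:

/* Back-of-the-envelope battery life from per-burst energy.
 * Everything except the 240/140 uWh figures is a placeholder assumption. */
#include <stdio.h>

#define BATTERY_WH      5.55    /* e.g. 1500 mAh @ 3.7 V */
#define BURSTS_PER_DAY  96.0    /* e.g. one cloud sync every 15 minutes */
#define SLEEP_POWER_UW  50.0    /* average sleep + sensing floor, uW */

static double life_days(double burst_uwh)
{
    double burst_wh = burst_uwh * 1e-6 * BURSTS_PER_DAY;   /* Wh per day */
    double sleep_wh = SLEEP_POWER_UW * 1e-6 * 24.0;        /* Wh per day */
    return BATTERY_WH / (burst_wh + sleep_wh);
}

int main(void)
{
    printf("Cat-NB, 240 uWh/burst: %.0f days\n", life_days(240.0)); /* ~229 */
    printf("Cat-M,  140 uWh/burst: %.0f days\n", life_days(140.0)); /* ~379 */
    return 0;
}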
So the question arose: should/could/can we switch to Cat-M by default without losing any coverage?
Test goal
Find out how much better Cat-NB performs in hard radio conditions compared to Cat-M. The idea is to find the sweet spot where the radio stops working.
Test setup
Connect two identical devices to the same cell tower using the same band (as we know that the 800 MHz band goes through concrete a bit differently than the 2 GHz band). Then move the devices gradually into worse radio conditions and see what happens. Exciting for a nerd.
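For intuition on why the band lock matters: free-space path loss alone grows with frequency, roughly 8 dB between 800 MHz and 2 GHz, and wall penetration loss grows on top of that. A quick sanity-check snippet, nothing to do with the actual test setup:

/* Free-space path loss: FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44.
 * Just to see how much the carrier frequency alone changes the link budget. */
#include <math.h>
#include <stdio.h>

static double fspl_db(double d_km, double f_mhz)
{
    return 20.0 * log10(d_km) + 20.0 * log10(f_mhz) + 32.44;
}

int main(void)
{
    printf("FSPL @ 800 MHz, 1 km:  %.1f dB\n", fspl_db(1.0, 800.0));   /* ~90.5 */
    printf("FSPL @ 2000 MHz, 1 km: %.1f dB\n", fspl_db(1.0, 2000.0));  /* ~98.5 */
    return 0;
}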
It turned out that finding a good test location was actually the biggest challenge. I tried a few basements and all the pits I could find, even an old well. This thing just works everywhere. Annoying. Finally I decided to put the devices into an old aluminium case to disturb the signal a bit and then drop them into a concrete hole underground, inside a building. And yes, you can argue how realistic a use case this is. At least it is not a mathematical simulation.

Checklist:
* Two identical devices, one using Cat-NB and the other using Cat-M
* Lock to the same cell tower
* Lock to the same band (20, 800 MHz)
* A mechanism to see RSRP values and monitor when the connection drops (see the sketch after this list)
* A location with poor network coverage
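For the RSRP bullet, one simple way to get the value out of a modem is the standard 3GPP AT+CESQ command (connection drops can be watched via AT+CEREG status). A rough parsing sketch, with the UART plumbing left out and the dBm conversion being approximate:

/* Parse the <rsrp> field from a standard AT+CESQ response (3GPP TS 27.007).
 * Index 0..97 maps roughly to -140..-44 dBm; 255 means "unknown". */
#include <stdio.h>

static int rsrp_index_to_dbm(int idx)
{
    if (idx < 0 || idx > 97) {
        return 0;   /* 255 or out of range: no usable measurement */
    }
    return idx - 140;
}

/* Example response: "+CESQ: 99,99,255,255,20,38" -> RSRP index 38 (~ -102 dBm) */
static void log_signal(const char *cesq_response)
{
    int rxlev, ber, rscp, ecno, rsrq, rsrp;
    if (sscanf(cesq_response, "+CESQ: %d,%d,%d,%d,%d,%d",
               &rxlev, &ber, &rscp, &ecno, &rsrq, &rsrp) == 6) {
        printf("RSRP: %d dBm\n", rsrp_index_to_dbm(rsrp));
    }
}

int main(void)
{
    log_signal("+CESQ: 99,99,255,255,20,38");
    return 0;
}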
Results
Let’s first look at how to drop a connection in an RF box. Green is always Cat-M and yellow Cat-NB.

First the devices are on a table. Then they are moved into a table-sized RF box and the door is closed little by little. No surprises here: Cat-M RSRP drops much faster and it loses the connection at around -130 dBm, while Cat-NB is still showing a -115 dBm signal level. Cat-NB finally drops too, at -125 dBm. We are good to go for a real-world test.

Next I used scientific instruments called a plastic bag and a rope. I started above an old well, about 1 meter above ground. Then I lowered the devices as far as I could without touching the water. That was about 3 m (or 10 ft) below ground level (the water was very high).
A bit surprisingly, RSRP levels were pretty much on par all the time. The Cat-NB device lost a few packets, but ultimately both still worked.
Then I had to fall back to the aluminium case and the hole in the floor.

The results are a bit disturbing. The signal first drops to around -120 dBm when the case is closed, and as the case is lowered, Cat-NB first loses a few packets and then finally the connection is lost for good. Cat-M still works ~20 cm lower, and then it dies too.
I repeated all the tests and the results remained the same.
Conclusions
It is hard to say anything for sure with this amount of testing. But what I can say is that this is definitely not as straightforward as I thought. I assumed that Cat-NB would outperform Cat-M easily; I just wanted to know by how much. But the results hint that it might be the other way around.
I know that it is very hard to make a fair comparison. There are 200 things in the device and network configuration that could mess things up. But the fact is that in my test location, with the operator that I used, with the devices that I have, this is the real-life result. Cat-M beats Cat-NB.
What I will now do is
#define DEFAULT_SYSTEM_MODE SYSTEM_MODE_CATM
and see what happens statistically for our device connectivity.
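If that turns out to be too optimistic coverage-wise, the next step would be to keep Cat-NB as a fallback rather than dropping it entirely. A minimal sketch of that idea; the modem_* functions and the mode constants below stand in for our own firmware layer and are made up for illustration, not a vendor API:

#include <stdbool.h>

/* Illustrative stand-ins for our own firmware layer, not a vendor API. */
typedef enum { SYSTEM_MODE_CATM, SYSTEM_MODE_CATNB } system_mode_t;
bool modem_set_mode(system_mode_t mode);     /* hypothetical */
bool modem_connect(unsigned timeout_s);      /* hypothetical */

#define DEFAULT_SYSTEM_MODE  SYSTEM_MODE_CATM
#define CONNECT_TIMEOUT_S    300

/* Prefer Cat-M; fall back to Cat-NB only if registration times out. */
static bool connect_with_fallback(void)
{
    modem_set_mode(DEFAULT_SYSTEM_MODE);
    if (modem_connect(CONNECT_TIMEOUT_S)) {
        return true;                         /* Cat-M worked, stay on it */
    }
    modem_set_mode(SYSTEM_MODE_CATNB);       /* last resort */
    return modem_connect(CONNECT_TIMEOUT_S);
}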
Disclaimer
I am a programmer. All I know about radio signals is that they are magical and I don’t understand anything about them. If a cat farts in a room, you might get 10 dB more or less, probably depending on the color of the curtains.