Because technically the steam is dissolved in the water above its saturation point, right? If the gas were (say) CO2 instead of steam, wouldn’t “supersaturated” be the correct term?
If the gas were CO2 instead of steam, CO2 would be the solute and water the solvent. In that case the term supersaturated would make sense, because the solvent would contain more solute than it can hold under normal conditions. But the steam is not dissolved in the water. The microwaved water is unable to form steam in the first place due to a lack of available nucleation sites.
I thought superheating referred to heating water over 100 °C by pressurizing it
That isn’t superheating. The boiling point of a given substance naturally varies with pressure, so liquid water at 200 °C under sufficient pressure isn’t superheated; it’s just hotter than you expect it to be. That technique can, however, be used to superheat something like water: heat it above its atmospheric boiling temperature while pressurized, then lower the pressure without agitating it.
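To put a rough number on “pressurized sufficiently,” here’s a minimal sketch using the Clausius-Clapeyron approximation to estimate water’s boiling point at a given pressure. The constants are standard textbook values, not from the comment above, and the relation assumes a constant enthalpy of vaporization, so treat the result as ballpark only.

```python
import math

# Textbook constants (approximate; assumes L_VAP is independent of temperature)
R = 8.314          # J/(mol*K), gas constant
L_VAP = 40_660.0   # J/mol, enthalpy of vaporization of water near 100 C
T1 = 373.15        # K, boiling point of water at 1 atm
P1 = 101_325.0     # Pa, 1 atm

def boiling_point(pressure_pa: float) -> float:
    """Estimated boiling temperature (K) at the given pressure,
    via the integrated Clausius-Clapeyron relation."""
    inv_t2 = 1.0 / T1 - (R / L_VAP) * math.log(pressure_pa / P1)
    return 1.0 / inv_t2

# At 1 atm this recovers ~100 C; at roughly 15 atm the estimated boiling
# point is near 200 C, so liquid water at 200 C under that pressure is
# simply below its boiling point, not superheated.
print(boiling_point(101_325.0) - 273.15)        # ~100 C
print(boiling_point(15.5 * 101_325.0) - 273.15)
```

The estimate lands within a couple of degrees of tabulated saturation data in this range, which is enough to show the point: “200 °C water” is only superheated if it is above the boiling point *at its current pressure*.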
Explain to me how lab statistics would help. Then explain how you could generate useful statistics from the extremely non-standardized setups of every unique household in the world.
Now that the absurdity of wanting statistics is set aside… microwave-caused superheating of water is a well-studied and well-understood phenomenon. There are things that reduce the likelihood, sure: air bubbles created by modern low-flow taps, general impurity of tap water, and scratches in used containers all provide nucleation sites and make superheating less likely.
All it takes is jostling a new mug so the air bubbles all float out, combined with a particularly clean supply of city water (filtered water is a common culprit), and that thing you’ve been doing for years blows up in your face at 105 °C.