Abstract
Hot temperature extremes have increased substantially in frequency and magnitude over past decades. A widely used approach to quantify this phenomenon is standardizing temperature data relative to the local mean and variability of a reference period. Here we demonstrate that this conventional procedure leads to exaggerated estimates of increasing temperature variability and extremes. For example, with time-invariant simulated Gaussian data, the occurrence of "two-sigma extremes" outside a given 30-year reference period is overestimated by 48.2%, corresponding to an increase from a 2.0% to a 2.9% probability of such events. We derive an analytical correction and show that these artifacts prevail in recent studies. Our analyses lead to a revision of earlier reports: for instance, we show that there is no evidence for a recent increase in normalized temperature variability. In conclusion, we provide an analytical pathway to describe changes in variability and extremes in climate observations and model simulations.

Key Points
- Conventional normalization of spatiotemporal data sets with respect to a reference period induces artifacts
- Normalization-induced artifacts are most severe if variability or extremes are under scrutiny
- The study provides an analytical correction and accurate estimate of variability and extremes
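The normalization artifact can be illustrated with a minimal Monte Carlo sketch (an illustration of the direction of the bias, not a reproduction of the paper's analysis or its exact 48.2% figure; the period lengths and trial count below are arbitrary choices): time-invariant Gaussian series are standardized by the mean and standard deviation estimated from a 30-year reference period, and exceedances of the two-sigma threshold are counted outside that period. Because the reference-period estimates carry sampling error, the empirical out-of-sample exceedance probability comes out above the true Gaussian value of about 2.3% for z > 2.

```python
import numpy as np

rng = np.random.default_rng(0)

REF_YEARS = 30    # reference period used for standardization (as in the text)
OUT_YEARS = 70    # out-of-sample years scored against the reference estimates
N_TRIALS = 20000  # independent Monte Carlo realizations

# Time-invariant Gaussian "temperature" series: no trend, no change in variability
series = rng.standard_normal((N_TRIALS, REF_YEARS + OUT_YEARS))

# Conventional normalization: mean and std estimated from the reference period only
ref = series[:, :REF_YEARS]
mu = ref.mean(axis=1, keepdims=True)
sigma = ref.std(axis=1, ddof=1, keepdims=True)
z = (series[:, REF_YEARS:] - mu) / sigma

# Apparent frequency of "two-sigma extremes" outside the reference period
p_hat = np.mean(z > 2.0)
p_gauss = 0.02275  # true P(Z > 2) for a standard Gaussian

print(f"empirical P(z > 2) outside the reference period: {p_hat:.4f}")
print(f"true Gaussian exceedance probability:            {p_gauss:.4f}")
```

Even though the underlying data are stationary, `p_hat` exceeds `p_gauss`, because the out-of-sample normalized values follow a heavier-tailed, t-like distribution induced by the finite-sample estimates of the reference mean and standard deviation.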