Z-Score and the Empirical Rule: Learn It 3

  • Calculate z-scores to explain the location of data points.
  • Compare observations using z-scores and the Empirical Rule.

A standardized value, or [latex]z[/latex]-score, is a statistical measure that describes how many standard deviations a data point is from the mean of the data set. It is used to standardize scores on different scales to a common scale with a mean of 0 and a standard deviation of 1.

[latex]z=\dfrac{\text{Observed Value}-\text{mean}}{\text{standard deviation}} = \dfrac{x-\mu}{\sigma}[/latex]

where [latex]x=[/latex] the value of the observation, [latex]\mu=[/latex] the population mean, [latex]\sigma=[/latex] the population standard deviation, and [latex]z=[/latex] the standardized value (or [latex]z[/latex]-score).
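The formula can be sketched in a few lines of Python. The runtime values below are illustrative placeholders chosen for this example, not figures from the movie data set.

```python
def z_score(x, mu, sigma):
    """Return how many standard deviations the observation x lies from the mean mu."""
    return (x - mu) / sigma

# Illustrative example: a 100-minute movie, drawn from a population
# with mean runtime 90 minutes and standard deviation 5 minutes.
z = z_score(100, 90, 5)
print(z)  # 2.0 -> this runtime is 2 standard deviations above the mean
```

A positive [latex]z[/latex]-score means the observation lies above the mean; a negative one means it lies below. For instance, `z_score(80, 90, 5)` returns `-2.0`.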

Recall the data set about the movie runtimes for rated G (General Audiences, All Ages Admitted) and rated R (Restricted, Children Under 17 Require Accompanying Parent or Adult Guardian) movies.[1]

Step 1: Select the Single Group tab.

Step 2: Locate the drop-down menu under Enter Data and select From Textbook.

Step 3: Locate the drop-down menu under Dataset and select Movie Runtime (G Rated 1990-2016).

Step 4: Under Choose Type of Plot, select the options to create a Histogram and a Dotplot of runtime (in minutes).
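If you would like to explore a distribution of runtimes outside the applet, the same kind of histogram can be sketched in Python with NumPy. The runtimes below are illustrative placeholder values, not the textbook's G-rated movie data set.

```python
import numpy as np

# Illustrative runtimes in minutes (placeholder values, not the textbook data)
runtimes = [85, 88, 90, 92, 95, 95, 98, 100, 101, 105, 110]

# Bin the runtimes into 5 equal-width bins, as a histogram would
counts, edges = np.histogram(runtimes, bins=5)

# Print a simple text histogram: one row per bin, one * per movie
for count, left, right in zip(counts, edges, edges[1:]):
    print(f"{left:5.1f}-{right:5.1f} | {'*' * count}")
```

Each row of output corresponds to one bar of the histogram the applet draws, so you can check that the shape of the distribution matches.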



  1. What do movie ratings mean? (n.d.). Showbiz Junkies. Retrieved from https://www.showbizjunkies.com/mpaa-ratings/