Harmonic mean

The harmonic mean is a measure of central tendency calculated as the reciprocal of the arithmetic mean of reciprocals, particularly suited to averaging rates and ratios (Kenney J.F., Keeping E.S. 1962, p.54)[1]. When you drive 60 miles at 30 mph and return at 60 mph, your average speed is not 45 mph. It is 40 mph, the harmonic mean. This counterintuitive result trips up students and professionals alike. Of the three Pythagorean means, the harmonic mean never exceeds the other two (the three coincide only when all values are equal), and that property makes it uniquely valuable for specific applications.

Finance professionals use it to average price-to-earnings ratios. Physicists apply it to parallel circuit calculations. Machine learning practitioners rely on it for the F1 score that balances precision and recall. The harmonic mean quietly underpins more calculations than most people realize.

Mathematical definition

The harmonic mean of n positive numbers x₁, x₂, ..., xₙ is:

HM = n / (1/x₁ + 1/x₂ + ... + 1/xₙ)

In words: the harmonic mean is the number of observations divided by the sum of their reciprocals. For two values a and b, the formula simplifies to:

HM = 2ab / (a + b)

This two-value formula appears frequently in physics and engineering contexts[2].
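These formulas translate directly into code. A minimal Python sketch (the function name harmonic_mean is illustrative; Python's standard library also provides statistics.harmonic_mean):

  def harmonic_mean(values):
      """Harmonic mean of positive numbers: n divided by the sum of reciprocals."""
      values = list(values)
      if not values or any(v <= 0 for v in values):
          raise ValueError("harmonic mean requires positive values")
      return len(values) / sum(1.0 / v for v in values)

  # Two-value shortcut: HM = 2ab / (a + b)
  print(harmonic_mean([30, 60]))   # 40.0 -- the round-trip speed example above
  print(2 * 30 * 60 / (30 + 60))   # 40.0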

Properties

Several mathematical properties distinguish the harmonic mean:

Inequality relationship. For any set of positive numbers, HM ≤ GM ≤ AM, where GM represents the geometric mean and AM the arithmetic mean. Equality holds only when all values are identical. This inequality (known as the AM-GM-HM inequality) provides useful bounds in optimization problems.

Sensitivity to small values. The harmonic mean gives greater weight to smaller numbers. A single small value dramatically pulls down the result. This sensitivity makes it inappropriate for data with zeros or near-zero values—the formula breaks down completely when any value equals zero.

Reciprocal relationship. The harmonic mean of a dataset equals the reciprocal of the arithmetic mean of the reciprocals. This algebraic symmetry simplifies certain derivations.

Geometric interpretation. Given two numbers a and b, their harmonic mean appears in classical constructions: in a trapezoid, the segment parallel to the two parallel sides that passes through the intersection of the diagonals has length equal to the harmonic mean of those sides. Ancient Greek mathematicians studied this relationship extensively[3].
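The inequality and the reciprocal relationship above are easy to verify numerically. A short sketch, assuming Python 3.8+ for statistics.geometric_mean and using an arbitrary sample:

  from statistics import mean, geometric_mean, harmonic_mean

  data = [2.0, 3.0, 9.0, 12.0]

  am = mean(data)             # arithmetic mean
  gm = geometric_mean(data)   # geometric mean
  hm = harmonic_mean(data)    # harmonic mean

  assert hm <= gm <= am       # AM-GM-HM inequality
  # Reciprocal relationship: HM is the reciprocal of the arithmetic mean of reciprocals
  assert abs(hm - 1 / mean(1 / x for x in data)) < 1e-9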

Comparison with other means

Understanding when to use which mean matters:

Arithmetic mean. Sum divided by count. Appropriate for additive quantities—heights, weights, temperatures. When combining fixed quantities, use arithmetic mean.

Geometric mean. The nth root of the product of n numbers. Appropriate for multiplicative relationships—growth rates, ratios, percentages. When quantities compound, use geometric mean.

Harmonic mean. Reciprocal of arithmetic mean of reciprocals. Appropriate for rates and ratios with common numerators or denominators. When averaging speeds over fixed distances, use harmonic mean.

The choice isn't arbitrary. Using the wrong mean produces systematically biased results. A portfolio manager averaging P/E ratios with arithmetic mean overvalues the portfolio. A scientist averaging rates with arithmetic mean misrepresents the data[4].

Applications in finance

Financial analysts employ harmonic means in several contexts:

Price-earnings ratios

When averaging P/E ratios across stocks, the harmonic mean provides accurate portfolio valuation. Why? Consider two stocks:

  • Stock A: Price $100, Earnings $10, P/E = 10
  • Stock B: Price $100, Earnings $4, P/E = 25

Arithmetic mean P/E: (10 + 25) / 2 = 17.5
Portfolio P/E: $200 total price / $14 total earnings = 14.3

The harmonic mean (2 × 10 × 25) / (10 + 25) = 14.3 matches the correct portfolio P/E. The arithmetic mean systematically overstates valuation[5].
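A quick check of this arithmetic in Python, using the two hypothetical stocks above:

  # Hypothetical two-stock portfolio from the example above
  prices = [100, 100]
  earnings = [10, 4]
  pe = [p / e for p, e in zip(prices, earnings)]    # [10.0, 25.0]

  arithmetic_pe = sum(pe) / len(pe)                 # 17.5
  harmonic_pe = len(pe) / sum(1 / r for r in pe)    # 14.29
  portfolio_pe = sum(prices) / sum(earnings)        # 14.29

  print(arithmetic_pe, harmonic_pe, portfolio_pe)
  # harmonic_pe matches portfolio_pe here because equal dollar amounts sit in each
  # stock; for unequal positions, use the weighted harmonic mean described below.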

Dollar-cost averaging

Investors buying fixed dollar amounts at varying prices achieve harmonic mean cost. Investing $1,000 monthly in a fund trading at $50, then $40, then $50:

  • Month 1: $1,000 buys 20 shares
  • Month 2: $1,000 buys 25 shares
  • Month 3: $1,000 buys 20 shares

Total: 65 shares for $3,000 = $46.15 average cost

Harmonic mean of prices: 3 / (1/50 + 1/40 + 1/50) = $46.15

The arithmetic mean ($46.67) would overstate the actual cost basis.
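The same calculation in Python, with the illustrative prices from the example:

  monthly_investment = 1_000
  prices = [50, 40, 50]                                 # price paid each month

  shares = [monthly_investment / p for p in prices]     # 20, 25, 20 shares
  average_cost = (monthly_investment * len(prices)) / sum(shares)

  harmonic_price = len(prices) / sum(1 / p for p in prices)
  arithmetic_price = sum(prices) / len(prices)

  print(round(average_cost, 2), round(harmonic_price, 2), round(arithmetic_price, 2))
  # 46.15 46.15 46.67 -- the harmonic mean matches the true cost basis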

Other financial ratios

Price-to-book ratios, price-to-sales ratios, and similar metrics benefit from harmonic averaging when combining multiple companies or time periods.

Applications in science and engineering

Technical fields rely heavily on harmonic means:

Average speed

The textbook example. Driving 120 miles at 40 mph takes 3 hours. Returning at 60 mph takes 2 hours. Total: 240 miles in 5 hours = 48 mph average speed.

Harmonic mean: 2 / (1/40 + 1/60) = 48 mph

The arithmetic mean (50 mph) would incorrectly suggest the trip took less time than it actually did[6].
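The round trip can be verified the same way; a minimal sketch:

  distance = 120                        # miles each way
  speeds = [40, 60]                     # mph out and back

  total_time = sum(distance / s for s in speeds)          # 3 h + 2 h = 5 h
  average_speed = (distance * len(speeds)) / total_time   # 240 miles / 5 h
  harmonic_speed = len(speeds) / sum(1 / s for s in speeds)

  print(average_speed, harmonic_speed)  # 48.0 48.0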

Parallel resistance

When resistors R₁ and R₂ connect in parallel, total resistance equals:

R_total = (R₁ × R₂) / (R₁ + R₂)

This is half the harmonic mean of the two resistances. Electrical engineers apply this constantly in circuit design.

Capacitors in series

The mirror image of parallel resistors: capacitors combine reciprocally when connected in series, so the total series capacitance of C₁ and C₂ is (C₁ × C₂) / (C₁ + C₂). Total capacitance follows the same formula structure.
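Both combinations reduce to the same reciprocal sum, so one helper covers them. A sketch with made-up component values and an illustrative function name:

  def reciprocal_combination(values):
      """Combine resistors in parallel or capacitors in series: 1/total = sum of 1/value."""
      return 1 / sum(1 / v for v in values)

  r_parallel = reciprocal_combination([100, 300])     # two resistors in parallel -> 75 ohms
  c_series = reciprocal_combination([10e-6, 40e-6])   # two capacitors in series -> 8 microfarads

  # For two components this is half their harmonic mean
  hm = 2 * 100 * 300 / (100 + 300)     # 150
  print(r_parallel, hm / 2)            # 75.0 75.0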

Lens equations

The thin lens equation (1/f = 1/dₒ + 1/dᵢ) embodies the same relationship: the focal length f equals half the harmonic mean of the object distance dₒ and the image distance dᵢ.

Applications in machine learning

Modern data science has elevated harmonic mean usage:

F1 score

The F1 score balances precision (P) and recall (R):

F1 = 2 × (P × R) / (P + R)

This is the harmonic mean of precision and recall. It penalizes classifiers that sacrifice one metric to inflate the other. A model with 99% precision but 1% recall achieves F1 = 1.98%—correctly identifying it as useless despite high precision[7].
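A direct implementation makes the penalty visible; the precision and recall figures below are the hypothetical ones just mentioned:

  def f1(precision, recall):
      """Harmonic mean of precision and recall; 0.0 when both are zero."""
      if precision + recall == 0:
          return 0.0
      return 2 * precision * recall / (precision + recall)

  print(f1(0.99, 0.01))   # 0.0198 -> 1.98%
  print(f1(0.80, 0.80))   # 0.80   -> balanced metrics keep their value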

Weighted F1

Multi-class classification problems report several harmonic mean-based averages. The weighted F1 weights class-specific F1 scores by class frequency; the macro F1 averages class-specific F1 scores equally; the micro F1 pools predictions across classes before computing a single score.
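Where scikit-learn is available, its f1_score function exposes these averaging schemes directly; the labels below are a toy example:

  from sklearn.metrics import f1_score

  y_true = [0, 0, 0, 1, 1, 2, 2, 2, 2, 2]
  y_pred = [0, 0, 1, 1, 2, 2, 2, 2, 0, 2]

  print(f1_score(y_true, y_pred, average="macro"))     # unweighted mean of per-class F1
  print(f1_score(y_true, y_pred, average="micro"))     # pools all predictions first
  print(f1_score(y_true, y_pred, average="weighted"))  # weights per-class F1 by class support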

Imbalanced datasets

Harmonic mean-based metrics help evaluate models on imbalanced data where accuracy misleads. Fraud detection with 0.1% positive rate would achieve 99.9% accuracy by predicting "no fraud" always—but F1 would correctly show poor performance.

Weighted harmonic mean

When observations carry different weights:

WHM = Σwᵢ / Σ(wᵢ/xᵢ)

Portfolio managers weighting stocks by market capitalization use this form. The weights represent the relative importance of each observation in the final average[8].
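A sketch of the weighted form, using hypothetical P/E ratios weighted by market capitalization:

  def weighted_harmonic_mean(values, weights):
      """WHM = sum(w) / sum(w / x) for positive values and non-negative weights."""
      if len(values) != len(weights):
          raise ValueError("values and weights must have the same length")
      return sum(weights) / sum(w / x for w, x in zip(weights, values))

  pe_ratios = [10, 25, 18]             # per-stock P/E (hypothetical)
  market_caps = [50e9, 20e9, 30e9]     # weights: market value of each position

  print(weighted_harmonic_mean(pe_ratios, market_caps))   # about 13.39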

Computational considerations

Practical implementation requires care:

Zero values. The harmonic mean is undefined when any value equals zero. Data preprocessing must handle this limitation—often by substituting small positive values or excluding zeros.

Negative values. Strictly defined only for positive numbers. Some extensions exist for signed data, but they lack the intuitive interpretation of the positive case.

Numerical stability. Summing reciprocals of very large and very small numbers can cause floating-point precision issues. Robust implementations use logarithmic transformations when necessary.

Outlier sensitivity. Where the arithmetic mean is sensitive to large outliers, the harmonic mean is sensitive to small values. An unusually low observation pulls the harmonic mean down dramatically.
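A defensive implementation can make these limitations explicit. The sketch below treats zeros according to a caller-chosen policy and sums reciprocals in log space as one possible stabilization; none of this is a standard convention:

  import math

  def safe_harmonic_mean(values, zero_policy="error"):
      """Harmonic mean with explicit handling of zeros and negatives.

      zero_policy: "error" raises; "zero" returns 0.0 (the limiting value).
      """
      values = list(values)
      if not values:
          raise ValueError("harmonic mean of an empty sequence is undefined")
      if any(v < 0 for v in values):
          raise ValueError("harmonic mean is defined only for non-negative values")
      if any(v == 0 for v in values):
          if zero_policy == "zero":
              return 0.0
          raise ValueError("harmonic mean is undefined when a value is zero")
      # Sum the reciprocals in log space to reduce precision loss when
      # values span many orders of magnitude
      log_recips = [-math.log(v) for v in values]
      m = max(log_recips)
      log_sum = m + math.log(sum(math.exp(x - m) for x in log_recips))
      return math.exp(math.log(len(values)) - log_sum)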

Historical context

The harmonic mean traces to antiquity:

Ancient Greece. Pythagoras and his followers studied musical harmonies that led to harmonic proportion concepts. The mean takes its name from these harmonic intervals—the relationship between string lengths producing consonant tones.

Mathematical development. Greek mathematicians treated the harmonic mean as one of the three classical (Pythagorean) means alongside the arithmetic and geometric means. Renaissance mathematicians later formalized the algebraic relationships.

Modern applications. Only in the 20th century did harmonic mean applications explode across physics, finance, and computing. The rise of statistical computing made calculation trivial.



References

  1. Kenney J.F., Keeping E.S. (1962), Mathematics of Statistics, p.54
  2. Ferger W.F. (1931), The Nature and Use of the Harmonic Mean, pp.36-40
  3. Kenney J.F., Keeping E.S. (1962), Mathematics of Statistics, pp.54-58
  4. Bodie Z., Kane A., Marcus A.J. (2018), Investments, pp.156-178
  5. Bodie Z., Kane A., Marcus A.J. (2018), Investments, pp.456-478
  6. Ferger W.F. (1931), The Nature and Use of the Harmonic Mean, pp.42-45
  7. Powers D. (2011), Evaluation Metrics, pp.37-63
  8. Kenney J.F., Keeping E.S. (1962), Mathematics of Statistics, pp.59-62

Author: Sławomir Wawak