This post shows how binary random variables can be defined by their mean, variance, and skewness. I use this fact to explain why variance does not (always) measure “riskiness.”
Suppose I’m defining a random variable $X$. It takes value $a$ or $b < a$, with $\Pr(X = a) = p$. I want $X$ to have mean $\mu$, variance $\sigma^2$, and skewness coefficient $\gamma = \mathrm{E}[(X - \mu)^3] / \sigma^3$. The target parameters $(\mu, \sigma, \gamma)$ uniquely determine $(a, b, p)$ via
$$p = \frac{1}{2}\left(1 - \frac{\gamma}{\sqrt{4 + \gamma^2}}\right), \quad a = \mu + \sigma\sqrt{\frac{1 - p}{p}}, \quad \text{and} \quad b = \mu - \sigma\sqrt{\frac{p}{1 - p}}.$$
For example, if I want $X$ to be symmetric (i.e., to have $\gamma = 0$) then I have to choose $(a, b, p) = (\mu + \sigma, \mu - \sigma, 1/2)$. Increasing the target skewness $\gamma$ makes the upside larger but less likely, and the downside smaller but more likely: as $\gamma$ grows, $p$ falls toward zero, $a$ rises without bound, and $b$ rises toward $\mu$.
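As a concrete check, here’s a minimal Python sketch of this mapping (the function name `binary_rv` is mine, not from the post):

```python
from math import sqrt

def binary_rv(mu, sigma, gamma):
    """Return (a, b, p) so that a binary random variable equal to a with
    probability p and b otherwise has mean mu, standard deviation sigma,
    and skewness coefficient gamma."""
    p = 0.5 * (1 - gamma / sqrt(4 + gamma ** 2))
    a = mu + sigma * sqrt((1 - p) / p)
    b = mu - sigma * sqrt(p / (1 - p))
    return a, b, p

# Symmetric case: gamma = 0 gives p = 1/2 and payoffs mu ± sigma
print(binary_rv(0, 1, 0))  # → (1.0, -1.0, 0.5)
```

Setting $\gamma = 0$ recovers the symmetric case above; pushing $\gamma$ up shrinks $p$ and stretches $a$ away from $\mu$.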
This mapping between $(\mu, \sigma, \gamma)$ and $(a, b, p)$ is useful for generating examples of “risky” gambles. Intuition suggests that a gamble is less risky if its payoffs have lower variance. But Rothschild and Stiglitz (1970) define a gamble $X$ to be less risky than gamble $Y$ if every risk averse decision-maker (DM) prefers $X$ to $Y$. These two definitions of “risky” agree when
- payoffs are normally distributed, or
- DMs have quadratic utility functions.
Under those conditions, DMs’ expected utility depends only on the payoffs’ mean and variance. But if neither condition holds then DMs also care about payoffs’ skewness. We can demonstrate this using binary gambles. Consider these three:
- Gamble $A$’s payoffs have mean 10, variance 36, and skewness 0;
- Gamble $B$’s payoffs have mean 10, variance 144, and skewness 5;
- Gamble $C$’s payoffs have mean 10, variance 9, and skewness $-3$.
The means are the same but the distributions are different. Gamble $g \in \{A, B, C\}$ gives me a random payoff $X_g$, which equals $a_g$ with probability $p_g$ and $b_g$ otherwise. We can compute the $(a_g, b_g, p_g)$ using the target parameters and the formulas above:
Gamble | $a_g$ | $b_g$ | $p_g$
---|---|---|---
$A$ | 16.00 | 4.00 | 0.50
$B$ | 72.31 | 7.69 | 0.04
$C$ | 10.91 | 0.09 | 0.92
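The table rows can be reproduced with a short script. This is a sketch assuming target parameters $(\mu, \sigma, \gamma)$ of $(10, 6, 0)$, $(10, 12, 5)$, and $(10, 3, -3)$, which match the tabulated values after rounding (the helper name `binary_rv` is mine):

```python
from math import sqrt

def binary_rv(mu, sigma, gamma):
    # Invert (mean, sd, skewness) into (a, b, p) for a binary variable
    p = 0.5 * (1 - gamma / sqrt(4 + gamma ** 2))
    a = mu + sigma * sqrt((1 - p) / p)
    b = mu - sigma * sqrt(p / (1 - p))
    return a, b, p

# Assumed targets: (mean, sd, skewness) for each gamble
targets = {"A": (10, 6, 0), "B": (10, 12, 5), "C": (10, 3, -3)}
for g, params in targets.items():
    a, b, p = binary_rv(*params)
    print(f"{g}: a = {a:.2f}, b = {b:.2f}, p = {p:.2f}")
# → A: a = 16.00, b = 4.00, p = 0.50
# → B: a = 72.31, b = 7.69, p = 0.04
# → C: a = 10.91, b = 0.09, p = 0.92
```

Note that the tabulated $p_g$ are rounded; later expected-utility calculations should use the unrounded values.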
Gamble $A$ offers a symmetric payoff: its upside and downside are equally large and equally likely. Gamble $B$ offers a positively skewed payoff: a large but unlikely upside, and a small but likely downside. Gamble $C$ offers a negatively skewed payoff: a small but likely upside, and a large but unlikely downside.
These upsides and downsides affect my preferences over gambles. Suppose I get utility $u(x) = \ln(x)$ from receiving payoff $x$. Then gamble $B$ gives me expected utility $\mathrm{E}[u(X_B)] \approx 2.12$, while $A$ gives me $\mathrm{E}[u(X_A)] \approx 2.08$ and $C$ gives me $\mathrm{E}[u(X_C)] \approx 1.99$. So I prefer gamble $B$ to $A$, even though $B$’s payoffs have four times the variance of $A$’s. I also prefer $B$ to $C$, even though $B$’s payoffs have sixteen times the variance of $C$’s. How can I be risk averse (that is, have a concave utility function) but prefer gambles with higher variance? The answer is that I also care about skewness: I prefer gambles with large upsides and small downsides. These “sides” of risk are not captured by variance.
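These expected utilities can be checked numerically. The sketch below assumes log utility $u(x) = \ln(x)$ and the target parameters $(10, 6, 0)$, $(10, 12, 5)$, $(10, 3, -3)$ for gambles $A$, $B$, $C$ (helper names are mine):

```python
from math import log, sqrt

def binary_rv(mu, sigma, gamma):
    # Invert (mean, sd, skewness) into (a, b, p) for a binary variable
    p = 0.5 * (1 - gamma / sqrt(4 + gamma ** 2))
    a = mu + sigma * sqrt((1 - p) / p)
    b = mu - sigma * sqrt(p / (1 - p))
    return a, b, p

def expected_utility(u, mu, sigma, gamma):
    # E[u(X)] for the binary gamble with the given target moments
    a, b, p = binary_rv(mu, sigma, gamma)
    return p * u(a) + (1 - p) * u(b)

targets = {"A": (10, 6, 0), "B": (10, 12, 5), "C": (10, 3, -3)}
for g, params in targets.items():
    print(f"E[u(X_{g})] = {expected_utility(log, *params):.2f}")
# → E[u(X_A)] = 2.08
# → E[u(X_B)] = 2.12
# → E[u(X_C)] = 1.99
```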
So is gamble $B$ “riskier” than gambles $A$ and $C$? Rothschild and Stiglitz wouldn’t say so. To see why, suppose my friend has utility function $v(x) = \sqrt{x}$. Then gamble $B$ gives him expected utility $\mathrm{E}[v(X_B)] \approx 2.98$, while $A$ gives him $\mathrm{E}[v(X_A)] = 3$ and $C$ gives him $\mathrm{E}[v(X_C)] \approx 3.05$. My friend and I have opposite preferences: he prefers $C$ to $A$ to $B$, whereas I prefer $B$ to $A$ to $C$. But we’re both risk averse: our utility functions are both concave! Thus, it isn’t true that every risk-averse decision-maker prefers $A$ or $C$ to $B$. Different risk-averse DMs have different preference rankings. This makes the three gambles incomparable under Rothschild and Stiglitz’s definition of “risky.”
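The preference reversal can be verified by ranking the gambles under each utility function. A sketch assuming log utility for me and square-root utility for my friend, with the target parameters as above (helper names are mine):

```python
from math import log, sqrt

def binary_rv(mu, sigma, gamma):
    # Invert (mean, sd, skewness) into (a, b, p) for a binary variable
    p = 0.5 * (1 - gamma / sqrt(4 + gamma ** 2))
    a = mu + sigma * sqrt((1 - p) / p)
    b = mu - sigma * sqrt(p / (1 - p))
    return a, b, p

def expected_utility(u, mu, sigma, gamma):
    a, b, p = binary_rv(mu, sigma, gamma)
    return p * u(a) + (1 - p) * u(b)

targets = {"A": (10, 6, 0), "B": (10, 12, 5), "C": (10, 3, -3)}

# Rank the gambles, best first, under each concave utility function
for name, u in [("log", log), ("sqrt", sqrt)]:
    ranking = sorted(targets, key=lambda g: -expected_utility(u, *targets[g]))
    print(name, " > ".join(ranking))
# → log B > A > C
# → sqrt C > A > B
```

Both utility functions are concave, yet they rank the three gambles in exactly opposite orders.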