The Kelly Criterion

I have an open secret I’d like to share: I love gambling. It’s one of my favorite pastimes. If you don’t like gambling that’s fine. We’ll probably never be best friends, but we can get along. Now if you’re the type of person who will judge people that enjoy gambling, that’s where we’re going to start running into problems. How would you like it if I judged you for loving Judd Apatow movies? You’d probably think I’m a jerk. Well bug off, this post isn’t for you!

So there are a couple things you should know about me. I’m a mathematics enthusiast (obviously), but I was also born and raised in Vegas (I lived there for the first 27 years of my life). I have also worked in the casino gaming industry as a game designer and mathematician. Let me tell you something: You CAN’T win; not in the long run at least. I’m sure most of you know this, but there is still a small minority of people who think they have it figured out; some system (martingale betting being the most common mistake among less savvy gamblers). So if I can’t win then why do I enjoy it so much? Well, maybe I’m just an adrenaline junkie. It’s a roller-coaster ride. I don’t even know if I enjoy the money so much as the high of beating the odds against a rigged system. And I’m happy to pay a (reasonable) premium for that thrill.

Now don’t get me wrong, some people can beat the system. See Ed Thorp or Jeff Ma, who applied betting systems to beat blackjack. Or Haralabob Voulgaris, who has made a very nice living off betting on NBA games. Then there’s the endless number of poker pros such as Phil Ivey and Daniel Negreanu–I’m sure most of them will tell you that poker isn’t gambling, and I’m inclined to agree, but we still associate poker with the casino.

No matter what kind of gambling you’re into, you must at least be familiar with one formula: the Kelly Criterion. And I am using gambling in the loosest sense of the word. The Kelly Criterion can actually be applied to any quantifiable stake where your return depends on an uncertain outcome. This includes poker, the stock market, or evaluating a job offer!

The formula is given by:

$f = \frac{p(b + 1) - 1}{b} = p - \frac{1-p}{b}$

Let $p$ be the probability of winning a bet with payoff odds of $b$ (i.e. for every dollar you place on the bet, you win back $b$ dollars). Then $f$ is the fraction of your bankroll that you should place on that bet in order to maximize your long-run returns. Note: A negative value implies you should place your stake on the other side of that bet.
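As a quick sanity check, the formula is a one-liner in code (a sketch; the 60% even-money example is my own made-up number):

```python
def kelly_fraction(p: float, b: float) -> float:
    """Fraction of bankroll to wager, given win probability p and payoff odds b-to-1."""
    return p - (1 - p) / b

# A 60% chance to win an even-money bet (b = 1):
print(kelly_fraction(0.6, 1.0))  # ≈ 0.2, i.e. bet about 20% of your bankroll

# Flip the probability and the sign flips too: take the other side of the bet.
print(kelly_fraction(0.4, 1.0))  # ≈ -0.2
```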

The first thing I noticed about this equation was that it seemed way too conservative. For example, take the limit as $b\rightarrow\infty$. We get

$\lim_{b\rightarrow\infty} f = p$

If we set $p=\frac{1}{2}$ we get $f\rightarrow\frac{1}{2}$. So no matter how large your payoff is for a 50-50 outcome, you should never risk more than half of your current bankroll. This flies in the face of a cursory examination of expected value. Say we get paid 100 to 1 on a coin flip. The expected return for any dollar we place into this wager is $\frac{1}{2}(\$100) + \frac{1}{2}(-\$1) = \$49.50$, so shouldn’t we place as much money as possible into this bet to maximize our returns? Absolutely not! And the reason is risk of ruin.

You don’t want to be put in a position where you can lose your entire bankroll, even with a near infinite payoff. After repeated trials you’re still eventually going to lose everything, and you’re going to have to start again from zero.
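To make risk of ruin concrete, here’s a small simulation (a sketch; the 100-to-1 coin flip, the flip count, and the fixed seed are my own illustrative choices):

```python
import random

def simulate(fraction, p=0.5, b=100.0, flips=50, bankroll=1.0, seed=0):
    """Bet `fraction` of the current bankroll on each of `flips` wagers
    that win with probability p and pay b-to-1. Returns the final bankroll."""
    rng = random.Random(seed)
    for _ in range(flips):
        stake = fraction * bankroll
        if rng.random() < p:
            bankroll += stake * b
        else:
            bankroll -= stake
    return bankroll

print(simulate(1.0))    # betting everything: the first loss wipes you out for good
print(simulate(0.495))  # the Kelly fraction survives losses and compounds instead
```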

So clearly any fraction of your bankroll less than 100% will avoid risk of ruin. Why not bet 90% of our bankroll on a positive EV bet? Well, Kelly showed that this fraction maximizes your long-run rate of return (specifically, the expected logarithm of your bankroll). I have to admit it still feels overly conservative, but that’s one of the beautiful things about math: It doesn’t care how you feel. You start with your assumptions, you go through the motions, and you observe the results.
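You can check this numerically. Staking a fixed fraction $f$ multiplies your bankroll by $1+fb$ on a win and $1-f$ on a loss, so the long-run growth rate per bet is $p\ln(1+fb) + (1-p)\ln(1-f)$. A brute-force scan (my own sketch, reusing the 100-to-1 coin flip from above) peaks right at the Kelly fraction, and 90% really does grow slower:

```python
import math

def log_growth(f, p, b):
    """Expected log-growth per bet when staking fraction f of the bankroll."""
    return p * math.log(1 + f * b) + (1 - p) * math.log(1 - f)

p, b = 0.5, 100.0
kelly = p - (1 - p) / b  # 0.495

# Scan fractions from 0 to 0.999 in steps of 0.001.
best = max((f / 1000 for f in range(1000)), key=lambda f: log_growth(f, p, b))
print(best)                                             # 0.495, the Kelly fraction
print(log_growth(kelly, p, b) > log_growth(0.9, p, b))  # True: 90% grows slower
```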

One side effect of the Kelly Criterion is that it also makes a great case for diversification. Don’t put your entire bankroll on any single investment! No matter how great the return may look, if there’s even a 1% chance it could go bust you have to put some of your money somewhere else!

So let me ask a question: Can you think of any exceptions to using the Kelly Criterion? When might you want to be more or less aggressive? No right or wrong answer here. Just curious what you can come up with.

Net Present Value

Honestly, this equation doesn’t tickle me like others do. It’s practical and it’s good to know, but it’s a little too cold-hearted for my tastes. There’s just no love in it. 😦 However, I still really want to talk about it as a lead-up to my next post, which will be about the Bellman Equation.

Before we dive deeper into the Bellman Equation let’s look at how you would evaluate some investment $s$, where the best available rate of return on an alternative investment is $i$ (so each dollar grows to $1+i$ per period):

$NPV(s) = \sum_{t=0}^{\infty} \frac{R_t}{(1+i) ^{t}}$

$NPV(s)$ is the present value of some investment $s$, where $R_t$ is the return that investment pays at time $t$.

What this is trying to tell us is that we shouldn’t just look at the nominal returns of some asset, but we should discount any future returns by how far out in the future we receive them. Think of it this way: A dollar today is more valuable than a dollar tomorrow. Why? We could invest that dollar right now and receive some (small) return. Or the dollar might not be as valuable tomorrow because of inflation. Or we could just straight up get hit by a comet by the time we get to enjoy the fruits of our newfound fortune! That’s what the discount factor $\frac{1}{1+i}$ is trying to capture: Our value of the present over the value of the future.
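In code, the discounting is a one-liner (a sketch; the $100-per-year cash flows and the 5% alternative rate are made-up numbers):

```python
def npv(returns, i):
    """Present value of a stream of returns R_0, R_1, ... discounted at rate i."""
    return sum(r / (1 + i) ** t for t, r in enumerate(returns))

# $100 per year for 3 years, with a 5% best-alternative return:
# 100 + 100/1.05 + 100/1.05**2
print(npv([100, 100, 100], 0.05))  # ≈ 285.94, less than the nominal $300
```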

Now I’m going to ask you a question to help that discount rate sink in: How much would you pay for a million bucks in your bank account right now? That sounds kind of stupid, right? You’d probably pay any value up to a million bucks. How about a year down the road? Or 50 years down the road? Try to derive a value for $\frac{1}{1+i}$ based on how much you would pay to receive that money in the future.
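For instance, your answer pins down your personal discount factor directly (the $950,000 indifference price here is just a made-up answer to that question):

```python
# Suppose the most you'd pay today for $1,000,000 a year from now is $950,000.
price, future = 950_000, 1_000_000

discount = price / future  # your implied 1/(1+i)
i = future / price - 1     # your implied rate i

print(discount)  # ≈ 0.95
print(i)         # ≈ 0.0526, i.e. you discount next year's money at about 5.3%
```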