Gambling on Growth
It's around 5:30PM and I'm riding the Little Compton bus, about halfway home. I had to review Joel's presentation slides on the ride in this morning, so I didn't get around to thinking any more about models for spiders. Joel's presentation went fine, and the faculty agreed that his proposal was suitable for a thesis if he successfully completes the experiments he outlined in his talk. Officially, today was the last day of Artemis, but tomorrow all of the students and their families will be coming to the department for a celebration of the students' achievements. I probably won't get back to spider models until Sunday at the earliest.
I was off on another tangent today, prompted by scanning the generally bad economic news and talking with my sister on the phone about her various investment plans. I was struck by how difficult most people find basic investment decisions: Should I rent a house or buy one? Should I lease a car, buy a new car or buy a used one? If I buy, should I get a loan? This got me thinking about how humans understand physical processes, in particular random processes and various kinds of growth processes. When mathematicians talk about processes, they are generally concerned with the evolution of discrete and continuous variables over time, for example, the value of a stock, the amount of money in your checking account, or the average daily humidity or temperature. A random process is simply a process that is subject to randomness or unpredictability of the sort you encounter when you flip a coin and are interested in whether it comes up heads or tails. A growth process concerns a variable that increases (it might also decrease) over time according to some mathematical rule.
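To make the distinction concrete, here is a minimal sketch of both kinds of process in Python (I use Python here rather than Mathematica; the names and the ten-step horizon are just for illustration):

```python
import random

random.seed(0)  # fix the seed so the run is reproducible

# A random process: the count of heads evolves unpredictably over time.
heads_so_far = 0
for _ in range(10):
    if random.random() < 0.5:  # a fair coin flip
        heads_so_far += 1

# A growth process: a balance evolves according to a fixed mathematical rule.
balance = 1.0
for _ in range(10):
    balance *= 1.10  # grow by 10 percent per step

print(heads_so_far)        # some count between 0 and 10
print(round(balance, 4))   # 2.5937, i.e., 1.1^10
```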
Humans are notoriously bad at reasoning about some kinds of random processes and about statistical information more generally (see [Paulos, 1995] for some interesting accounts of how people interpret statistical information and reason about probabilistic phenomena). Most people (myself included on more occasions than I like to admit, even though I "know better") fall victim to the gambler's fallacy. Here are some examples: "I got a parking ticket just yesterday; I couldn't possibly get another one today" or "I've been feeding this slot machine quarters for over an hour now; it's due for a jackpot". The error is to assume that the governing random processes have memory, so that, for example, if I flip a coin ten times and it comes up heads each time, then the coin somehow knows this and has a pent-up urge to rectify the imbalance (after all, so the fallacious reasoning goes, a fair coin is supposed to come up heads approximately the same number of times that it comes up tails).
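The fallacy is easy to test empirically. Here's a quick Python simulation (my own illustration; the run length and flip count are arbitrary) estimating the probability of heads on the flip immediately following a run of five heads; it comes out near one half, not lower:

```python
import random

random.seed(1)

# Estimate P(heads | the previous 5 flips were all heads) for a fair coin.
run_length = 5
followups = []   # outcomes of flips that immediately follow a 5-heads run
streak = 0       # length of the current run of heads
for _ in range(1_000_000):
    flip = random.random() < 0.5   # True means heads
    if streak >= run_length:
        followups.append(flip)
    streak = streak + 1 if flip else 0

p = sum(followups) / len(followups)
print(round(p, 2))   # close to 0.5: the coin has no memory
```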
You'd think that errors in our judgments concerning statistical information and probabilistic phenomena would put us at a disadvantage when it comes to survival. However, there is a good argument that our basic reasoning skills, while sometimes flawed from the standpoint of mathematical correctness, generally come to the right conclusions, at least as regards the sorts of situations that we have found ourselves in throughout most of our evolutionary history (see [Pinker, 1997] for some insights into how people reason about logic and probability). There are, of course, circumstances that twenty-first-century humans find themselves in that warrant a more careful and mathematically correct approach to probability and statistics. But I wasn't thinking primarily about probability and statistics this morning; I was thinking about how people think about compound interest, inflation and the money that they have accumulating interest in the bank or dribbling away if they carry a negative balance on their credit card.
Investors are repeatedly reminded by brokerage firms and investment houses about the power of compounding and exponentiation (our discussion about analyzing algorithms on July 14, 2002 concerned how the computations required by some algorithms scale exponentially with the size of the input to the algorithm). On the one hand, inflationary factors can reduce the buying power of your money to nearly nothing in a very short time. This is supposed to scare you. To counteract this tendency, you are advised to invest your savings in financial instruments whose return is, on average, greater than inflation. Some brokerage houses would have you believe that the only investments having this property are stocks and bonds. In an article that I read this morning aimed at encouraging college students to invest their money, Kadie Bye of the University of Virginia writes,
Based on the miracle of compounding interest, money can grow at an exponential rate when earning interest on itself. For example, by investing $1 today at 10 percent, that compounds to $1.10 next year. To put it in perspective, if that earned interest reinvests itself, that $1 today will amount to $117.39 in 50 years. No wonder Ben Franklin called compounding interest the "eighth wonder of the world".
- from Early Investing Garners Huge Rewards For Students, by Kadie Bye.
As I read this article, I typed the following incantation to a shell running Mathematica. (In[n]:= is the prompt indicating the nth input and Out[n]= indicates the output of the nth input if such an output exists. There won't be any output if the last typed character in the input line is ; as it is in the case of the first input below. Take my word for it that there is a consistent logic as to why Mathematica uses := for inputs and = for outputs.)
In[1]:= x = 1.0 ; r = 0.10 ; n = 50;
where x is the initial investment, r is the
rate of interest and n is the number of years the money
is invested. I then typed in a simple program simulating the passing
of 50 years in which the student reinvests all of her earnings:
In[2]:= y = x ; Do[y = y * (1 + r),{n}]; y
for which I was rewarded with
Out[2]= 117.391
confirming Kadie's conclusion. I also remembered a simpler method of calculation and typed the following equation
In[3]:= ((((x * r) * r) * r) * r) == x * r^4
to which Mathematica returned
Out[3] = True
confirming the simpler method of calculating Kadie's 50-year return
In[4]:= x * (1 + r)^n
Out[4]= 117.391
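For readers without Mathematica handy, the same two calculations are easy to reproduce in Python (my translation of the session above, not part of the original): the year-by-year loop and the closed form x * (1 + r)**n agree.

```python
x, r, n = 1.0, 0.10, 50   # initial investment, interest rate, years

# Reinvest the interest each year, as in the Do loop above.
y = x
for _ in range(n):
    y = y * (1 + r)

closed_form = x * (1 + r) ** n

print(round(y, 3))             # 117.391
print(round(closed_form, 3))   # 117.391
```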
This equation and alternative method of calculation make clear a number of other properties of compounding: For instance, the length of time it will take to double your money at a fixed rate of interest is independent of x. In the case of 0.10 interest, your money will double approximately every seven and a quarter years. I asked Mathematica to determine this for me by numerically solving (NSolve) the equation, (1 + 0.10)^d == 2.0, in response to the following invocation.
In[5]:= NSolve[(1 + 0.10)^d == 2.0, {d}]
Out[5]= {{d -> 7.27254}}
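The doubling time can also be computed directly with logarithms: solving (1 + r)^d == 2 gives d = log(2)/log(1 + r), which involves no x at all. A quick Python check of the NSolve result (my own sketch):

```python
import math

r = 0.10
d = math.log(2) / math.log(1 + r)
print(round(d, 5))   # 7.27254, matching the NSolve result above

# The doubling time is the same no matter how much you start with.
for x in (1.0, 100.0, 1_000_000.0):
    assert abs(x * (1 + r) ** d - 2 * x) < 1e-6 * x
```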
It's also interesting to plot the function, x * (1 + r)^z
In[6]:= Plot[x * (1 + r)^z, {z, 0, 50}, PlotRange -> All];
and compare with, say, a quadratic (e.g., z^2) function.
In[7]:= Plot[z^2, {z, 0, 50}, PlotRange -> All];
If you were simply to reinvest and compound your interest, you'd have close to $120 after fifty years, but if the bank were to multiply your investment by the square of the number of years invested, you'd get $2500 back for your initial one-dollar investment. However, things look different after 100 years.
In[8]:= Plot[x * (1 + r)^z, {z, 0, 100}, PlotRange -> All];
In[9]:= Plot[z^2, {z, 0, 100}, PlotRange -> All];
In this case, compound interest returns approximately $13,780 while the quadratic deal returns only $10,000. The disparity is even more pronounced if we consider a longer time frame; in 200 years, compound interest returns approximately 1.89 * 10^8 or 189,000,000 dollars compared to only 40,000 dollars for the quadratic case. Clearly compounding is the best deal if you can wait around long enough.
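It's also easy to locate the crossover numerically. This Python sketch (mine, not from the original session) steps through the years until 1.1^z pulls ahead of z^2, consistent with the 100- and 200-year figures above:

```python
r = 0.10

# The quadratic is ahead by year 2 (4 vs. 1.21); find the first year
# after which compounding at 10 percent overtakes z squared.
z = 2
while (1 + r) ** z <= z ** 2:
    z += 1
print(z)   # 96: compounding pulls ahead for good in year 96
```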
Exponential growth is great if you're talking about your investments. However, if in order to keep track of n widgets on an assembly line or search character strings of length n, you need to perform 2^n primitive computer operations, you may be in trouble even if you can perform a million or a billion such operations per second. A string corresponding to a line in a file might contain 80 characters. Suppose that in order to match a particular pattern against such a string you needed to perform 2^80 = 1,208,925,819,614,629,174,706,176 operations, each of which takes a billionth of a second. It would take (2^80 * 1/(1 * 10^9))/(365 * 24 * 60 * 60) (approximately 40 million) years to perform such a calculation.
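The arithmetic behind that estimate, spelled out in Python (just a check of the numbers in the paragraph above):

```python
ops = 2 ** 80                 # operations to match against an 80-character string
ops_per_second = 1e9          # a billion primitive operations per second
seconds_per_year = 365 * 24 * 60 * 60

years = ops / ops_per_second / seconds_per_year
print(ops)                    # 1208925819614629174706176
print(round(years / 1e6, 1))  # about 38.3 million years
```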
I'm in the (nerdy) habit of counting laps when I swim and counting repetitions when I do calisthenics in powers of two. For example, a 100 meter individual medley (IM) consists of 2^2 lengths, butterfly, back, breast and crawl, of a 25 meter pool and I generally do 2^2 100 meter IMs as part of my regimen. I swim for 1/2 = 2^-1 hour 2^2 times per week and try to complete 2^6 lengths each time. As I count out the laps, I use integer powers to set goals, 2, 4, 8, 16, 32, where each goal is twice as hard to achieve as the last. I'm stuck at 64 lengths and unwilling to try for the next goal, knowing that I'd fail most days for lack of time if for no other reason. In some cases, this exponential program provides just the right incentive; I'm typically not tired after 16 sit ups and so I'm usually game to go for 2^5 = 32, but, in other cases, such as swimming 64 lengths of the pool, I'm not willing to go the extra mile (almost literally, as 64 * 25 = 1600 meters = 0.9942 miles). I get a visceral experience of the power of doubling every time I exercise.
If this were a class I was teaching, I'd ask for the best illustration of exponential doubling. Have you heard the one about placing grains of wheat on a chess board? You put one grain on the first square, two on the second, four on the third, eight on the fourth and so on until you've filled all 2^6 = 64 squares. Calculate how many dump trucks you'd need to haul off 2^64 - 1 grains of wheat. But it's not a class, and perhaps you'd be better off taking a closer look at your buying habits, credit-card debt and plans for paying off student loans.
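The chessboard total is a one-liner to check in Python (I'll leave the dump trucks as an exercise, since that requires assumptions about grain weight and truck capacity that I won't invent here):

```python
# One grain on the first square, doubling on each of the 64 squares.
total = sum(2 ** i for i in range(64))
print(total)   # 18446744073709551615, i.e., 2^64 - 1
assert total == 2 ** 64 - 1   # the doubling sum telescopes
```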