Exponential
The Exponential distribution is a continuous distribution that is widely used as a simple model for the waiting time for a certain kind of event to occur, e.g., the time until the next email arrives.
Definition: Exponential Distribution
A continuous r.v. $X$ is said to have the Exponential distribution with parameter $\lambda$, where $\lambda > 0$, if its PDF is
$$f(x) = \lambda e^{-\lambda x}, \quad x > 0.$$
We denote this by $X \sim \text{Expo}(\lambda)$.
The corresponding CDF is
$$F(x) = 1 - e^{-\lambda x}, \quad x > 0.$$
The PDF and CDF are plotted in the following figure.
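As a quick sanity check on the PDF and CDF formulas, here is a short Python sketch (the function names, the rate $\lambda = 0.5$, and the evaluation point are illustrative choices, not from the text) that integrates the PDF numerically and compares the result to the closed-form CDF:

```python
import math

def expo_pdf(x, lam):
    """PDF of Expo(lam): f(x) = lam * exp(-lam * x) for x > 0."""
    return lam * math.exp(-lam * x) if x > 0 else 0.0

def expo_cdf(x, lam):
    """CDF of Expo(lam): F(x) = 1 - exp(-lam * x) for x > 0."""
    return 1.0 - math.exp(-lam * x) if x > 0 else 0.0

# Integrating the PDF from 0 to x with a midpoint Riemann sum
# should recover the CDF at x.
lam, x = 0.5, 2.0
n = 100_000
dx = x / n
riemann = sum(expo_pdf((i + 0.5) * dx, lam) * dx for i in range(n))
print(round(riemann, 4), round(expo_cdf(x, lam), 4))
```

The two printed values agree, as they must, since the CDF is the integral of the PDF.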
We've seen how all Uniform and Normal distributions are related to one another via location-scale transformations, and we might wonder whether the Exponential distribution allows this too. Exponential r.v.s are defined to have support $(0, \infty)$, and shifting would change the left endpoint. But scale transformations work nicely, and we can use scaling to get from the simple $\text{Expo}(1)$ to the general $\text{Expo}(\lambda)$: if $X \sim \text{Expo}(1)$, then
$$Y = \frac{X}{\lambda} \sim \text{Expo}(\lambda),$$
since
$$P(Y \leq y) = P\!\left(\frac{X}{\lambda} \leq y\right) = P(X \leq \lambda y) = 1 - e^{-\lambda y}.$$
Conversely, if $Y \sim \text{Expo}(\lambda)$, then $\lambda Y \sim \text{Expo}(1)$.

The Exponential distribution has a very special property called the memoryless property. If the waiting time for a certain event to occur is Exponential, then the memoryless property says that no matter how long you have waited so far, your additional waiting time is still Exponential (with the same parameter).
Definition: Memoryless Property
A distribution is said to have the memoryless property if a random variable $X$ from that distribution satisfies
$$P(X \geq s + t \mid X \geq s) = P(X \geq t)$$
for all $s, t > 0$.
Here $s$ represents the time you've already spent waiting; the definition says that after you've waited $s$ minutes, the probability you'll have to wait another $t$ minutes is exactly the same as the probability of having to wait $t$ minutes with no previous waiting time under your belt. Another way to state the memoryless property is that conditional on $X \geq s$, the additional waiting time $X - s$ is still distributed $\text{Expo}(\lambda)$.
Using the definition of conditional probability, we can directly verify that the Exponential distribution has the memoryless property. Let $X \sim \text{Expo}(\lambda)$. Then
$$P(X \geq s + t \mid X \geq s) = \frac{P(X \geq s + t)}{P(X \geq s)} = \frac{e^{-\lambda(s+t)}}{e^{-\lambda s}} = e^{-\lambda t} = P(X \geq t).$$
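The same verification can be done by simulation. In the sketch below (the rate and the values of $s$ and $t$ are arbitrary illustrative choices), we estimate the conditional probability of waiting at least $t$ more, given that we have already waited $s$, and compare it to the unconditional $P(X \geq t) = e^{-\lambda t}$:

```python
import math
import random

random.seed(42)
lam = 0.3          # illustrative rate
s, t = 1.0, 2.0    # "already waited s", "wait t more"

samples = [random.expovariate(lam) for _ in range(200_000)]

# Empirical P(X >= s + t | X >= s): among draws exceeding s,
# the fraction that also exceed s + t.
survived_s = [x for x in samples if x >= s]
cond = sum(x >= s + t for x in survived_s) / len(survived_s)

# The memoryless property predicts this matches P(X >= t) = exp(-lam * t).
print(round(cond, 3), round(math.exp(-lam * t), 3))
```

Up to Monte Carlo error, the two printed probabilities agree.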
What are the implications of the memoryless property? If you're waiting at a bus stop and the time until the bus arrives has an Exponential distribution, then conditional on your having waited 30 minutes, the bus isn't due to arrive soon. The distribution simply forgets that you've been waiting for half an hour, and your remaining wait time is the same as if you had just shown up to the bus stop. If the lifetime of a machine has an Exponential distribution, then no matter how long the machine has been functional, conditional on having lived that long, the machine is as good as new: there is no wear-and-tear effect that makes the machine more likely to break down soon. If human lifetimes were Exponential, then conditional on having survived to the age of 80, your remaining lifetime would have the same distribution as that of a newborn baby!
Clearly, the memoryless property is not an appropriate description for human or machine lifetimes. Why then do we care about the Exponential distribution?
- Some physical phenomena, such as radioactive decay, truly do exhibit the memoryless property, so the Exponential is an important model in its own right.
- The Exponential distribution is well-connected to other named distributions. In the next section, we'll see how the Exponential and Poisson distributions can be united by a shared story.
- The Exponential serves as a building block for more flexible distributions, such as a distribution known as the Weibull, that allow for a wear-and-tear effect (where older units are due to break down) or a survival-of-the-fittest effect (where the longer you've lived, the stronger you get). To understand these distributions, we first have to understand the Exponential.
Poisson Processes
The Exponential distribution is closely connected to the Poisson distribution, as suggested by our use of $\lambda$ for the parameters of both distributions. In this section we will see that the Exponential and Poisson are linked by a common story, the Poisson process.
Definition: Poisson Process
A process of arrivals in continuous time is called a Poisson process with rate $\lambda$ if the following two conditions hold.
- The number of arrivals that occur in an interval of length $t$ is a $\text{Pois}(\lambda t)$ random variable.
- The numbers of arrivals that occur in disjoint intervals are independent of each other. For example, the numbers of arrivals in the intervals $(0, 10)$ and $[10, 12)$ are independent.
A sketch of a Poisson process is pictured in the following figure. Each X marks the spot of an arrival.
For concreteness, suppose the arrivals are emails landing in an inbox according to a Poisson process with rate $\lambda$. There are several things we might want to know about this process. One question we could ask is: in one hour, how many emails will arrive? The answer comes directly from the definition, which tells us that the number of emails in an hour follows a $\text{Pois}(\lambda)$ distribution. Notice that the number of emails is a nonnegative integer, so a discrete distribution is appropriate.
But we could also flip the question around and ask: how long does it take until the first email arrives (measured relative to some fixed starting point)? The waiting time for the first email is a positive real number, so a continuous distribution on $(0, \infty)$ is appropriate. Let $T_1$ be the time until the first email arrives. To find the distribution of $T_1$, we just need to understand one crucial fact: saying that the waiting time for the first email is greater than $t$ is the same as saying that no emails have arrived between 0 and $t$. In other words, if $N_t$ is the number of emails that arrive at or before time $t$, then
$$T_1 > t \text{ is the same event as } N_t = 0.$$
We call this the count-time duality because it connects a discrete r.v., $N_t$, which counts the number of arrivals, with a continuous r.v., $T_1$, which marks the time of the first arrival.
If two events are the same, they have the same probability. Since $N_t \sim \text{Pois}(\lambda t)$ by the definition of Poisson process,
$$P(T_1 > t) = P(N_t = 0) = e^{-\lambda t}.$$
Therefore $P(T_1 \leq t) = 1 - e^{-\lambda t}$, so $T_1 \sim \text{Expo}(\lambda)$! The time until the first arrival in a Poisson process of rate $\lambda$ has an Exponential distribution with parameter $\lambda$.
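The count-time duality can also be checked by simulation. The sketch below (rates, horizon, and sample sizes are illustrative choices) generates each realization of the process on $(0, T)$ by drawing a $\text{Pois}(\lambda T)$ number of arrivals and placing them uniformly at random (a standard property of Poisson processes), then compares the empirical $P(T_1 > t)$ with $e^{-\lambda t}$:

```python
import math
import random

random.seed(0)
lam, T = 2.0, 10.0   # illustrative rate and time horizon
t = 0.7              # check P(T_1 > t) at this time

def poisson_sample(mu):
    """Draw from Pois(mu) by Knuth's multiplication method."""
    limit, k, p = math.exp(-mu), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def first_arrival():
    """One realization of the process on (0, T): a Pois(lam*T) number of
    points placed uniformly. Returns the first arrival time (T if none)."""
    n = poisson_sample(lam * T)
    return min((random.uniform(0, T) for _ in range(n)), default=T)

trials = 100_000
frac = sum(first_arrival() > t for _ in range(trials)) / trials
print(round(frac, 3), round(math.exp(-lam * t), 3))
```

Even though the process is built from counts rather than waiting times, the empirical survival probability of $T_1$ matches the $\text{Expo}(\lambda)$ prediction.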
What about $T_2 - T_1$, the time between the first and second arrivals? Since disjoint intervals in a Poisson process are independent by definition, the past is irrelevant once the first arrival occurs. Thus $T_2 - T_1$ is independent of the time until the first arrival, and by the same argument as before, $T_2 - T_1$ also has an Exponential distribution with rate $\lambda$. Similarly, $T_3 - T_2 \sim \text{Expo}(\lambda)$ independently of $T_1$ and $T_2 - T_1$. Continuing in this way, we deduce that all the interarrival times are i.i.d. $\text{Expo}(\lambda)$ random variables. To summarize what we've learned: in a Poisson process of rate $\lambda$,
- the number of arrivals in an interval of length 1 is $\text{Pois}(\lambda)$, and
- the times between arrivals are i.i.d. $\text{Expo}(\lambda)$.
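These two bullet points can be checked together in a short simulation (the rate and sample sizes are illustrative choices): build the process from i.i.d. $\text{Expo}(\lambda)$ interarrival times and verify that the number of arrivals in an interval of length 1 has mean and variance close to $\lambda$, as a $\text{Pois}(\lambda)$ count must:

```python
import random

random.seed(1)
lam = 4.0  # illustrative rate

def count_in_unit_interval():
    """Build one realization of the process from i.i.d. Expo(lam)
    interarrival times and count the arrivals landing in (0, 1]."""
    t, n = 0.0, 0
    while True:
        t += random.expovariate(lam)
        if t > 1.0:
            return n
        n += 1

counts = [count_in_unit_interval() for _ in range(100_000)]
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
print(round(mean, 1), round(var, 1))  # both should be close to lam = 4
```

Equality of mean and variance is a fingerprint of the Poisson distribution, so this is a reasonable quick check that Exponential interarrivals produce Poisson counts.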
Thus, Poisson processes tie together two important distributions, one discrete and one continuous, and the use of a common symbol $\lambda$ for both the Poisson and Exponential parameters is felicitous notation, for $\lambda$ is the arrival rate in the process that unites the two distributions.
The story of the Poisson process provides intuition for the fact that the minimum of independent Exponential r.v.s is another Exponential r.v.
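This fact lends itself to a quick empirical check. The sketch below (the three rates are arbitrary illustrative values) simulates the minimum of independent Exponentials and compares its sample mean to $1/(\lambda_1 + \cdots + \lambda_n)$, the mean it would have if the minimum were Exponential with the summed rate:

```python
import random

random.seed(7)
lams = [0.5, 1.0, 2.5]   # illustrative rates, summing to 4.0
trials = 100_000

mins = [min(random.expovariate(l) for l in lams) for _ in range(trials)]

# If min(X_1, ..., X_n) ~ Expo(sum of the rates), its mean is 1 / 4.0 = 0.25.
mean_min = sum(mins) / trials
print(round(mean_min, 3))
```

Intuitively, each of the $n$ competing alarm clocks contributes its rate to how quickly the first one goes off, which is what the example below asks you to make precise.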
Example
Let $X_1, \ldots, X_n$ be independent, with $X_j \sim \text{Expo}(\lambda_j)$. Let $L = \min(X_1, \ldots, X_n)$. Show that $L \sim \text{Expo}(\lambda_1 + \cdots + \lambda_n)$, and interpret this intuitively.
Solution: