Splitting a Poisson Distribution

We consider a remarkable property of the Poisson distribution that has a connection to the multinomial distribution. We start with the following examples.

Example 1
Suppose that the arrivals of customers in a gift shop at an airport follow a Poisson distribution with a mean of \alpha=5 per 10 minutes. Furthermore, suppose that each arrival can be classified into one of three distinct types – type 1 (no purchase), type 2 (purchase under $20), and type 3 (purchase over $20). Records show that about 25% of the customers are of type 1. The percentages of type 2 and type 3 are 60% and 15%, respectively. What is the probability distribution of the number of customers per hour of each type?

Example 2
Roll a fair die N times where N is random and follows a Poisson distribution with parameter \alpha. For each i=1,2,3,4,5,6, let N_i be the number of times the die shows the face i. What is the probability distribution of each N_i? What is the joint distribution of N_1,N_2,N_3,N_4,N_5,N_6?

In Example 1, the stream of customers arrives according to a Poisson distribution. It can be shown that the stream of each type of customer also has a Poisson distribution. One way to view this example is that we can split the Poisson distribution into three Poisson distributions.

Example 2 also describes a splitting process, i.e. splitting a Poisson variable into 6 different Poisson variables. We can also view Example 2 as a multinomial distribution where the number of trials is not fixed but is random and follows a Poisson distribution. If the number of rolls of the die were fixed in Example 2 (say 10), then each N_i would have a binomial distribution. Yet, with the number of trials being Poisson, each N_i has a Poisson distribution with mean \displaystyle \frac{\alpha}{6}. In this post, we describe this Poisson splitting process in terms of a “random” multinomial distribution (the viewpoint of Example 2).

________________________________________________________________________

Suppose we have a multinomial experiment with parameters N, r, p_1, \cdots, p_r, where

  • N is the number of multinomial trials,
  • r is the number of distinct possible outcomes in each trial (type 1 through type r),
  • the p_i are the probabilities of the r possible outcomes in each trial.

Suppose that N follows a Poisson distribution with parameter \alpha. For each i=1, \cdots, r, let N_i be the number of occurrences of the i^{th} type of outcomes in the N trials. Then N_1,N_2,\cdots,N_r are mutually independent Poisson random variables with parameters \alpha p_1,\alpha p_2,\cdots,\alpha p_r, respectively.
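
The splitting result lends itself to a quick simulation. The following is a minimal Python sketch (my own illustration, not part of the original discussion) using the numbers of Example 1: \alpha=30 arrivals per hour split with probabilities 0.25, 0.60 and 0.15, so the three sample means should come out near 7.5, 18 and 4.5. All names in the code are illustrative.

    import math
    import random

    random.seed(2023)

    ALPHA = 30.0                # Poisson mean of the total number of trials
    P = (0.25, 0.60, 0.15)      # probabilities of type 1, type 2, type 3
    RUNS = 50_000

    def poisson(mean):
        # Sample a Poisson variate by Knuth's multiplication method.
        threshold = math.exp(-mean)
        count, prod = 0, random.random()
        while prod > threshold:
            count += 1
            prod *= random.random()
        return count

    totals = [0, 0, 0]
    for _ in range(RUNS):
        n = poisson(ALPHA)          # random number of multinomial trials
        for _ in range(n):          # classify each trial into one of 3 types
            u = random.random()
            i = 0 if u < P[0] else (1 if u < P[0] + P[1] else 2)
            totals[i] += 1

    # Sample means of N_1, N_2, N_3; they should be close to 7.5, 18.0, 4.5.
    print([round(t / RUNS, 2) for t in totals])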

Conditional on the total number of trials N, the variables N_1,N_2,\cdots,N_r have a multinomial distribution, with joint probability function:

\displaystyle (1) \ \ \ \ P(N_1=n_1,N_2=n_2,\cdots,N_r=n_r \lvert N=n)=\frac{n!}{n_1! n_2! \cdots n_r!} \ p_1^{n_1} p_2^{n_2} \cdots p_r^{n_r}

where the n_i are nonnegative integers such that n=n_1+n_2+\cdots+n_r.

Since the total number of multinomial trials N is not fixed but random, (1) is not the end of the story; it is only a conditional probability. The following is the unconditional joint probability function of N_1,N_2,\cdots,N_r:

\displaystyle (2) \ \ \ \ P(N_1=n_1,N_2=n_2,\cdots,N_r=n_r)

          \displaystyle \begin{aligned}&=P \Bigl(N_1=n_1,N_2=n_2,\cdots,N_r=n_r \Big\lvert N=\sum \limits_{k=1}^r n_k \Bigr) \\&\ \ \ \ \ \times P \Bigl(N=\sum \limits_{k=1}^r n_k \Bigr) \\&\text{ } \\&=\frac{\Bigl(\sum \limits_{k=1}^r n_k \Bigr)!}{n_1! \ n_2! \ \cdots \ n_r!} \ p_1^{n_1} \ p_2^{n_2} \ \cdots \ p_r^{n_r} \ \times \frac{e^{-\alpha} \ \alpha^{\sum \limits_{k=1}^r n_k}}{\Bigl(\sum \limits_{k=1}^r n_k \Bigr)!} \\&\text{ } \\&=\frac{e^{-\alpha p_1} \ (\alpha p_1)^{n_1}}{n_1!} \ \frac{e^{-\alpha p_2} \ (\alpha p_2)^{n_2}}{n_2!} \ \cdots \ \frac{e^{-\alpha p_r} \ (\alpha p_r)^{n_r}}{n_r!} \end{aligned}

The last step uses the fact that p_1+p_2+\cdots+p_r=1, so that e^{-\alpha}=e^{-\alpha p_1} \ e^{-\alpha p_2} \cdots e^{-\alpha p_r}, together with \alpha^{n_1+n_2+\cdots+n_r}=\alpha^{n_1} \alpha^{n_2} \cdots \alpha^{n_r}.

To obtain the marginal probability function of N_j, j=1,2,\cdots,r, we sum (2) over the other variables n_k (k \ne j) and obtain the following:

\displaystyle (3) \ \ \ \ P(N_j=n_j)=\frac{e^{-\alpha p_j} \ (\alpha p_j)^{n_j}}{n_j!}

Thus N_j, j=1,2,\cdots,r, has a Poisson distribution with parameter \alpha p_j. Furthermore, the joint probability function of N_1,N_2,\cdots,N_r in (2) is the product of the marginal probability functions in (3). Thus we can conclude that N_1,N_2,\cdots,N_r are mutually independent.
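
Before moving to the examples, here is a small numeric spot check (my own sketch; the parameter values are arbitrary) that the conditional-times-marginal computation in (2) really does match the product of the Poisson marginals in (3):

    from math import exp, factorial

    alpha, p, n = 2.0, (0.5, 0.3, 0.2), (1, 2, 0)
    total = sum(n)

    # Joint probability as in (2): multinomial conditional times Poisson.
    conditional = factorial(total) * p[0]**n[0] * p[1]**n[1] * p[2]**n[2] / (
        factorial(n[0]) * factorial(n[1]) * factorial(n[2]))
    joint = conditional * exp(-alpha) * alpha**total / factorial(total)

    # Product of the Poisson marginals as in (3).
    marginals = 1.0
    for n_j, p_j in zip(n, p):
        marginals *= exp(-alpha * p_j) * (alpha * p_j)**n_j / factorial(n_j)

    print(joint, marginals)   # the two values agree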

________________________________________________________________________
Example 1
Let N_1,N_2,N_3 be the number of customers per hour of type 1, type 2, and type 3, respectively. Here we split a Poisson distribution with mean 30 per hour (a mean of 5 per 10 minutes is a mean of 30 per hour). Thus N_1,N_2,N_3 are mutually independent Poisson variables with means 30 \times 0.25=7.5, 30 \times 0.60=18, 30 \times 0.15=4.5, respectively.

Example 2
As indicated earlier, each N_i, i=1,2,3,4,5,6, has a Poisson distribution with mean \frac{\alpha}{6}. According to (2), the joint probability function of N_1,N_2,N_3,N_4,N_5,N_6 is simply the product of the six marginal Poisson probability functions.

A double application of the multinomial coefficients

Suppose there are four committees that are labeled M, I, S and P. Eleven candidates are randomly assigned to these four committees. Assume that each candidate can only be assigned to one committee. Consider the following two examples:

  1. How many ways can we randomly assign 11 candidates to these 4 committees such that Committee M consists of one member, Committee I consists of four members, Committee S consists of four members and Committee P consists of two members?
  2. How many ways can we randomly assign 11 candidates to these 4 committees such that one committee consists of one member, one committee consists of four members, another committee consists of four members and another committee consists of two members?

The first question is a straight application of the combinatorial technique of multinomial coefficients, namely, the number of ways to assign 11 candidates into four subgroups: one group consisting of one candidate (Committee M), one group consisting of four candidates (Committee I), one group consisting of four candidates (Committee S) and one group consisting of two candidates (Committee P). These group sizes (1, 4, 4 and 2) appear in the denominator of the multinomial coefficient below. The answer to the first question is:

    \displaystyle \frac{11!}{1! \ 4! \ 4! \ 2!}=34650

Note that the first question is a specific example of the second. The second question is also about assigning 11 candidates into four subgroups. But in the second question, the subgroups can rotate among the four committees, requiring a double use of the multinomial coefficients, once on the 11 candidates and a second time on the four committees. In other words, the first application of the multinomial coefficients is on the 11 objects to be distributed into four subgroups and the second is on grouping the four subgroups. This technique of applying the multinomial coefficients twice is a useful one in probability and combinatorics. For example, it can be applied in the occupancy problem (see chapter 2, section 5, p. 38 in [1]). Another application is calculating the probabilities of the hands in the game of poker dice (see Example 2 below).

Example 1
This is the second question indicated above. The answer is:

    \displaystyle \frac{11!}{1! \ 4! \ 4! \ 2!} \times \frac{4!}{1! \ 2! \ 1!}=34650 \times 12=415800

The second multinomial coefficient is 12 and is the number of ways to group the 4 committees into three subgroups: one consisting of one committee (receiving one candidate), one consisting of two committees (receiving four candidates each) and one consisting of one committee (receiving two candidates). These subgroup sizes (1, 2 and 1) appear in the denominator of the second multinomial coefficient above.
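
The computation is easy to reproduce. Below is a short Python sketch (illustrative, with hypothetical names) that evaluates both multinomial coefficients and their product:

    from math import factorial

    def multinomial(*sizes):
        # k! / (a_1! a_2! ... a_n!) where k is the sum of the group sizes.
        result = factorial(sum(sizes))
        for a in sizes:
            result //= factorial(a)
        return result

    first = multinomial(1, 4, 4, 2)   # distribute the 11 candidates: 34650
    second = multinomial(1, 2, 1)     # group the 4 committees by size: 12
    print(first, second, first * second)   # 34650 12 415800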

Example 2
Roll five fair dice. Assuming one of the dice is called die 1, another is called die 2 and so on, the string 1,2,5,2,4 indicates the outcome in which die 1 shows a one, die 2 shows a two, die 3 shows a five and so on. There are 6^5=7776 different outcomes. How many of these outcomes have three distinct scores appearing in such a way that one score appears three times and each of the other two scores appears once?

The first multinomial coefficient comes from a representative outcome, for example the string of scores 1,1,1,2,3. We count the number of ways to assign the five scores into three subgroups: one consisting of the three scores of 1, one consisting of the one score of 2 and one consisting of the one score of 3. These subgroup sizes (3, 1 and 1) appear in the denominator of the first multinomial coefficient below.

The second multinomial coefficient comes from the six scores (faces) of a die. Here, we count the number of ways to assign the six faces into three groups: one consisting of the three faces that do not appear, one consisting of the two faces that each appear once and one consisting of the one face that appears three times. These group sizes (3, 2 and 1) appear in the denominator of the second multinomial coefficient below.

Multiplying these two multinomial coefficients, we obtain:

    \displaystyle \frac{5!}{3! \ 1! \ 1!} \times \frac{6!}{3! \ 2! \ 1!}=20 \times 60=1200

Rolling five dice and obtaining a score appearing three times and two scores appearing once each is called “three of a kind” in the game of poker dice. Thus in this game, the probability of obtaining a “three of a kind” is:

    \displaystyle \frac{1200}{7776}=0.15432
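
The count of 1200 can also be verified by brute force. The following sketch (my own, not from the original text) walks through all 6^5=7776 ordered outcomes and keeps those whose score pattern is a triple plus two distinct singles:

    from collections import Counter
    from itertools import product

    hits = sum(
        1
        for roll in product(range(1, 7), repeat=5)
        if sorted(Counter(roll).values()) == [1, 1, 3]
    )
    print(hits, hits / 6**5)   # 1200 0.15432...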

Reference

  1. Feller, W., An Introduction to Probability Theory and Its Applications, Vol. I, 3rd ed., John Wiley & Sons, Inc., New York, 1968.

The multinomial coefficients

Consider the following four examples of counting:

  1. Consider the finite population \left\{m,i,s,p\right\}. Draw 11 letters at random from this population with replacement. The total number of 11-character strings that can be formed is 4^{11}. How many of these strings contain one m, four i’s, four s’s and two p’s?
  2. What is the total number of 11-character strings that can be formed using the letters in the word Mississippi?
  3. Suppose each of the questions in an 11-question multiple choice test has 4 choices (M, I, S and P). A student chooses the answers by pure guessing. How many ways can this student take the test if the random selection of the answers produces 1 M, 4 I’s, 4 S’s and 2 P’s as answers?
  4. How many ways can we randomly assign 11 candidates into 4 committees such that Committee M consists of 1 member, Committee I consists of 4 members, Committee S consists of 4 members and Committee P consists of 2 members? Assume that each candidate can only be assigned to one committee.

All four examples have the same answer, namely 34,650. All these examples are about the number of ways of assigning 11 objects into four subgroups, one containing 1 object, two containing 4 objects each and one containing 2 objects, where the order of the objects within each subgroup does not matter. These four examples illustrate the combinatorial approach called multinomial coefficients. The multinomial coefficient (the number of ways of assigning the 11 objects in the specified manner) in these examples is:

    \displaystyle \frac{11!}{1! \ 4! \ 4! \ 2!}=34650

In this post, I make a few observations about the combinatorics surrounding the multinomial coefficients and the multinomial theorem. The multinomial distribution is then naturally defined.

First, the multinomial coefficients. Suppose k and n are positive integers such that n \le k. Suppose we have nonnegative integers a_1, a_2, \cdots, a_n such that a_1+a_2+\cdots+a_n=k. The following is the number of ways to partition a set of k distinct objects into n subgroups where the first subgroup consists of a_1 objects, the second subgroup consists of a_2 objects and so on.

    \displaystyle (0) \ \ \ \ \binom{k}{a_1, a_2, \cdots, a_n}=\frac{k!}{a_1! a_2! \cdots a_n!}

Result (1) follows from a combinatorial observation discussed below. Result (2) is the multinomial theorem.

    \displaystyle (1) \ \ \ \ \sum_{a_1+a_2+ \cdots + a_n=k} \frac{k!}{a_1! a_2! \cdots a_n!}=n^k
    \displaystyle (2) \ \ \ \ (x_1+x_2+ \cdots +x_n)^k=\sum_{a_1+a_2+ \cdots + a_n=k} \frac{k!}{a_1! a_2! \cdots a_n!} \ x_1^{a_1} x_2^{a_2} \cdots x_n^{a_n}
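
Identity (1) is easy to confirm by enumeration for small values. The sketch below (with illustrative values k=5 and n=3 chosen by me) sums the multinomial coefficients over all compositions a_1+a_2+a_3=5 and compares the total with 3^5=243:

    from math import factorial

    k = 5
    total = 0
    for a1 in range(k + 1):
        for a2 in range(k - a1 + 1):
            a3 = k - a1 - a2   # completes the composition a1 + a2 + a3 = k
            total += factorial(k) // (factorial(a1) * factorial(a2) * factorial(a3))
    print(total, 3**k)   # 243 243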

Discussion of Example 1
In drawing 11 letters at random with replacement from the set \left\{m,i,s,p\right\}, there are a total of 4^{11}=4194304 ordered samples. The following two strings are two examples:

    Mississippi \ \ \ \ \ \ \ \ \ \ \ Mississiipp

Both of these ordered samples have one m, four i’s, four s’s and two p’s. In other words, both correspond to the same unordered sample \left\{m,i,i,i,i,s,s,s,s,p,p\right\}. A more compact way of notating this unordered sample is (1,4,4,2), where the first coordinate is the number of m’s, the second coordinate is the number of i’s, the third coordinate is the number of s’s, and the fourth coordinate is the number of p’s. Note that the sum of the four coordinates is 11. The number of ordered samples tied to (1,4,4,2) is the multinomial coefficient 34650 as indicated above.
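
This count can be confirmed directly. The following sketch (my own) enumerates all 4^{11}=4194304 ordered samples and keeps those that reduce to the unordered sample (1,4,4,2); the count of p’s need not be checked since it is forced:

    from itertools import product

    hits = sum(
        1
        for s in product("misp", repeat=11)
        if s.count("m") == 1 and s.count("i") == 4 and s.count("s") == 4
    )
    print(hits)   # 34650 = 11! / (1! 4! 4! 2!)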

Another way of denoting the unordered sample (1,4,4,2) is using a string of stars and bars as follows:

    * \ | \ * \ * \ * \ * \ | \ * \ * \ * \ * \ | \ * \ *

In the star and bar diagram, there are 11 stars (representing the 11 characters selected) and 3 bars (creating four spaces representing the letters m, i, s and p, respectively). The star and bar diagram has a combinatorial advantage. For example, how many unordered samples are there when 11 letters are selected at random with replacement from \left\{m,i,s,p\right\}?

In any unordered sample in our discussion, there are 3 bars and 11 stars. The stars and bars can be in any arbitrary order. Thus there are \displaystyle \binom{11+3}{11}=364 unordered samples. Furthermore, the equation a_1+a_2+a_3+a_4=11 has 364 nonnegative integer solutions.

A related question is: how many unordered samples are there when 11 letters are selected at random with replacement from \left\{m,i,s,p\right\} such that each letter is selected at least once? In the star and bar diagram, each of the four spaces must contain at least one star. Thus we only need to arrange the remaining 7 stars in these four spaces, and there are \displaystyle \binom{7+3}{7}=120 ways of doing this. Furthermore, the equation a_1+a_2+a_3+a_4=11 has 120 positive integer solutions.
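
Both stars-and-bars counts can be checked by listing the integer solutions directly, as in this sketch (my own):

    from itertools import product

    # Nonnegative solutions of a1 + a2 + a3 + a4 = 11; a4 is determined.
    solutions = [
        (a1, a2, a3, 11 - a1 - a2 - a3)
        for a1, a2, a3 in product(range(12), repeat=3)
        if a1 + a2 + a3 <= 11
    ]
    positive = [s for s in solutions if min(s) >= 1]
    print(len(solutions), len(positive))   # 364 120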

Generalization
Suppose we sample k times from the finite set \left\{x_1,x_2,x_3, \cdots, x_n\right\} with replacement. There are n^k ordered samples. These ordered samples can be collapsed into

    \displaystyle (3) \ \ \ \ \binom{k+n-1}{k}=\binom{k+n-1}{n-1}

unordered samples (as can be seen using the star and bar diagram). Furthermore, the number of nonnegative integer solutions to the equation a_1+a_2+ \cdots +a_n=k is also given by (3).

The number of ordered samples tied to the unordered sample (a_1,a_2, \cdots, a_n) is:

    \displaystyle (4) \ \ \ \ \frac{k!}{a_1! a_2! \cdots a_n!} \ \ \ \ \text{(multinomial coefficient)}

The preceding discussion is encapsulated in equation (1). Equation (2) above is the statement of the multinomial theorem, which is a statement about polynomial expansion. The number of terms in the polynomial expansion is given by (3), which is also the number of nonnegative integer solutions to the equation a_1+a_2+ \cdots +a_n=k. Each term x_1^{a_1} x_2^{a_2} \cdots x_n^{a_n} in the polynomial expansion can be considered as an unordered sample in the finite sampling with replacement. Thus both (1) and (2) are compact ways of describing the finite sampling with replacement from the set \left\{x_1,x_2,x_3, \cdots, x_n\right\}.

The number of unordered samples where each object in \left\{x_1,x_2,x_3, \cdots, x_n\right\} is selected at least once is:

    \displaystyle (5) \ \ \ \ \binom{k-n+n-1}{k-n}=\binom{k-n+n-1}{n-1}=\binom{k-1}{n-1}

As a result, (5) provides the number of positive integer solutions to the equation a_1+a_2+ \cdots +a_n=k.
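
Formulas (3) and (5) translate into one-line functions (my own wrappers around Python's math.comb), checked here against the 11-letters-from-\left\{m,i,s,p\right\} case, i.e. k=11 and n=4:

    from math import comb

    def unordered_samples(k, n):
        return comb(k + n - 1, n - 1)   # formula (3)

    def unordered_samples_all_selected(k, n):
        return comb(k - 1, n - 1)       # formula (5)

    print(unordered_samples(11, 4))                # 364
    print(unordered_samples_all_selected(11, 4))   # 120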

The Multinomial Distribution
Suppose that k independent trials are performed where each trial has n possible outcomes denoted by E_1,E_2,\cdots,E_n. Suppose that in each trial, the probability of the outcome E_i is \displaystyle p_i. Furthermore, we have p_1+p_2+\cdots+p_n=1.

The result of performing the k independent trials can be denoted by an ordered string such as E_5 E_1 E_7 \cdots, i.e. an ordered sample when sampling k times from \left\{E_1,E_2,\cdots,E_n\right\} with replacement. Consider the ordered samples where the outcome E_1 occurs a_1 times, the outcome E_2 occurs a_2 times and so on, such that a_1+a_2+\cdots+a_n=k. This scenario is equivalent to the four examples indicated at the beginning of the post: we assign the k trials into n subgroups, the first of which consists of the a_1 occurrences of the outcome E_1, the second of which consists of the a_2 occurrences of the outcome E_2 and so on. The multinomial coefficient (4) above is the number of all such ordered samples. Thus the following is the probability that in k trials, E_1 occurs a_1 times, E_2 occurs a_2 times and so on:

    \displaystyle \frac{k!}{a_1! \ a_2! \cdots a_n!} \ p_1^{a_1} \ p_2^{a_2} \cdots p_n^{a_n}

More formally, for each j=1,2, \cdots, n, let X_j be the number of times the event E_j occurs in the k many trials. Then the joint distribution of the random variables X_1,X_2,\cdots,X_n is called the multinomial distribution with parameters k,p_1,p_2,\cdots,p_n.

The joint probability function is given by:

    \displaystyle P(X_1=a_1,X_2=a_2, \cdots, X_n=a_n)=\frac{k!}{a_1! \ a_2! \cdots a_n!} \ p_1^{a_1} \ p_2^{a_2} \cdots p_n^{a_n}

The multinomial distribution is so named because of the multinomial theorem. Note that the right-hand side of the above probability function is a term in the multinomial expansion of (p_1+p_2+\cdots+p_n)^k. Furthermore we have:

    \displaystyle 1=(p_1+p_2+ \cdots +p_n)^k=\sum_{a_1+a_2+ \cdots + a_n=k} \frac{k!}{a_1! a_2! \cdots a_n!} \ p_1^{a_1} p_2^{a_2} \cdots p_n^{a_n}

When there are only two types of outcomes, labeled 1 (success) and 2 (failure), X_2=k-X_1. As a result, X=X_1 has a univariate distribution, namely the binomial distribution.
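
To close, here is a sketch of the multinomial probability function (my own illustration) with two sanity checks: the probabilities sum to 1 as in the expansion above, and with two outcome types the function reduces to the binomial probability:

    from math import comb, factorial

    def multinomial_pmf(counts, probs):
        # k! / (a_1! ... a_n!) * p_1^{a_1} * ... * p_n^{a_n}
        coefficient = factorial(sum(counts))
        for a in counts:
            coefficient //= factorial(a)
        value = float(coefficient)
        for a, p in zip(counts, probs):
            value *= p ** a
        return value

    # Check 1: the probabilities sum to 1 (k = 4 trials, 3 outcome types).
    k, probs = 4, (0.2, 0.5, 0.3)
    total = sum(
        multinomial_pmf((a1, a2, k - a1 - a2), probs)
        for a1 in range(k + 1)
        for a2 in range(k - a1 + 1)
    )
    print(round(total, 10))   # 1.0

    # Check 2: two outcome types reduce to the binomial distribution.
    p = 0.3
    print(multinomial_pmf((2, 3), (p, 1 - p)), comb(5, 2) * p**2 * (1 - p)**3)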
