In the previous pages, we concerned ourselves with finding the expectation of any general function \(u(X)\) of the discrete random variable \(X\). Here, we'll focus our attention on one particular function, namely:
\(u(X)=X\)
Let's jump right in, and give the expectation in this situation a special name!
 First Moment about the Origin

When the function \(u(X)=X\), the expectation of \(u(X)\), when it exists:
\(E[u(X)]=E(X)=\sum\limits_{x\in S} xf(x) \)
is called the expected value of \(X\), and is denoted \(E(X)\). It is also called the mean of \(X\), and is denoted \(\mu\) (the Greek letter mu, pronounced "mew"). That is, \(\mu=E(X)\). The expected value of \(X\) can also be called the first moment about the origin.
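To make the definition concrete, here is a quick sanity check (an illustrative script, not part of the original notes): the expected value of one roll of a fair six-sided die, computed directly from \(\sum\limits_{x\in S} xf(x)\).

```python
from fractions import Fraction

# p.m.f. of a fair six-sided die: f(x) = 1/6 for x = 1, 2, ..., 6
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

# E(X) = sum of x * f(x) over the support
mean = sum(x * p for x, p in pmf.items())

print(mean)  # 7/2, i.e. 3.5
```

Using exact fractions avoids any floating-point fuzz in the comparison.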
Example 8-10
The maximum patent life for a new drug is 17 years. Subtracting the length of time required by the Food and Drug Administration for testing and approval of the drug provides the actual patent life for the drug — that is, the length of time that the company has to recover research and development costs and to make a profit. The distribution of the lengths of actual patent lives for new drugs is as follows:
| Years, \(y\) | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 |
|---|---|---|---|---|---|---|---|---|---|---|---|
| \(f(y)\) | 0.03 | 0.05 | 0.07 | 0.10 | 0.14 | 0.20 | 0.18 | 0.12 | 0.07 | 0.03 | 0.01 |
What is the mean patent life for a new drug?
Answer: The mean can be calculated as:
\(\mu_Y=E(Y)=\sum\limits_{y=3}^{13} yf(y)=3(0.03)+4(0.05)+\cdots+12(0.03)+13(0.01)=7.9\)
That is, the average patent life for a new drug is 7.9 years.
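As an arithmetic check (a short Python sketch, not part of the original notes), the same sum can be computed directly from the table:

```python
# Patent-life p.m.f. from the table above (years -> probability)
f = {3: 0.03, 4: 0.05, 5: 0.07, 6: 0.10, 7: 0.14, 8: 0.20,
     9: 0.18, 10: 0.12, 11: 0.07, 12: 0.03, 13: 0.01}

# A valid p.m.f. must sum to 1 over its support
assert abs(sum(f.values()) - 1.0) < 1e-9

# E(Y) = sum of y * f(y) over the support
mean = sum(y * p for y, p in f.items())
print(round(mean, 2))  # 7.9
```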
Example 8-11
Let \(X\) follow a hypergeometric distribution in which \(n\) objects are selected from \(N\) objects, with \(m\) of the objects being of one type and \(N-m\) of the objects being of a second type. What is the mean of \(X\)?
Solution
Recalling the p.m.f. of a hypergeometric distribution and using the definition of the expected value of \(X\), we have:
\(E(X)=\sum\limits_{x\in S} x \dfrac{\dbinom{m}{x} \dbinom{N-m}{n-x}}{\dbinom{N}{n}}\)
You should be getting the idea already that this is going to be messy! So, we're going to work on it in parts. First, note that the \(x=0\) term of the summation equals 0, so the sum can start at \(x=1\). And, note that some of the terms can be written differently:
That is:
\(\dbinom{m}{x}=\dfrac{m!}{x!(m-x)!}\)
and:
\(\dbinom{N}{n}=\dfrac{N!}{n!(N-n)!}=\dfrac{N(N-1)!}{n \cdot (n-1)!(N-n)!}=\dfrac{N}{n} \cdot \dfrac{(N-1)!}{(n-1)!(N-1-(n-1))!}=\dfrac{N}{n} \cdot \dbinom{N-1}{n-1}\)
Therefore, replacing these quantities in our formula for \(E(X)\), we have:
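The substitution can be sketched as follows (a reconstruction using the identities above, together with the standard identity \(x\dbinom{m}{x}=m\dbinom{m-1}{x-1}\)):

\(E(X)=\sum\limits_{x=1}^{n} x \cdot \dfrac{\dbinom{m}{x}\dbinom{N-m}{n-x}}{\dbinom{N}{n}}=\sum\limits_{x=1}^{n} \dfrac{m\dbinom{m-1}{x-1}\dbinom{N-m}{n-x}}{\dfrac{N}{n}\dbinom{N-1}{n-1}}=\dfrac{mn}{N}\sum\limits_{x=1}^{n} \dfrac{\dbinom{m-1}{x-1}\dbinom{N-m}{n-x}}{\dbinom{N-1}{n-1}}=\dfrac{mn}{N}\)

because the final summation adds a hypergeometric p.m.f. (with \(N-1\) objects, \(m-1\) of the first type, and \(n-1\) selected) over its entire support, and therefore equals 1.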
That substitution completes the proof. We've shown that, in general, the mean of a hypergeometric random variable \(X\), in which \(n\) objects are selected from \(N\) objects with \(m\) of the objects being one type, is:
\(E(X)=\dfrac{mn}{N}\)
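A quick numerical spot-check of this formula (an illustrative script with hypothetical parameters \(N=20\), \(m=8\), \(n=5\), not from the text):

```python
from math import comb, isclose

# Hypothetical parameters: 5 objects drawn from 20, of which 8 are the first type
N, m, n = 20, 8, 5

# E(X) computed directly from the hypergeometric p.m.f.
mean = sum(x * comb(m, x) * comb(N - m, n - x) / comb(N, n)
           for x in range(0, min(m, n) + 1))

print(isclose(mean, m * n / N))  # True: matches E(X) = mn/N = 2.0
```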
Example 8-12
Suppose the random variable \(X\) follows the uniform distribution on the first \(m\) positive integers. That is, suppose the p.m.f. of \(X\) is:
\(f(x)=\dfrac{1}{m}\) for \(x=1, 2, 3, \ldots, m\)
What is the mean of \(X\)?
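For the record, the computation can be sketched directly from the definition (a sketch ahead of the worked solution, using the identity \(1+2+\cdots+m=\frac{m(m+1)}{2}\)):

\(E(X)=\sum\limits_{x=1}^{m} x \cdot \dfrac{1}{m}=\dfrac{1}{m} \cdot \dfrac{m(m+1)}{2}=\dfrac{m+1}{2}\)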