### Transcription of Lecture 5: Estimation - University of Washington

**Lecture 5: Estimation**

**Goals**
- Basic concepts of estimation
- Statistical approaches for estimating parameters
- Parametric interval estimation
- Nonparametric interval estimation (bootstrap)

**Central Dogma of Statistics** Probability takes us from a population to a sample; inferential statistics take us from a sample back to the population; descriptive statistics summarize the sample itself.

**Estimation**
- Estimator: a statistic whose calculated value is used to estimate a population parameter $\theta$.
- Estimate: a particular realization of an estimator, $\hat{\theta}$.
- Types of estimators:
  - Point estimate: a single number that can be regarded as the most plausible value of $\theta$.
  - Interval estimate: a range of numbers, called a confidence interval, that can be regarded as likely to contain the true value of $\theta$.

**Properties of Good Estimators** In the frequentist world view, parameters are fixed, while statistics are random variables that vary from sample to sample (i.e., they have an associated sampling distribution). In theory, there are many potential estimators for a population parameter. What are the characteristics of good estimators?
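To make the sampling-distribution idea concrete, here is a minimal simulation sketch (not from the lecture; the population parameters are assumed for illustration): the sample mean is an estimator, and recomputing it over many fresh samples traces out its sampling distribution.

```python
import random
import statistics

random.seed(1)

# Hypothetical population (assumed): normal with mean 10, sd 2.
MU, SIGMA, N = 10.0, 2.0, 25

def sample_mean(n: int) -> float:
    """One realization of the estimator: the mean of one fresh sample."""
    return statistics.fmean(random.gauss(MU, SIGMA) for _ in range(n))

# The estimator is a random variable: each sample yields a different estimate.
estimates = [sample_mean(N) for _ in range(2000)]

center = statistics.fmean(estimates)  # close to MU (the estimator is unbiased)
spread = statistics.stdev(estimates)  # approximates the standard error sigma/sqrt(n)

print(f"mean of estimates: {center:.2f}")
print(f"sd of estimates:   {spread:.2f} (theory: {SIGMA / N ** 0.5:.2f})")
```

The spread of this simulated distribution is exactly the "standard error" that the precision criterion below refers to.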

**Statistical Jargon for Good Estimators** Good estimators are:
- Consistent: as the sample size increases, $\hat{\theta}$ gets closer to $\theta$: $\lim_{n \to \infty} P(|\hat{\theta} - \theta| > \varepsilon) = 0$.
- Unbiased: $E[\hat{\theta}] = \theta$.
- Precise: the sampling distribution of $\hat{\theta}$ should have a small standard error.

**Bias Versus Precision** [Figure: 2×2 grid of targets illustrating the combinations precise/imprecise × biased/unbiased.]

**Methods of Point Estimation**
1. Method of moments
2. Maximum likelihood
3. Bayesian

**Method of Moments** Advantage: the simplest approach for constructing an estimator. Disadvantage: usually not the best estimators possible. Principle: equate the $k$th population moment $E[X^k]$ with the $k$th sample moment $\frac{1}{n}\sum_{i=1}^{n} x_i^k$ and solve for the unknown parameter.

**Method of Moments Example** How can I estimate the scaled population mutation rate $\theta = 4N_e\mu$? A brief (very brief) exposé of coalescent theory: coalescent times follow a geometric distribution with

$$E[T_i] = \frac{4N}{i(i-1)},$$

and the total time in the genealogy is $T_c = \sum_{i=2}^{n} i\,T_i$. [Figure: coalescent genealogy with coalescent times $T_2, T_3, T_4$ marked against a time axis.] Taking expectations, $E[T_c] = \sum_{i=2}^{n} i\,E[T_i]$.

**Method of Moments Example (continued)**

$$E[T_c] = \sum_{i=2}^{n} i\,E[T_i] = \sum_{i=2}^{n} \frac{4Ni}{i(i-1)} = 4N \sum_{i=2}^{n} \frac{1}{i-1}$$

Equating the expected number of segregating sites with the mutation rate times the expected total tree length, $E[S_n] = \mu\,E[T_c]$:

$$E[S_n] = 4N\mu \sum_{i=2}^{n} \frac{1}{i-1} = \theta \sum_{i=2}^{n} \frac{1}{i-1}$$

Solving for $\theta$ gives the method-of-moments estimator:

$$\hat{\theta}_{mom} = \frac{S_n}{\sum_{i=2}^{n} \frac{1}{i-1}}$$

**Methods of Point Estimation**
1. Method of moments
2. Maximum likelihood
3. Bayesian

**Introduction to likelihood** Before an experiment is performed, the outcome is unknown. Probability allows us to predict unknown outcomes based on known parameters: $P(\text{Data} \mid \theta)$. For example: $P(x \mid n, p) = \binom{n}{x} p^x (1-p)^{n-x}$.

**Introduction to likelihood (continued)** After an experiment is performed, the outcome is known. Now we talk about the likelihood that a parameter would generate the observed data: $L(\theta \mid \text{Data}) = P(\text{Data} \mid \theta)$. For example: $L(p \mid n, x) = \binom{n}{x} p^x (1-p)^{n-x}$. Estimation proceeds by finding the value of $\theta$ that makes the observed data most likely.
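The method-of-moments estimator derived above (Watterson's estimator) is a one-line computation. The following sketch is not from the lecture; the example counts (20 segregating sites, 10 sequences) are assumed purely for illustration.

```python
def watterson_theta(s_n: int, n: int) -> float:
    """Method-of-moments estimate of theta = 4*N_e*mu.

    s_n : observed number of segregating sites
    n   : number of sampled sequences
    Implements theta_hat = S_n / sum_{i=2}^{n} 1/(i-1), as derived above.
    """
    harmonic = sum(1.0 / (i - 1) for i in range(2, n + 1))
    return s_n / harmonic

# Illustrative (assumed) data: 20 segregating sites in a sample of 10 sequences.
print(round(watterson_theta(20, 10), 3))
```

Note the denominator is just the harmonic number $\sum_{k=1}^{n-1} 1/k$, so the estimate grows only slowly less variable as more sequences are added.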

**Let's Play T/F**
- True or false: the maximum likelihood estimate (MLE) of $\theta$ gives us the probability of $\theta$. False. Why? (The likelihood is computed from $P(\text{Data} \mid \theta)$, a probability of the data, not of the parameter.)
- True or false: the MLE of $\theta$ is the most likely value of $\theta$. False. Why? (In the frequentist framework $\theta$ is fixed, not random; the MLE is the value of $\theta$ under which the observed data are most likely.)
- True or false: maximum likelihood is cool.

**Formal Statement of ML** Let $x_1, x_2, \ldots, x_n$ be a sequence of $n$ observed variables. The joint probability is

$$P(x_1, x_2, \ldots, x_n \mid \theta) = P(X_1 = x_1)P(X_2 = x_2)\cdots P(X_n = x_n) = \prod_{i=1}^{n} P(X_i = x_i)$$

The likelihood is then

$$L(\theta \mid x_1, x_2, \ldots, x_n) = \prod_{i=1}^{n} P(X_i = x_i), \qquad \log L(\theta \mid x_1, x_2, \ldots, x_n) = \sum_{i=1}^{n} \log P(X_i = x_i)$$

**MLE Example** I want to estimate the recombination fraction between locus A and locus B from 5 heterozygous (AaBb) parents. I examine 30 gametes for each and observe 4, 3, 5, 6, and 7 recombinant gametes in the five parents. What is the MLE of the recombination fraction? The probability of observing $X = r$ recombinant gametes for a single parent is binomial:

$$P(X = r) = \binom{n}{r} \theta^r (1-\theta)^{n-r}$$

**MLE Example: Specifying the likelihood** Probability: $P(r_1, r_2, \ldots, r_5 \mid \theta, n) = P(R_1 = r_1)P(R_2 = r_2)\cdots P(R_5 = r_5)$.

$$P(r_1, r_2, \ldots, r_5 \mid \theta, n) = \binom{n}{r_1}\theta^{r_1}(1-\theta)^{n-r_1} \cdot \binom{n}{r_2}\theta^{r_2}(1-\theta)^{n-r_2} \cdots \binom{n}{r_5}\theta^{r_5}(1-\theta)^{n-r_5}$$

Likelihood:

$$L(\theta \mid r_1, r_2, \ldots, r_5, n) = \prod_{i=1}^{5} \binom{n}{r_i} \theta^{r_i} (1-\theta)^{n-r_i}$$

$$\log L = \sum_{i=1}^{5} \left[ \log\binom{n}{r_i} + r_i \log\theta + (n - r_i)\log(1-\theta) \right]$$

**MLE Example: Maximizing the likelihood** We want to find the $\theta$ at which $\log L$ is maximized. How?
1. Graphically
2. Calculus
3. Numerically

**MLE Example: Finding the MLE of θ** [Figure: plot of $\log L$ against $\theta$, peaking at the MLE.]

**Methods of Point Estimation**
1. Method of moments
2. Maximum likelihood
3. Bayesian

**World View According to Bayesians** The classic (frequentist) philosophy assumes parameters are fixed quantities that we want to estimate as precisely as possible. The Bayesian perspective is different: parameters are random variables, with probabilities assigned to particular parameter values to reflect the degree of evidence for those values.

**Revisiting Bayes' Theorem**

$$P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}$$

Discrete case: $P(B) = \sum_{i=1}^{n} P(B \mid A_i)P(A_i)$. Continuous case: $P(B) = \int P(B \mid A)\,P(A)\,dA$.

**Bayesian Estimation** To make probability statements about $\theta$ given some observed data $D$, we use Bayes' theorem:

$$f(\theta \mid D) = \frac{f(\theta)\,f(D \mid \theta)}{f(D)} = \frac{f(\theta)\,L(\theta \mid D)}{\int_\theta f(\theta)\,f(D \mid \theta)\,d\theta}$$

Posterior ∝ Likelihood × Prior. The prior is the probability of the parameter and represents what was thought before seeing the data. The likelihood is the probability of the data given the parameter and represents the data now available. The posterior represents what is thought given both prior information and the data just seen.

**Bayesian Estimation: Simple Example** Take the same recombination data: 5 heterozygous (AaBb) parents, 30 gametes each, with 4, 3, 5, 6, and 7 recombinant gametes observed. The full Bayesian analysis is tedious to show, so let's simplify and ask what the recombination fraction is for parent three, who had 5 observed recombinant gametes.
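For comparison with the Bayesian answer that follows, the maximum-likelihood calculation from the previous slides can be sketched numerically (this code is illustrative, not from the lecture). For independent binomials, setting the derivative of the log-likelihood to zero gives the pooled proportion $\hat{\theta} = \sum_i r_i / (5n) = 25/150$; a simple grid search over the log-likelihood recovers the same value.

```python
import math

r = [4, 3, 5, 6, 7]   # recombinant gametes per parent (from the example)
n = 30                # gametes examined per parent

def log_likelihood(theta: float) -> float:
    """log L = sum_i [log C(n, r_i) + r_i*log(theta) + (n - r_i)*log(1 - theta)]."""
    return sum(
        math.log(math.comb(n, ri)) + ri * math.log(theta) + (n - ri) * math.log(1 - theta)
        for ri in r
    )

# Calculus: d(log L)/d(theta) = 0 gives theta_hat = sum(r) / (5 * n).
theta_calculus = sum(r) / (len(r) * n)

# Numerically: maximize log L over a fine grid of theta values.
grid = [i / 10000 for i in range(1, 10000)]
theta_grid = max(grid, key=log_likelihood)

print(theta_calculus)   # 25/150
print(theta_grid)
```

Note that the $\log\binom{n}{r_i}$ terms do not involve $\theta$, so they shift the curve without moving its peak; they could be dropped from the maximization.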

**Specifying the Posterior Density**

$$f(\theta \mid n = 30, r = 5) = \frac{f(\theta)\,f(r = 5 \mid \theta, n = 30)}{\int_0 f(r = 5 \mid \theta, n = 30)\,f(\theta)\,d\theta}$$

Prior: $f(\theta)$ = uniform. Likelihood: $P(r = 5 \mid \theta, n = 30) = \binom{30}{5}\,\theta^{5}(1-\theta)^{25}$. Normalizing constant:

$$\int_0 P(r = 5 \mid \theta, n = 30)\,f(\theta)\,d\theta = \binom{30}{5} \int_0 \theta^{5}(1-\theta)^{25}\,d\theta$$

**Specifying the Posterior Density (continued)** Substituting into Bayes' theorem:

$$f(\theta \mid n = 30, r = 5) = \frac{\binom{30}{5}\,\theta^{5}(1-\theta)^{25}}{6531}$$

Ta da! [Figure: the posterior density $f(\theta \mid n = 30, r = 5)$ plotted against $\theta$.]

**Interval Estimation** In addition to a point estimate, we also want to understand how much uncertainty is associated with it. One option is to report the standard error. Alternatively, we might report a confidence interval. Confidence interval: an interval of plausible values for the parameter being estimated, where the degree of plausibility is specified by a confidence level.

**Interpreting a 95% CI** We calculate a 95% CI for a hypothetical sample mean. Does this mean there is a 95% probability that the true population mean lies between the interval's endpoints? NO! The correct interpretation relies on the long-run frequency interpretation of probability. Why is this so?
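A short simulation (not part of the lecture; the population values are assumed for illustration) shows what the long-run frequency interpretation means: each repeated sample yields its own 95% CI for the mean, and roughly 95% of those intervals, not any single one, cover the true mean.

```python
import random
import statistics

random.seed(7)

# Hypothetical population (assumed): normal with mean 50, sd 10.
MU, SIGMA, N, REPS = 50.0, 10.0, 40, 4000
Z = 1.96  # large-sample 95% normal quantile

covered = 0
for _ in range(REPS):
    sample = [random.gauss(MU, SIGMA) for _ in range(N)]
    xbar = statistics.fmean(sample)
    se = statistics.stdev(sample) / N ** 0.5   # estimated standard error
    lo, hi = xbar - Z * se, xbar + Z * se      # one realized 95% CI
    covered += lo <= MU <= hi                  # did THIS interval catch MU?

print(f"coverage over {REPS} intervals: {covered / REPS:.3f}")
```

The 95% attaches to the procedure: before sampling, an interval built this way has a 95% chance of covering $\mu$; after sampling, a particular realized interval either contains $\mu$ or it does not.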