Consistency is a relatively weak property and is considered necessary for all reasonable estimators (van der Vaart, 1998, Theorem 5.7, p. 45). Let $M_n$ be random functions and $M$ be …

Omitted variable bias: a violation of consistency. From the omitted variable bias formula
$$\hat\beta_1 \xrightarrow{p} \beta_1 + \beta_2 \frac{\operatorname{Cov}(X_i, W_i)}{\operatorname{Var}(X_i)},$$
we can infer the direction of the bias of $\hat\beta_1$ that persists in large samples. Suppose $W_i$ has a positive effect on $Y_i$, so that $\beta_2 > 0$. Suppose $X_i$ and $W_i$ …

Thus, an unbiased estimator $\hat\beta_j$ — one for which $\operatorname{Bias}(\hat\beta_j) = 0$, that is, for which $\operatorname{E}(\hat\beta_j) = \beta_j$ — is on average equal to the true parameter.

Theorem 4. To compare the two estimators for $p^2$, suppose we find 13 variant alleles in a sample of 30. Then $\hat p = 13/30 = 0.4333$, $\hat p^2 = (13/30)^2 = 0.1878$, and
$$\hat p^2_u = \left(\frac{13}{30}\right)^2 - \frac{1}{29} \cdot \frac{13}{30} \cdot \frac{17}{30} = 0.1878 - 0.0085 = 0.1793.$$
The bias of the estimate $\hat p^2$, in this case 0.0085, is subtracted to give the unbiased estimate $\hat p^2_u$.

Example: Suppose $X_1, X_2, \ldots, X_n$ is an i.i.d. …

Bias. As the bias correction does not affect the variance, the bias-corrected matching estimators still do not reach the semiparametric efficiency bound with a fixed number of matches.

We will prove that the MLE (usually) satisfies the following two properties, called consistency and asymptotic normality.

1. Relative efficiency: If $\hat\theta_1$ and $\hat\theta_2$ are both unbiased estimators of a parameter $\theta$, we say that $\hat\theta_1$ is relatively more efficient if $\operatorname{var}(\hat\theta_1) < \operatorname{var}(\hat\theta_2)$.
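The omitted variable bias formula can be checked by simulation. The sketch below (all numbers — the coefficients `b1`, `b2` and the correlation `rho` — are hypothetical choices, not from the notes) regresses $Y$ on $X$ while omitting $W$, and the short-regression slope lands near $\beta_1 + \beta_2 \operatorname{Cov}(X, W)/\operatorname{Var}(X)$ rather than near $\beta_1$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data-generating process: Y = b1*X + b2*W + e, with X and W correlated.
n = 200_000
b1, b2 = 1.0, 0.5
rho = 0.6  # Cov(X, W); X and W both have unit variance
cov = np.array([[1.0, rho], [rho, 1.0]])
X, W = rng.multivariate_normal([0.0, 0.0], cov, size=n).T
Y = b1 * X + b2 * W + rng.normal(size=n)

# Short regression of Y on X alone, omitting W:
S = np.cov(X, Y)
b1_short = S[0, 1] / S[0, 0]

# OVB formula: plim of b1_short = b1 + b2 * Cov(X, W) / Var(X) = 1.0 + 0.5 * 0.6
print(b1_short)  # ≈ 1.3, not 1.0
```

Since $\beta_2 > 0$ and $\operatorname{Cov}(X_i, W_i) > 0$ here, the bias is upward, matching the sign argument in the notes.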
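The worked comparison of the two estimators for $p^2$ can be reproduced in a few lines; this sketch just re-traces the arithmetic of the example (13 variant alleles in a sample of 30):

```python
# Counts from the worked example: 13 variant alleles in a sample of n = 30.
k, n = 13, 30

p_hat = k / n                          # plug-in estimate of p
p2_plugin = p_hat ** 2                 # biased estimator of p^2
bias = p_hat * (1 - p_hat) / (n - 1)   # bias term (1/29)(13/30)(17/30)
p2_unbiased = p2_plugin - bias         # bias-corrected estimator

print(round(p_hat, 4))        # 0.4333
print(round(p2_plugin, 4))    # 0.1878
print(round(p2_unbiased, 4))  # 0.1793
```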
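Relative efficiency can be illustrated with a small Monte Carlo study. The setup below is a hypothetical example, not from the notes: for symmetric data both the sample mean and the sample median are unbiased for the centre, so comparing their variances across repeated samples shows which is relatively more efficient (for normal data, the mean):

```python
import numpy as np

rng = np.random.default_rng(1)

# Draw many i.i.d. normal samples and compute both estimators on each one.
reps, n = 5_000, 101
samples = rng.normal(loc=0.0, scale=1.0, size=(reps, n))

means = samples.mean(axis=1)          # estimator 1: sample mean
medians = np.median(samples, axis=1)  # estimator 2: sample median

var_mean = means.var()
var_median = medians.var()
print(var_mean < var_median)  # the mean is relatively more efficient here
```

For normal data the ratio $\operatorname{var}(\text{median})/\operatorname{var}(\text{mean})$ approaches $\pi/2 \approx 1.57$ in large samples, so the simulated variances should differ by roughly that factor.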