March 12, 2021, 3:30 p.m. to 4:30 p.m. (Montréal time/EST)
Colloquium by Jay Breidt (Colorado State University, USA)
Informative selection, in which the distribution of response variables given that they are sampled differs from their distribution in the population, is pervasive in complex surveys. Failing to take such informativeness into account can produce severe inferential errors, including biased and inconsistent estimation of population parameters. While several parametric procedures exist to test for informative selection, these methods are limited in scope and their parametric assumptions are difficult to assess. We consider two classes of nonparametric tests of informative selection. The first class is motivated by classic nonparametric two-sample tests. We compare weighted and unweighted empirical distribution functions and obtain tests for informative selection that are analogous to Kolmogorov-Smirnov and Cramér-von Mises. For the second class of tests, we adapt a kernel-based learning method that compares distributions based on their maximum mean discrepancy. The asymptotic distributions of the test statistics are established under the null hypothesis of noninformative selection. Simulation results show that our tests have power competitive with existing parametric tests in a correctly specified parametric setting, and better power than those tests under model misspecification. A recreational angling application illustrates the methodology.
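To make the idea concrete, the sketch below (an illustration under simple assumptions, not the authors' implementation) contrasts a survey-weighted empirical distribution function with its unweighted counterpart, forming a Kolmogorov-Smirnov-type statistic, and computes a maximum-mean-discrepancy contrast between the two empirical measures with a Gaussian kernel. Under noninformative selection the weights carry no information about the response, so both contrasts should be small; when inclusion depends on the response, they grow. Function names, the bandwidth choice, and the toy data are all hypothetical.

```python
import numpy as np

def ks_informativeness_stat(y, w):
    """KS-type contrast: max gap between unweighted and
    weight-normalized (Hajek-type) empirical CDFs, evaluated
    at the sample points."""
    w = np.asarray(w, dtype=float) / np.sum(w)
    grid = np.sort(y)
    f_unw = np.array([np.mean(y <= t) for t in grid])      # unweighted ECDF
    f_wtd = np.array([np.sum(w * (y <= t)) for t in grid]) # weighted ECDF
    return float(np.max(np.abs(f_unw - f_wtd)))

def mmd2_stat(y, w, bandwidth=1.0):
    """Squared MMD between the unweighted and weighted empirical
    measures on the same sample, using a Gaussian kernel:
    MMD^2 = (u - w)^T K (u - w), with u uniform and w normalized."""
    n = len(y)
    w = np.asarray(w, dtype=float) / np.sum(w)
    u = np.full(n, 1.0 / n)
    diff2 = (np.asarray(y)[:, None] - np.asarray(y)[None, :]) ** 2
    K = np.exp(-diff2 / (2.0 * bandwidth ** 2))  # Gaussian kernel matrix
    d = u - w
    return float(d @ K @ d)

# Toy illustration (hypothetical data, not from the paper):
rng = np.random.default_rng(0)
y = rng.normal(size=500)
w_noninf = rng.uniform(1.0, 2.0, size=500)  # weights unrelated to y
w_inf = np.exp(y)                           # weights tied to the response
```

With these toy data, the statistics are visibly larger under the informative weighting than under the noninformative one; in practice the null distribution (and hence the critical value) would come from the asymptotic results established in the talk.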
This is joint work with Teng Liu, Colorado State University.